Reusing content owned by the client on websites for other locations?
-
Hello All!
Newbie here, so I'm working through some of my questions. I do have two major questions regarding duplicate content:
_Say a medical hospital has 4 locations, and chooses to create 4 separate websites. Each website would have the same design, but different NAP, contact info, etc. Essentially, we'd be looking at creating their own branded template._
My questions: 1.) If the hospitals all offer similar services, with roughly the same nav, does it make sense to have multiple websites? I figure this makes the most sense in terms of optimizing for their differing locations.
2.) If the hospital owns the content on the first site, I'm assuming it is still necessary to rewrite the duplicated content for the other properties? Or is it possible for Google to differentiate the duplication of owned content from other instances of content duplication?
Everyone has been fantastic here so far, looking forward to some feedback!
-
I agree with both Andrea and Miriam in that the best-case scenario would be one site that provides links and information to different locations, provided the branding and business model support that of course.
-
You're welcome Tyler. I think Andrea has a good suggestion below too.
-
Hi Tyler,
Does the hospital have one name or four? In other words, is the whole hospital chain called St. Joseph's Hospital, or is one St. Joseph's Urgent Care, while another is Goldman-St. Joseph's and another is St. Joseph's Memorial, etc.? If only one, and all four hospitals are administered by the same ruling body, then I would almost always suggest creating just one website if this were my Local SEO/design client.
With this approach, each of the hospital branches can be given a location landing page with unique content on it (most importantly, the unique complete contact information for each branch) and these pages will not duplicate one another in any way. Then, all the rest of the site content goes to the good of the overall brand, and there is no problem with duplication because each page is occurring only once rather than possibly occurring 4 times on 4 different websites.
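One common way to make each branch's landing page demonstrably unique and locally relevant is to mark up its complete NAP with schema.org structured data. Here's a minimal sketch with entirely hypothetical names, addresses, and URLs (not taken from the thread):

```html
<!-- Hypothetical location landing page for one branch: /locations/riverside -->
<h1>St. Joseph's Hospital &ndash; Riverside Campus</h1>

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Hospital",
  "name": "St. Joseph's Hospital - Riverside Campus",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Riverside Dr",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```

Each of the four landing pages would carry its own address, phone number, hours, and staff details, so no two location pages duplicate one another.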
Also, by making one site the official source of info for the brand, you reduce the risk of Google+ merges/dupes.
If, for some reason, the governing body insists on having 4 different websites instead of 1, then, yes, you must be sure that the content is unique on each website to avoid duplicate content issues.
-
I'd actually go a different route and do one site with separate pages for each location. It'd be better for the overall issue of avoiding duplicate content, which can become a huge issue. Four sites is a lot more to manage, track, and keep up and running.
But it depends on what the online strategy is, too. Google is constantly working to serve localized results, so it's not as though there have to be four totally independent sites to get results targeted to a certain neighborhood.
I'm not saying this is the best site ever, but one hospital network that comes to mind, Beaumont, has theirs set up this way: http://www.beaumont.edu/
-
Hi Dana,
Thanks for the reply! You are correct in that I'm not directly employed by these clients. I'm guessing our best option would just be using a similar base and reworking the content enough to differentiate.
The difficult part comes when all the locations offer the same services; we'll be tasked with coming up with new ways to say the same things. Home/About Us pages won't be as tricky, since those lend themselves to unique content a bit more easily.
Thank you for your quick reply!
-
Hi Tyler,
Here's a partial answer. I am not a specialist in local SEO so you might get some more detailed ideas if some of those folks chime in on this one.
It seems to me you only have two options. One is to create unique content for each hospital. The other, doing as you suggest and using content created on one site and re-using it on another, is only going to work if you use canonicalization properly. The downside there is that the site the canonical tag points to is going to get credit for that content and the duplicating site isn't. You could be fragmenting good content in ways you may never have envisioned when you began. The result could be that one hospital site does way better than another in the SERPs.
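For reference, the cross-domain canonical mentioned here is just a link element in the head of the duplicating page, pointing at the version that should get the credit. A sketch with hypothetical domains:

```html
<!-- On hospital-b.example.com's cardiology page, whose copy duplicates site A's -->
<head>
  <link rel="canonical" href="https://hospital-a.example.com/services/cardiology">
</head>
```

Google would then consolidate ranking signals onto site A's page rather than splitting them across the duplicates, which is exactly the trade-off being described: one site accrues the credit, the others don't.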
I would encourage your clients (I am assuming these hospitals are clients and that you aren't directly employed by them) to express to you what is special about each of those hospitals. What differentiates them from other hospitals in the same area, and perhaps even from each other? This is the harder, but better, route I think.
I hope that helps a little!
Dana