Reusing content owned by the client on websites for other locations?
-
Hello All!
Newbie here, so I'm still working through some of my questions. I have two major questions regarding duplicate content:
Say a medical hospital has 4 locations and chooses to create 4 separate websites. Each website would have the same design, but different NAP, contact info, etc. Essentially, we'd be looking at creating their own branded template.
1.) If the hospitals all offer similar services, with roughly the same navigation, does it make sense to have multiple websites? I figure this makes the most sense in terms of optimizing for their differing locations.
2.) If the hospital owns the content on the first site, I'm assuming it is still necessary to rewrite the duplicated content for the other properties? Or is it possible for search engines to differentiate the duplication of owned content from other instances of content duplication?
Everyone has been fantastic here so far, looking forward to some feedback!
-
I agree with both Andrea and Miriam in that the best-case scenario would be one site that provides links and information for the different locations, provided the branding and business model support that, of course.
-
You're welcome Tyler. I think Andrea has a good suggestion below too.
-
Hi Tyler,
Does the hospital have one name or four? In other words, is the whole hospital chain called St. Joseph's Hospital, or is one St. Joseph's Urgent Care, while another is Goldman-St. Joseph's and another is St. Joseph's Memorial, etc.? If only one, and all four hospitals are administered by the same ruling body, then I would almost always suggest creating just one website if this were my Local SEO/design client.
With this approach, each of the hospital branches can be given a location landing page with unique content on it (most importantly, the unique, complete contact information for each branch), and these pages will not duplicate one another in any way. Then, all the rest of the site content goes to the good of the overall brand, and there is no problem with duplication because each page occurs only once, rather than possibly occurring 4 times across 4 different websites.
Also, by making one site the official source of info for the brand, you reduce the risk of Google+ merges/dupes.
If, for some reason, the governing body insists on having 4 different websites instead of 1, then, yes, you must be sure that the content is unique on each website to avoid duplicate content.
-
I'd actually go a different route and do one site with separate pages for each location. It'd be better for the overall issue of avoiding duplicate content, which can become a huge problem. Four sites is a lot more to manage, track, and keep up and running.
But it depends on what the online strategy is, too. Google is constantly working to deliver localized results, so it's not as though there have to be four totally independent sites to get results targeted to a certain neighborhood.
I'm not saying this is the best site ever, but one hospital network that comes to mind, Beaumont, has theirs set up this way: http://www.beaumont.edu/
-
Hi Dana,
Thanks for the reply! You are correct in that I'm not directly employed by these clients. I'm guessing our best option would be using a similar base and reworking the content enough to differentiate.
The difficult part comes when all the locations offer the same services; we'll just be tasked with coming up with new ways to say the same things. Home/About Us pages wouldn't be tricky, because those lend themselves to unique content creation a bit more easily.
Thank you for your quick reply!
-
Hi Tyler,
Here's a partial answer. I am not a specialist in local SEO so you might get some more detailed ideas if some of those folks chime in on this one.
It seems to me you only have two options. One is to create unique content for each hospital. The other, doing as you suggest and re-using content created on one site across the others, is only going to work if you use canonicalization properly. The downside is that the site with the canonical tag pointing to it is going to get credit for that content, and the other sites aren't. You could be fragmenting good content in ways you may never have envisioned when you began. The result could be that one hospital site does way better than another in the SERPs.
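For anyone unfamiliar with the mechanics: canonicalization is done with a rel="canonical" link element in the head of each duplicate page, pointing at the version that should get the credit. A minimal sketch, with hypothetical branch URLs:

```html
<!-- On the duplicate page, e.g. http://branch2.example-hospital.com/services/ -->
<head>
  <!-- Tells search engines the branch-1 copy is the authoritative version;
       branch 2's copy will be consolidated into it in the index -->
  <link rel="canonical" href="http://branch1.example-hospital.com/services/" />
</head>
```

This is exactly why the credit fragments: every duplicate page explicitly hands its ranking signals to the canonical URL.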
I would encourage your clients (I am assuming these hospitals are clients and that you aren't directly employed by them) to express to you what is special about each of those hospitals: what differentiates them from other hospitals in the same area, and perhaps even from each other. This is the harder, but better, route I think.
I hope that helps a little!
Dana
Related Questions
-
Website crawl error
Hi all, When I try to crawl a website, I get the following error message: "java.lang.IllegalArgumentException: Illegal cookie name". So far, I have found the following explanation: the errors indicate that one of the web servers within the same cookie domain as the server is setting a cookie for your domain with the name "path", as well as another cookie with the name "domain". Does anyone have experience with this problem, know what it means, and know how to solve it? Thanks in advance! Jens
Technical SEO | WeAreDigital_BE
-
Is the content on my website garbage?
I received an email from Google Webmasters saying that my website has low-quality content. Website - nowwhatmoments.com
Technical SEO | Green.landon
-
Website content has been scraped - recommended action
So whilst searching for link opportunities, I found a website that has scraped content from one of our websites. The website looks pretty low quality and doesn't link back. What would be the recommended course of action?
1. Email them and ask for a link back. I've got a feeling this might not be the best idea; the website does not have much authority (yet), and a link might look a bit dodgy considering the duplicate content.
2. Ask them to remove the content. It is duplicate content and could hurt our website.
3. Do nothing. I don't think our website will get penalised for it, since ours was here first and is the better-quality website.
4. Possibly report them to Google for scraping?
What do you guys think?
Technical SEO | maxweb
-
Google not indexing my website
Hi guys, We have this website, http://www.m-health-expo.nl/, but it is not indexed by Google. In Webmaster Tools, Google says that it cannot fetch the site due to the robots.txt, but I do not see any faults in it: http://www.m-health-expo.nl/robots.txt. Do you see anything strange? It really bothers me.
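For anyone checking their own file against this kind of "cannot fetch" report, the usual culprit is a blanket Disallow. A sketch of the difference (I can't see what the file contained when Google last fetched it, so this is illustrative only):

```text
# Blocks all crawlers from the entire site -- Google cannot fetch any page:
User-agent: *
Disallow: /

# Allows all crawlers everywhere (an empty Disallow value permits everything):
User-agent: *
Disallow:
```

Note that Google also caches robots.txt for a while, so a recently fixed file can still produce fetch errors until it is re-read.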
Technical SEO | RuudHeijnen
-
Duplicate Content Issues
We have some "?src=" tags in some URLs which are flagged as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?
Technical SEO | RodrigoVaca
-
Development Website Duplicate Content Issue
Hi, We launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live. In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue, as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file. Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site, and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found.
When I do find the dev site in Google, it displays this: "Roller Banners Cheap » admin (dev.rollerbannerscheap.co.uk/) – A description for this result is not available because of this site's robots.txt." This is really affecting our client's SEO plan, and we can't seem to remove the dev site or rank the live site in Google. Please can anyone help?
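For reference, a site-wide dev-to-live 301 on Apache can be sketched like this (assuming an .htaccess file on the dev host; the exact directives depend on the server setup described above):

```apache
# Permanently (301) redirect every dev URL to its equivalent on the live site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.rollerbannerscheap\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://rollerbannerscheap.co.uk/$1 [R=301,L]
```

One caveat worth knowing: keeping the dev site blocked in robots.txt while also 301-redirecting it works against you, because if crawlers can't fetch the dev URLs they never see the redirect, and the blocked pages can linger in the index as description-less stubs exactly like the snippet quoted above.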
Technical SEO | SO_UK
-
RSS Feed - Dupe Content?
OK so yesterday a website agreed to publish my RSS feed and I just wanted to check something. The site in question is far more established than mine and I am worrying that with my content appearing on their website pretty much at the same time as mine, will Google index theirs first and therefore consider mine to be dupe? They are linking back to each of my articles with the text "original post" and I'm not sure whether this will help. Thanks in advance for any responses!
Technical SEO | marcoose81
-
Website hacked
Hi, I've been asked to help a colleague with his website. It seems to have been hacked. He recently received an e-mail from Google saying his AdWords account was suspended 'due to high probability his site may be hosting or distributing malicious software'. I just checked his source, and there seems to be loads of weird code on his pages; this would not have been put on by any members of the website owner's team. Please see the attached image showing what we get when we try to access his website via Google search. I just contacted the hosting provider. Does anyone have experience with this, know what it means, and know how to prevent such hacking in the future? The site is built using HTML with no CMS. (Attached: IjW19.jpg)
Technical SEO | Socialdude