Reusing content owned by the client on websites for other locations?
-
Hello All!
Newbie here, so I'm still working through some of my questions. I have two major questions regarding duplicate content:
_Say a medical hospital has 4 locations and chooses to create 4 separate websites. Each website would have the same design, but different NAP, contact info, etc. Essentially, we'd be looking at creating their own branded template._
1.) If the hospitals all offer similar services, with roughly the same navigation, does it make sense to have multiple websites? I figure this makes the most sense in terms of optimizing for their differing locations.
2.) If the hospital owns the content on the first site, I'm assuming it's still necessary to rewrite the duplicates for the other properties? Or is it possible for search engines to differentiate between duplication of content we own and other instances of content duplication?
Everyone has been fantastic here so far, looking forward to some feedback!
-
I agree with both Andrea and Miriam that the best-case scenario would be one site that provides links and information for the different locations, provided the branding and business model support that, of course.
-
You're welcome, Tyler. I think Andrea has a good suggestion below, too.
-
Hi Tyler,
Does the hospital have one name or four? In other words, is the whole hospital chain called St. Joseph's Hospital, or is one St. Joseph's Urgent Care, while another is Goldman-St. Joseph's and another is St. Joseph's Memorial, etc.? If there's only one name, and all four hospitals are administered by the same governing body, then I would almost always suggest creating just one website if this were my Local SEO/design client.
With this approach, each of the hospital branches can be given a location landing page with unique content on it (most importantly, the unique, complete contact information for that branch), and these pages will not duplicate one another in any way. Then, all the rest of the site content goes to the good of the overall brand, and there is no problem with duplication, because each page occurs only once rather than possibly occurring 4 times on 4 different websites.
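To make that concrete, a single-brand structure might look something like this (made-up URLs, just to sketch the one-site approach):
www.stjosephs.example.com/ — the brand homepage
www.stjosephs.example.com/locations/ — an index linking out to all four branches
www.stjosephs.example.com/locations/downtown/ — unique NAP, staff, photos, and directions for that branch
www.stjosephs.example.com/locations/westside/ — same template, but content unique to this branch
www.stjosephs.example.com/services/cardiology/ — service content written once for the whole brand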
Also, by making one site the official source of info for the brand, you reduce the risk of Google+ merges/dupes.
If, for some reason, the governing body insists on having 4 different websites instead of 1, then, yes, you must be sure that the content is unique on each website to avoid duplicate content.
-
I'd actually go a different route and do one site with separate pages for each location. It'd be better for avoiding duplicate content overall, which can become a huge issue, and four sites is a lot more to manage, track, and keep up and running.
But it depends on what the online strategy is, too. Google is constantly working to localize results, so it's not as though there have to be four totally independent sites to get results targeted to a certain neighborhood.
I'm not saying this is the best site ever, but one hospital network that comes to mind, Beaumont, has theirs set up this way: http://www.beaumont.edu/
-
Hi Dana,
Thanks for the reply! You are correct that I'm not directly employed by these clients. I'm guessing our best option would just be using a similar base and reworking the content enough to differentiate each site.
The difficult part comes when all the locations offer the same services; we'll just be tasked with coming up with new ways to say the same things. The Home/About Us pages wouldn't be as tricky, since unique content creation is a bit easier there.
Thank you for your quick reply!
-
Hi Tyler,
Here's a partial answer. I am not a specialist in local SEO so you might get some more detailed ideas if some of those folks chime in on this one.
It seems to me you only have two options. One is to create unique content for each hospital. The other, doing as you suggest and re-using content created on one site across the others, is only going to work if you use canonicalization properly. The downside is that the site the canonical tag points to is going to get credit for that content and the sites carrying the copies aren't. You could be fragmenting good content in ways you may never have envisioned when you began. The result could be that one hospital site does way better than another in the SERPs.
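Just to illustrate what proper canonicalization looks like (hypothetical URL, purely for the sake of example), each page carrying the re-used copy would include something like this in its <head>, pointing at the version you want treated as the original:
<link rel="canonical" href="http://www.hospital-one.example.com/services/cardiology/" />
Every site re-using that text would point at that same URL, which is exactly why only the site being pointed to ends up getting credit for the content.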
I would encourage your clients (I am assuming these hospitals are clients and that you aren't directly employed by them) to express to you what is special about each of those hospitals: what differentiates them from other hospitals in the same area, and perhaps even from each other. This is the harder, but better, route I think.
I hope that helps a little!
Dana