Intentional Duplicate Content - Great UX, Bad for Ranking?
-
I'll try to keep this as clear and high level as possible. Thank you in advance for any and all help!
We're managing a healthcare practice which specializes in neurosurgical treatments. As the practice is rather large, the doctors have several "specialties" on which they focus, e.g. back surgery, facial surgery, brain surgery, etc.
They have a main website (examplepractice.com) which holds ALL of their content on every condition and treatment they deal with. So, if someone enters the main homepage they will see conditions and treatments for all the specialties categorized together.
However, linked within the main site are "mini-sites" for each specialty (same domain, same site, e.g. examplepractice.com/brain-surgery), each with a different navigation menu to give the impression of a separate website. These mini-sites are tailored from a creative, content, and UX perspective to THAT specific group of treatments and conditions, so anyone who enters a mini-site will find information pertaining only to that specialty. The mini-sites are NOT set up as folders, but rather as a system of URLs that we have mapped out to each page.
We set up the pages this way to maintain an exclusive feel for each site. Rather than have the menu change when someone drills into a specific condition, we created copies of the pages. But, because of how this is set up, we now have duplicate content for each treatment and condition child page (one on the main site, one on the mini-site).
My question (finally) is will this cause a problem in the future? Are we essentially splitting the "juice" between these two pages? Are we making it easier for our competitors to outrank us?
We know this layout makes sense from the perspective of a user, but we're unclear how to move forward from a search perspective.
Any tips?
-
Thanks, Jake!
-
My first thought would be to simply canonical the pages to the preferred version you would want a visitor to find in search results (I'm guessing the mini-site versions, because of the UX benefit for someone searching a specific procedure/condition).
This tells Google that you know the page is a copy, and indicates where all authority should be consolidated.
https://moz.com/learn/seo/canonicalization is a great resource you should check out on the topic.
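As a concrete sketch, the duplicate main-site copy would carry a `<link rel="canonical">` element in its `<head>` pointing at the mini-site version. The snippet below (the specific `/brain-surgery/condition-x` path is hypothetical, for illustration only) shows a small parser you could use to audit your pages and confirm each duplicate declares the right canonical URL:

```python
# Minimal sketch: extract the rel="canonical" URL from a page's HTML so you
# can verify each main-site copy points at its mini-site counterpart.
# The example URL path is an assumption, not from the actual site.
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")


def find_canonical(html: str):
    """Return the canonical URL declared in the page, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical


# The duplicate main-site page declares the mini-site version as canonical:
page = """<html><head>
<link rel="canonical" href="https://examplepractice.com/brain-surgery/condition-x">
</head><body>...</body></html>"""
print(find_canonical(page))
```

In practice you would fetch each duplicate URL, run a check like this, and flag any page whose canonical is missing or points at the wrong version.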
Cheers,
Jake Bohall