Unavoidable duplicate pages
-
Hi,
I have an issue where I need to duplicate content on a new site that I am launching. Visitors to the site need to think that product-x is part of two different services, e.g.:
domain.com/service1/product-x
domain.com/service2/product-x
Re-writing the product-x content for each service section is not an option, but perhaps I could get around this by having only one product-x page indexed by search engines. What's the best way to do this? Any advice would be appreciated.
Thanks,
Stuart
-
Thanks for your help with this. In the end we decided to go down the nofollow route after discussing your suggestions with the developer.
Stu
-
It's okay, we all just want to give you the best answer we can. It can be tough to give actionable advice without specifics, but I'll keep trying!
My gut feeling here (and please someone correct me if I'm wrong) is that if there is any appreciable difference in content (as you indicate above), then the duplicate content thing may not be a problem. This sort of problem really only exists for pages with multiple URLs (like the same page on more than one subdomain) or old mirrors that were never taken down, and such. If you offer similar services with a lot of the same content but slightly different service sections, you shouldn't be flagged for dupes.
Why not skip any tactics for now, get the site launched, remove the older content, and use Moz to run a crawl and see if any further content is flagged? Sounds to me like this is a borderline dupe content issue and may not really be a problem for the engines. I don't think I can really say until the pages are live. Let us know when they are!
-
The content in the 'details' subsections is identical between service 1 and service 2, but the details sections aren't identical within their own service sections. Anyone confused yet?! I'm really sorry I can't share any links to the site; it's currently still in development.
Stu
-
Tentatively, yes. If the subsections are separate "pages", it should work the same way.
However, since you haven't shared the specific URLs, I can't visit the pages to see exactly how appropriate this is. I've optimized thousands of paginated and product pages with this problem, and rel-canonicaling (yes, I made up that verb) them worked best for us. If you'd like a more accurate answer, feel free to share example URLs for me to poke around in.
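For example (a rough sketch based on the detail URLs you listed, and assuming the service1 copies are the ones you want indexed; swap the direction if service2 is the preferred version), the <head> of each service2 detail page would point at its service1 twin:
<link rel="canonical" href="https://domain.com/service1/product-x/product-x-details-1" />
with the same pattern repeated on product-x-details-2 and any other duplicated sub-pages.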
-
That DOES sound cool...
So will that also work for any sub-sections within the product-x section? I probably should have mentioned that product-x has a subset of pages which aren't unique, e.g.
domain.com/service1/product-x/product-x-details-1
domain.com/service1/product-x/product-x-details-2
domain.com/service2/product-x/product-x-details-1
domain.com/service2/product-x/product-x-details-2
Stu
-
So if you have to FORCE the engines to prioritize one of multiple duplicates, you have a few options. Assuming you want both pages to exist, you can apply a noindex rule to one of the duplicates (or nofollow the links pointing to it). This is a blunt approach which works well, but it's not really the coolest.
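As a rough sketch (assuming you decided the service2 copy is the one to keep out of the index; pick whichever duplicate suits you), a noindex rule is just a robots meta tag in that page's <head>:
<meta name="robots" content="noindex, follow">
The "follow" part lets the engines keep crawling the links on the page even though the page itself won't be indexed.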
The coolest option is to give one a canonical tag. Telling the engines which of the multiple duplicates is the "canonical" one is what Google itself recommends (and it has all sorts of neat downstream SEO benefits anyway).
So add a rel="canonical" tag pointing at the "proper" page. Matt lays it out here: https://support.google.com/webmasters/answer/139394?hl=en
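To make that concrete (a minimal sketch, assuming service1 hosts the version you want treated as canonical), the duplicate service2 page would carry this in its <head>:
<link rel="canonical" href="https://domain.com/service1/product-x" />
The service1 page itself can also carry a self-referencing canonical, which is harmless and generally considered good practice.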
Let us know how it goes!
-
Hi Peter,
Thanks for your response. I probably should have said, but I have vastly simplified the description of this problem. Product-x is actually made up of lots of components and sub-pages within itself, which is the reason I can't re-write product-x for each service section.
So there will be lots of duplicate pages within each product-x section. The service 1 and 2 sections are unique, however.
Stuart
-
Hi Stuart
Is product-x just one component in each of these services along with other components?
If so, could you not solve this by having content about each service on its own service page, plus a small amount of info about product-x on each of those pages, with a 'read more' type link through to the actual product-x page?
So you would have:
domain.com/service1
domain.com/service2
both of which point to a page showing product-x at domain.com/product-x
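Purely as an illustration (the heading, copy and paths here are placeholders, not anything from your build), each service page could include a short, service-specific teaser like this:
<h2>Product X</h2>
<p>A couple of sentences about how product-x fits into this particular service...</p>
<a href="/product-x">Read more about product X</a>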
If that is not possible, then as far as I can see there isn't a way around having duplicate pages.
Peter