Unavoidable duplicate page
-
Hi,
I have an issue where I need to duplicate content on a new site that I am launching. Visitors to the site need to think that product x is part of two different services. e.g.
Re-writing the content for product x for each service section is not an option, but I could possibly live with only one product-x page being indexed by search engines. What's the best way to do this? Any advice would be appreciated.
Thanks,
Stuart
-
Thanks for your help with this. In the end we decided to go down the nofollow route after discussing your suggestions with the developer.
Stu
-
It's okay, we all just want to give you the best answer we can. It can be tough without specifics to give you actionable advice, but I'll keep trying!
My gut feeling here (and please someone correct me if I'm wrong) is that if there is any appreciable difference in content (as you indicate above), then the dupe content thing may not be a problem. This sort of issue really only exists for pages reachable at multiple URLs (such as on more than one subdomain) or old mirrors that were never taken down. If you offer similar services with a lot of the same content but slightly different service sections, you shouldn't be flagged for dupes.
Why not skip any tactics for now, get the site launched, remove the older content, and use Moz to run a crawl and see if any further content is flagged? Sounds to me like this is a borderline dupe content issue and may not really be a problem for the engines. I don't think I can really say until the pages are live. Let us know when they are!
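For a rough pre-launch self-check (just a sketch, not a substitute for a real crawl — the page texts below are hypothetical stand-ins), Python's standard-library difflib can flag near-duplicate page bodies before the engines ever see them:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two page texts."""
    return SequenceMatcher(None, a, b).ratio()

# Illustrative page texts (hypothetical -- stand-ins for the real service sections)
page_service1 = "Product X details: specs, pricing, and availability for service one."
page_service2 = "Product X details: specs, pricing, and availability for service two."

score = similarity(page_service1, page_service2)
print(f"similarity: {score:.2f}")
if score > 0.9:
    print("these two pages are near-duplicates")
```

Anything scoring very high between two live URLs is a candidate for a canonical tag or a rewrite.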
-
The content in the 'details' subsections are identical between service 1 and 2, but the details sections aren't identical within their own service sections. Anyone confused yet?! I'm really sorry I can't share any links of the site, it's currently still in development.
Stu
-
A tentative yes. If the subsections are separate "pages", it should work the same way.
However, since you haven't shared the specific URLs, I can't visit the pages to see exactly how appropriate this is. I've optimized thousands of paginated and product pages with this problem, and rel-canonicaling them (yes, I made up that verb) worked best for us. If you'd like a more accurate answer, feel free to share example URLs for me to poke around in.
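To illustrate with the example URLs from this thread (a sketch only — it assumes the service1 versions are the ones you want indexed), each duplicate details page under service2 would name its service1 counterpart as the canonical:

```html
<!-- In the <head> of domain.com/service2/product-x/product-x-details-1 -->
<link rel="canonical" href="https://domain.com/service1/product-x/product-x-details-1" />

<!-- In the <head> of domain.com/service2/product-x/product-x-details-2 -->
<link rel="canonical" href="https://domain.com/service1/product-x/product-x-details-2" />
```

Each duplicate points at its own counterpart, page by page — not at a shared parent URL.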
-
That DOES sound cool...
So will that also work for any sub-sections within the product-x section too? I probably should have mentioned, but product-x has a subset of pages which aren't unique, e.g.
domain.com/service1/product-x/product-x-details-1
domain.com/service1/product-x/product-x-details-2
domain.com/service2/product-x/product-x-details-1
domain.com/service2/product-x/product-x-details-2
Stu
-
So if you have to FORCE the engines to prioritize one of multiple duplicates, you have a few options. Assuming you want both pages to exist, you can apply a nofollow or noindex rule to one of the duplicates. That's a blunt approach which works, but it's not really the coolest.
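As a sketch of the blunt approach (the URL in the comment is just the example from this thread), a robots meta tag in the head of the duplicate keeps it out of the index while still letting crawlers follow its links:

```html
<!-- In the <head> of the duplicate page, e.g. domain.com/service2/product-x -->
<meta name="robots" content="noindex, follow" />
```

Using "noindex, follow" rather than "noindex, nofollow" means the page stays un-indexed but its outbound links can still pass value.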
The coolest option is to give one a canonical tag. Telling the engines that one of the multiple duplicates is the "canonical" one is what Google itself recommends (and it has all sorts of neat downstream SEO benefits anyway).
So add a rel="canonical" tag pointing to the "proper" page. Matt lays it out here: https://support.google.com/webmasters/answer/139394?hl=en
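A minimal sketch using the example URLs from this thread, assuming the service1 version of product-x is the one you want indexed:

```html
<!-- In the <head> of domain.com/service2/product-x -->
<link rel="canonical" href="https://domain.com/service1/product-x" />
```

Both pages stay visible to visitors; the engines simply consolidate ranking signals onto the canonical URL.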
Let us know how it goes!
-
Hi Peter,
Thanks for your response. I probably should have said, but I have vastly simplified the description of this problem. Product-x is actually made up of lots of components and sub-pages within itself, which is why I can't re-write product-x for each service section.
So there will be lots of duplicate pages within each product-x section. The service 1 and 2 sections are unique however.
Stuart
-
Hi Stuart
Is product-x just one component in each of these services along with other components?
If so, could you not solve this by having content on each of the service pages about those services, plus a small amount of info about product-x on each, with a "read more" type link to the actual product-x page?
So you would have:
domain.com/service1
domain.com/service2
both of which point to a page showing product-x at domain.com/product-x.
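A minimal sketch of what that structure could look like on each service page (the class name and blurb are hypothetical — only the URLs come from this thread):

```html
<!-- On domain.com/service1 (and similarly on domain.com/service2) -->
<section class="product-teaser">
  <h2>Product X</h2>
  <p>A short, unique blurb about how product X fits into this service.</p>
  <a href="https://domain.com/product-x">Read more about product X</a>
</section>
```

Because the full product-x content lives at a single URL, there is nothing duplicated for the engines to flag.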
If that is not possible, then as far as I can see there isn't a way around having duplicate pages.
Peter