Unavoidable duplicate page
-
Hi,
I have an issue where I need to duplicate content on a new site that I am launching. Visitors to the site need to think that product-x is part of two different services, e.g.:
domain.com/service1/product-x
domain.com/service2/product-x
Re-writing the product-x content for each service section is not an option, but perhaps I could get around it by making sure only one product-x page is indexed by search engines. What's the best way to do this? Any advice would be appreciated.
Thanks,
Stuart
-
Thanks for your help with this. In the end we decided to go down the nofollow route after discussing your suggestions with our developer.
Stu
-
It's okay, we all just want to give you the best answer we can. It can be tough to give actionable advice without specifics, but I'll keep trying!
My gut feeling here (and please someone correct me if I'm wrong) is that if there is any appreciable difference in content (as you indicate above), then the dupe content thing may not be a problem. This sort of problem really only exists for pages with multiple URLs (like the same page on more than one subdomain) or old mirrors that were never taken down and such. If you offer similar services with a lot of the same content but slightly different service sections, you shouldn't be flagged for dupes.
Why not skip any tactics for now, get the site launched, remove the older content, and use Moz to run a crawl and see if any further content is flagged? Sounds to me like this is a borderline dupe content issue and may not really be a problem for the engines. I don't think I can really say until the pages are live. Let us know when they are!
-
The content in the 'details' subsections is identical between service 1 and 2, but the details sections aren't identical within their own service sections. Anyone confused yet?! I'm really sorry I can't share any links to the site, as it's currently still in development.
Stu
-
Tentatively, yes. If the subsections are separate "pages", it should work the same way.
However, since you haven't been specific with the URLs, I can't visit the pages to see exactly how appropriate this is. I've optimized thousands of paginated and product pages with this problem, and rel-canonicaling (yes, I made up that verb) them worked best for us. If you'd like a more accurate answer, feel free to share example URLs for me to poke around in.
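For instance, assuming the service1 copies are the ones you'd want indexed, each duplicated details page under service2 would carry its own canonical tag pointing at its service1 counterpart, something like:

<!-- In the head of domain.com/service2/product-x/product-x-details-1 -->
<link rel="canonical" href="https://domain.com/service1/product-x/product-x-details-1" />

The same pattern then repeats for details-2 and any other duplicated sub-page.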
-
That DOES sound cool...
So will that also work for any sub-sections within the product-x section too? I probably should have mentioned this before, but product-x has a subset of pages which aren't unique, e.g.:
domain.com/service1/product-x/product-x-details-1
domain.com/service1/product-x/product-x-details-2
domain.com/service2/product-x/product-x-details-1
domain.com/service2/product-x/product-x-details-2
Stu
-
So if you have to FORCE the engines to prioritize one of multiple duplicates, you have a few options. Assuming you want both pages to exist, you can apply a noindex rule to the duplicate you want kept out of the index (a nofollow rule only affects the links on a page, so noindex is the one that actually keeps a page out of the index). This is a blunt approach which works well, but it's not really the coolest.
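As a rough sketch of that blunt approach, the duplicate page would carry a meta robots tag in its head:

<!-- In the head of the duplicate page you want kept out of the index -->
<meta name="robots" content="noindex, follow" />

The "follow" value lets the engines keep crawling the links on the page even though the page itself won't be indexed.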
The coolest option is to give one a canonical tag. Telling the engines that one of the multiple duplicates is the "canonical" one is what Google itself recommends (and it has all sorts of neat downstream SEO benefits anyway).
So add a rel="canonical" tag on each duplicate pointing at the "proper" page. Matt lays it out here: https://support.google.com/webmasters/answer/139394?hl=en
Let us know how it goes!
-
Hi Peter,
Thanks for your response. I probably should have said, but I have vastly simplified the description of this problem. Product-x is actually made up of lots of components and sub-pages within itself, which is the reason I can't re-write product-x for each service section.
So there will be lots of duplicate pages within each product-x section. The service 1 and 2 sections are unique, however.
Stuart
-
Hi Stuart
Is product-x just one of several components within each of these services?
If so, could you not solve this by having content on each of the service pages about those services, plus a small amount of info about product-x on each of those pages with a 'read more' type link through to a single product-x page (something like the sketch below)?
So you would have:
domain.com/service1
domain.com/service2
both of which point to a single page showing product-x at domain.com/product-x
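As a rough illustration (the markup and copy here are just placeholders), the teaser on each service page might look like:

<!-- On domain.com/service1 (and similarly on domain.com/service2) -->
<section>
  <h2>Product X</h2>
  <p>A short, service-specific blurb about how product-x fits this service.</p>
  <a href="/product-x">Read more about product X</a>
</section>

The full product-x content then lives at a single URL, so nothing is duplicated.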
If that is not possible, then as far as I can see there isn't a way around having duplicate pages.
Peter