Unavoidable duplicate page
-
Hi,
I have an issue where I need to duplicate content on a new site that I am launching. Visitors to the site need to think that product x is part of two different services. e.g.
Re-writing the content of product x for each service section is not an option, but perhaps I could get around it by making sure only one product-x page is indexed by search engines. What's the best way to do this? Any advice would be appreciated.
Thanks,
Stuart
-
Thanks for your help with this. In the end we decided to go down the nofollow route after running your suggestions past the developer.
Stu
-
It's okay, we all just want to give you the best answer we can. It can be tough to give you actionable advice without specifics, but I'll keep trying!
My gut feeling here (and please someone correct me if I'm wrong) is that if there is any appreciable difference in content (as you indicate above), then the dupe content thing may not be a problem. This sort of problem usually only affects pages served from multiple URLs (like in more than one subdomain) or old mirrors that were never taken down and such. If you offer similar services with a lot of the same content but slightly different service sections, you shouldn't be flagged for dupes.
Why not skip any tactics for now, get the site launched, remove the older content, and use Moz to run a crawl and see if any further content is flagged? Sounds to me like this is a borderline dupe content issue and may not really be a problem for the engines. I don't think I can really say until the pages are live. Let us know when they are!
-
The content in the 'details' subsections are identical between service 1 and 2, but the details sections aren't identical within their own service sections. Anyone confused yet?! I'm really sorry I can't share any links of the site, it's currently still in development.
Stu
-
Tentative yes. If the subsections are separate "pages," the same approach should work for them.
However, since you aren't specific with the URLs, I can't visit the pages to see exactly how well this applies. I've optimized thousands of paginated and product pages with this problem, and rel-canonicaling (yes, I made up that verb) them worked best for us. If you'd like a more accurate answer, feel free to share example URLs for me to poke around in.
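To make that concrete with the detail URLs you listed: each duplicate sub-page would carry its own canonical tag pointing at the matching sub-page in whichever service section you pick as the primary one. A sketch, assuming (hypothetically) that service1 is the version you want indexed:

```html
<!-- In the <head> of domain.com/service2/product-x/product-x-details-1 -->
<link rel="canonical" href="http://domain.com/service1/product-x/product-x-details-1" />

<!-- In the <head> of domain.com/service2/product-x/product-x-details-2 -->
<link rel="canonical" href="http://domain.com/service1/product-x/product-x-details-2" />
```

The canonicals are page-to-page, not section-to-section, so each duplicate needs its own tag pointing at its own counterpart.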
-
That DOES sound cool...
So will that also work for any sub-sections within the product-x section too? I probably should have mentioned, but product-x has a subset of pages which aren't unique, e.g.
domain.com/service1/product-x/product-x-details-1
domain.com/service1/product-x/product-x-details-2
domain.com/service2/product-x/product-x-details-1
domain.com/service2/product-x/product-x-details-2
Stu
-
So if you have to FORCE the engines to prioritize one of multiple duplicates, you have a few options. Assuming you want both pages to exist, you can apply a noindex or nofollow rule to one of the duplicates. This is a blunt approach that works well, but it's not really the coolest.
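For reference, the noindex version of that blunt approach is just a meta robots tag in the head of whichever duplicate you want kept out of the index — a sketch, not pulled from Stuart's actual site:

```html
<!-- In the <head> of the duplicate page you want excluded from the index -->
<meta name="robots" content="noindex, follow">
```

The "follow" part tells the engines to keep crawling the links on the page even though the page itself is dropped from the index.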
The coolest option is to give one a canonical tag. Telling the engines that one of the multiple duplicates is the "canonical" version is exactly what Google recommends (and it has all sorts of neat downstream SEO benefits anyway).
So add a rel="canonical" tag on the duplicates pointing to the "proper" page. Matt lays it out here: https://support.google.com/webmasters/answer/139394?hl=en
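Concretely, the tag is a single link element in the head of the duplicate page. A minimal sketch using the hypothetical URLs from this thread, assuming service1 holds the version you want indexed:

```html
<!-- In the <head> of domain.com/service2/product-x (the duplicate) -->
<link rel="canonical" href="http://domain.com/service1/product-x" />
```

Google then consolidates ranking signals from both URLs onto the canonical one, while both pages stay live for visitors.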
Let us know how it goes!
-
Hi Peter,
Thanks for your response. I probably should have said, but I have vastly simplified the description of this problem. Product-x is actually made up of lots of components and sub-pages within itself, which is the reason I can't re-write product-x for each service section.
So there will be lots of duplicate pages within each product-x section. The service 1 and 2 sections are unique however.
Stuart
-
Hi Stuart
Is product-x just one component in each of these services along with other components?
If so, could you not solve this by having content on each of the service pages about those services, and then a small amount of info about product-x on each of those pages with a "read more" type link to the actual product-x page?
So you would have:
domain.com/service1
domain.com/service2
both of which point to a page showing product-x at domain.com/product-x
If that is not possible, then there isn't a way around having duplicate pages as far as I can see.
Peter