How to Fix Duplicate Page Content?
-
Our latest SEOmoz crawl reports 1138 instances of "duplicate page content."
I have long been aware that our duplicate page content is likely a major reason Google has de-valued our Web store.
Our duplicate page content is the result of the following:
1. We sell audio books and use the publisher's description (narrative) of each title. Google is likely recognizing the publisher as the owner/author of the description and treating ours as duplicate content.
2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description is repeated at our Web store for each format, creating even more duplicate content.
Here are two examples (one abridged, one unabridged) of one title at our Web store.
How much would the body content of one of the above pages have to change so that a SEOmoz crawl does NOT say the content is duplicate?
-
Just wanted to add a note that our tools do not detect duplicates across domains or on other websites, so these warnings are completely tied to your own pages/URLs.
These are "near" duplicates in our view, and Takeshi is right - there are many possible solutions. I'm guessing you can't directly combine them, from an e-commerce standpoint, but I would suggest either making a "parent" page and using rel=canonical, or just making sure there's navigation between the formats/versions and then pointing rel=canonical to the most common version (i.e., the one your customers buy most often).
Technically, this will remove one version from ranking consideration, but I think that's preferable to having 100s or 1000s of versions out there and diluting your ranking ability or even having Panda-related problems. It's one thing if you have Amazon's link profile, but the rest of us aren't so lucky.
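A minimal sketch of the second option, with hypothetical URLs for illustration: each format page keeps plain navigation to its siblings but declares the most-purchased version as canonical.

```html
<!-- On the abridged page (URLs are hypothetical examples): -->
<head>
  <!-- Point search engines at the version customers buy most often -->
  <link rel="canonical" href="http://www.example.com/title-unabridged-cd" />
</head>
<body>
  <!-- Keep visible navigation between formats so shoppers can still switch -->
  <ul>
    <li><a href="/title-abridged">Abridged CD</a></li>
    <li><a href="/title-unabridged-cd">Unabridged CD</a></li>
    <li><a href="/title-unabridged-mp3">Unabridged MP3</a></li>
  </ul>
</body>
```

As noted above, only the canonical URL then competes in rankings; the non-canonical formats remain fully usable for shoppers.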
-
Good question. The canonical tag may be part of our solution.
I am also planning on having a "main" product with the description and any variations (abridged, unabridged, CD, MP3 CD) as subproducts that would use the main product's description. In other words, there would be only one product page with the description, not multiple. This will still result in our main product page having the same description as the publisher's. We have thousands of audio products, so paying someone (or doing it ourselves) to create enough unique content for these pages would be prohibitive. Some high-ranking competitors of ours use the same description as the publisher, so Google must be taking something else into consideration to value them much higher than us.
-
They are saying the pages on your site have duplicate content. Those two pages you linked are a perfect example. The content is exactly the same minus two words, which is more than enough for Google to register it as duplicate.
What I don't understand is what's wrong with a simple canonical tag in this instance? Do you really need both of these indexed?
-
When SEOmoz identifies pages at our Web store with duplicate content, is SEOmoz saying one or both of the following:
1. More than one page at our Web store has the same content.
2. One or more pages at our Web store has the same content as another page on the Web.
-
Agreed with everything Takeshi just said; he left out only one thing. Once you combine pages, make sure to 301 redirect the old pages to the new URL. If you don't want to combine them, remember to use rel=canonical to indicate which URL is the authoritative one.
Hope that helps.
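For the redirect step, here is a hedged sketch in Apache .htaccess syntax using mod_alias (the paths are hypothetical; adjust them to your store's actual URL structure):

```apache
# Permanently redirect the retired per-format pages to the combined product page
Redirect 301 /title-abridged /title
Redirect 301 /title-unabridged-mp3 /title
```

A 301 (permanent) redirect passes visitors and link equity from the old URLs to the new one; other servers and CMSes have equivalent settings.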
-
There are no easy fixes here. Here are a few things that are common practice among etailers to reduce duplicate content:
- Combine similar pages into one. So abridged & unabridged would be on one page, with a drop-down menu to select the different versions of the product.
- Re-write the product descriptions, from scratch (you can hire people to do this).
- Add your own unique content in addition to the provided description, such as editorial reviews, recommendations, historical information, product specs, etc.
- Add user reviews, so that users can generate unique content for you.
- Create a unique user experience that improves the shopping experience on your site. Why should a user shop at your store, and not Amazon? Why should Google rank your site above Amazon? What differentiates you?
Like I said, there are no quick fixes for unique content. You either have to re-write the descriptions, add your own unique content, or both.
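The first suggestion above, combining similar pages into one, can be sketched as follows (the title, formats, and markup are hypothetical illustrations, not a specific platform's template):

```html
<!-- A single product page covering all formats of one title -->
<h1>Example Audiobook Title</h1>
<label for="format">Format:</label>
<select id="format" name="format">
  <option value="abridged-cd">Abridged CD</option>
  <option value="unabridged-cd">Unabridged CD</option>
  <option value="unabridged-mp3">Unabridged MP3</option>
</select>
<!-- One description block serves every format, so nothing is duplicated -->
<p>Publisher's description, plus your own editorial notes and reviews, goes here.</p>
```

One page per title means one URL carries all the ranking signals, instead of splitting them across near-identical format pages.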