How to Fix Duplicate Page Content?
-
Our latest SEOmoz crawl reports 1138 instances of "duplicate page content."
I have long been aware that our duplicate page content is likely a major reason Google has de-valued our Web store.
Our duplicate page content is the result of the following:
1. We sell audio books and use the publisher's description (narrative) of the title. Google is likely recognizing the publisher as the owner / author of the description and our description as duplicate content.
2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description at our Web store is the same for each format, creating more duplicate content.
Here are two examples (one abridged, one unabridged) of one title at our Web store.
How much would the body content of one of the above pages have to change so that a SEOmoz crawl does NOT say the content is duplicate?
-
Just wanted to add a note that our tools do not detect duplicates across domains or on other websites, so these warnings are completely tied to your own pages/URLs.
These are "near" duplicates in our view, and Takeshi is right - there are many possible solutions. I'm guessing you can't directly combine them from an e-commerce standpoint, but I would suggest either making a "parent" page and using rel=canonical, or just making sure there's navigation between the formats/versions and then pointing rel=canonical at the most common version (i.e., the one your customers buy most often).
Technically, this will remove one version from ranking consideration, but I think that's preferable to having 100s or 1000s of versions out there and diluting your ranking ability or even having Panda-related problems. It's one thing if you have Amazon's link profile, but the rest of us aren't so lucky.
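To illustrate the second suggestion with hypothetical URLs (your actual product URLs will differ): each format variant page keeps its own navigation, but declares the most-purchased version as the canonical one.

```html
<!-- Placed in the <head> of each format variant page, e.g. the
     abridged and MP3-CD pages (hypothetical URLs). Google will
     consolidate ranking signals onto the unabridged CD page. -->
<link rel="canonical" href="https://www.example.com/audiobooks/example-title-unabridged-cd" />
```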
-
Good question. The canonical tag may be part of our solution.
I am also planning on having a "main" product with the description and any variations (abridged, unabridged, CD, MP3 CD) as subproducts which would use the main product's description. In other words, there would be only one product page with the description, not multiple. This will still result in our main product page having the same description as the publisher's. We have 1000s of audio products, so paying someone (or doing it ourselves) to create enough unique content for these pages would be prohibitively expensive. Some high-ranking competitors of ours use the same description as the publisher, so Google must be taking something else into consideration to value them much higher than us.
-
They are saying the pages on your site have duplicate content. Those two pages you linked are a perfect example. The content is exactly the same minus two words, which is more than enough for Google to register it as duplicate.
What I don't understand is what's wrong with a simple canonical tag in this instance? Do you really need both of these indexed?
-
When SEOmoz identifies pages at our Web store with duplicate content, is SEOmoz saying one or both of the following?
1. More than one page at our Web store has the same content.
2. One or more pages at our Web store has the same content as another page on the Web.
-
Agreed with everything Takeshi just said; he left out only one thing. Once you combine pages, make sure to 301 redirect the old pages to the new URL. If you don't want to combine them, remember to use rel=canonical to indicate which URL should carry the authority.
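On an Apache server, those redirects could look like this (a sketch with placeholder paths; the exact mechanism depends on your platform):

```apache
# .htaccess -- permanently redirect retired format pages to the
# combined "parent" product page (hypothetical paths).
Redirect 301 /audiobooks/example-title-abridged https://www.example.com/audiobooks/example-title
Redirect 301 /audiobooks/example-title-mp3-cd https://www.example.com/audiobooks/example-title
```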
Hope that helps.
-
There are no easy fixes here. Here are a few things that are common practice among etailers to reduce duplicate content:
- Combine similar pages into one. So abridged & unabridged would be on one page, with a drop-down menu to select the different versions of the product.
- Re-write the product descriptions, from scratch (you can hire people to do this).
- Add your own unique content in addition to the provided description, such as editorial reviews, recommendations, historical information, product specs, etc.
- Add user reviews, so that users can generate unique content for you.
- Create a unique user experience that improves the shopping experience on your site. Why should a user shop at your store, and not Amazon? Why should Google rank your site above Amazon? What differentiates you?
Like I said, there are no quick fixes for unique content. You either have to re-write the descriptions, add your own unique content, or both.
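If you do add user reviews, you can also mark them up as structured data so search engines recognize them as review content tied to the product; a minimal sketch using schema.org Product markup (all names and values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Audiobook Title (Unabridged CD)",
  "review": {
    "@type": "Review",
    "reviewRating": { "@type": "Rating", "ratingValue": "4" },
    "author": { "@type": "Person", "name": "Example Customer" },
    "reviewBody": "A customer-written review adds unique text to the page."
  }
}
</script>
```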