Duplicate Content on Product Pages
-
I'm getting a lot of duplicate content errors on my ecommerce site, www.outdoormegastore.co.uk, mainly centered around product pages.
The products are completely different in terms of their titles, meta data, product descriptions and images (with alt tags), but SEOmoz is still identifying them as duplicates, and we've noticed a significant drop in Google rankings lately.
Admittedly the product descriptions are a little thin, but I don't understand why the pages would be viewed as duplicates and ranked lower as a result. The content is definitely unique.
As an example these three pages have been identified as being duplicates of each other.
http://www.outdoormegastore.co.uk/regatta-landtrek-25l-rucksack.html
http://www.outdoormegastore.co.uk/canyon-bryce-adult-cycling-helmet-9045.html
http://www.outdoormegastore.co.uk/outwell-minnesota-6-carpet-for-green-07-08-tent.html
-
No, I don't have an exact-match domain. But I have added duplicate content to a lot of product pages over the last two weeks. I believe Google has crawled all of the product pages that contain duplicate content, and that is when the ranking issue started.
-
It may be the duplicate content, but you say the last 3 days?
You don't have an exact-match domain, do you?
-
Hi!
"Actually you are doing well, as creating original product pages is the best option in order to avoid duplicated content issues."
I want to discuss that sentence a bit more. I am trying to create a user-friendly experience on my eCommerce website, and I have been copying specific product descriptions from the manufacturer's website onto my own.
But my rankings have been dropping for the last 3 days. My major keywords (office chairs, patio umbrellas and table lamps) have each fallen around 10 positions. I drilled down into it and came across this duplicate content issue.
I added the manufacturer's product descriptions to the products in these categories, and I am 100% sure Google is dropping my rankings because of this.
-
Hi!
Actually you are doing well, as creating original product pages is the best option for avoiding duplicate content issues.
If I were you, I would eventually look at implementing a UGC review option, so that the product pages become even more unique over time.
As for Overstock... honestly, I can't give you an answer as to why it doesn't seem to be suffering any consequences for its duplication. That would need a deeper investigation, for which I don't have the time right now.
-
This is a very good discussion on duplicate content. My website's rankings have been going down for the last 3 days.
I drilled down with the latest SEO update and came across duplicate content on my product pages.
My website contains true duplicate content on its product pages. But I have also found duplicate content on a competitor's website (Overstock).
They are not having any ranking issues, yet certain of my category-level pages are. I have copied true duplicates from the manufacturer's website onto more than 2,000 of my pages.
I am going to recover by removing the duplicate content or adding unique content to the product pages. Do any SEOmoz users have additional input?
Competitor website:
My website:
Manufacturer website:
-
Hi Gavin,
Just to clarify, SEOmoz flags your content as duplicate if it finds 95% or greater HTML similarity. You can use an online tool to compare pages yourself. I like this one:
http://www.webconfs.com/similar-page-checker.php
Google obviously uses a more sophisticated method than Moz, but it's still a good warning, because pages without much unique content - even if they aren't true duplicates - often have a difficult time ranking for their targeted keywords.
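To get a feel for how such a similarity check works, here's a minimal sketch (my own illustration, not Moz's or webconfs' actual algorithm) using only Python's standard library: it compares the raw HTML of two pages and returns a percentage, much like the checker linked above.

```python
import difflib

def page_similarity(html_a: str, html_b: str) -> float:
    """Return the similarity of two HTML documents as a percentage (0-100)."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio() * 100

# Two hypothetical product pages sharing a template but with different copy.
page_a = "<html><body><h1>Landtrek 25L Rucksack</h1><p>A light daypack.</p></body></html>"
page_b = "<html><body><h1>Bryce Cycling Helmet</h1><p>A vented helmet.</p></body></html>"

print(round(page_similarity(page_a, page_b)))
```

When the shared template (header, nav, footer) dwarfs a one-line product description, this kind of score climbs towards the 95% threshold even though the descriptions differ - which is exactly the thin-content problem being discussed in this thread.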
-
Good catch!
-
Just the body.
You need a product template; this will make it easier. If you visit any major eCommerce website you will see every product has the same layout.
So something like...
Title of product > short description > spec > FAQs > etc.
This is just an answer to a question on a forum, and it's already 100 or so words right here; you could have an FAQ section on the products and just make the questions up and answer them.
http://uk.answers.yahoo.com/question/index?qid=20090904093654AA2XDud
There are always ways of creating content; you just need to have a good think and something will come up.
-
Thanks for the reply..
Would the 300 unique words be spread across just the body content, or would they include meta data too?
It's going to be difficult describing tent pegs in 300 words or more!
-
Hello,
The reason these are coming up as duplicate content is the thin content. You need at least 300 unique words on each page to make good content, and as you don't have that many words on these pages, they are being classed as duplicates.
If you add more to each description then this will change, and hopefully your rankings will too.
Good luck
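As a rough illustration of the 300-unique-words rule of thumb above (my own sketch, not an official Moz check), here's how you might count the distinct words in a page's body text with nothing but the Python standard library:

```python
import re

def unique_word_count(html: str) -> int:
    """Crudely strip tags, then count distinct words in the remaining text."""
    text = re.sub(r"<[^>]+>", " ", html)          # drop HTML tags
    words = re.findall(r"[a-z']+", text.lower())  # normalise to lowercase words
    return len(set(words))

page = "<html><body><h1>Tent pegs</h1><p>Strong steel tent pegs.</p></body></html>"
print(unique_word_count(page))  # prints 4: tent, pegs, strong, steel
```

A real crawler parses HTML properly rather than regex-stripping it, but even this crude version shows how little unique text a short product blurb actually contributes.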
-
The thin content you do have is on multiple pages on your domain.
Related Questions
-
Penalty for duplicate content on the same website?
Is it possible to get a penalty for duplicate content on the same website? I have an old custom-built site with a large number of filters that are pre-generated for speed. Basically the only difference between the pages is the meta title and H1 tag, with a few text differences here and there. Obviously I could nofollow all the filter links, but it would take an enormous amount of work. The site is performing well in search. I'm trying to decide whether there is a risk of a penalty; if not, I'm loath to do anything in case it causes other issues.
Intermediate & Advanced SEO | | seoman100 -
Duplicated Content with Index.php
Good afternoon, My website uses the Joomla CMS and has the htaccess rewrite code enabled to ensure the use of search engine friendly URLs (SEFs). While browsing the crawl diagnostics I found that Moz considers the /index.php URL a duplicate of our root. I was always under the impression that the htaccess rewrite took care of that issue, and obviously I would like to address it. I attempted to create a 301 redirect from the index.php URL to the root, but ran into an issue when attempting to log in to the admin portion of the website, as the redirect sent me back to the homepage. I was curious if anyone had advice for handling the index.php duplication issue, specifically with Joomla. Additionally, I have confirmed that in Google Webmasters, under URL parameters, the index.php parameter is set as 'Representative URL'.
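For what it's worth, one commonly used .htaccess pattern for this situation (a hedged sketch only - test it on a staging copy first, as Joomla setups vary) redirects external requests for /index.php while matching on THE_REQUEST, so Joomla's own internal rewrite of / to index.php doesn't loop, and the /administrator back-end is explicitly excluded so admin logins keep working:

```apache
# Sketch: 301 external requests for /index.php to the root.
# Matching THE_REQUEST (the literal request line the browser sent) avoids
# a redirect loop with Joomla's internal rewrite of / to index.php.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/administrator
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/index\.php[\s?] [NC]
RewriteRule ^index\.php$ / [R=301,L]
```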
Intermediate & Advanced SEO | | BrandonEML0 -
Partial duplicate content and canonical tags
Hi - I am rebuilding a consumer website, and each product page will contain a unique product image and a sentence or two about the product (and we tend to use a lot of the same words in different ways across products). I'd like to have a tabbed area below the product info that talks about the overall product line, and this content would be duplicated across all the product pages (a "Why use our products" type of thing). I'd also have this duplicate content living on its own URLs so it can be found on its own in the SERPs. The question is, do I need to add the canonical tag to this page, since there's partial duplicate content on the product pages? And if I did that, would my product pages go un-indexed? I understand how to handle completely duplicated content; it's the partial duplication that I'm having difficulty figuring out.
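For reference, a canonical tag is a single line in the <head> of the duplicated page pointing at the version you want indexed (the URL below is a placeholder, not the poster's real site):

```html
<!-- In the <head> of a page that repeats the "Why use our products" copy -->
<link rel="canonical" href="https://www.example.com/why-use-our-products/" />
```

Note that rel=canonical asks search engines to consolidate signals onto the target URL; it is intended for duplicate or near-duplicate pages, so on pages that are only partially duplicated it may simply be ignored.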
Intermediate & Advanced SEO | | Jenny10 -
Can a website be punished by Panda if content scrapers have duplicated its content?
I've noticed recently that a number of content scrapers are linking to one of our websites and have the duplicate content on their web pages. Can content scrapers affect the original website's ranking? I'm concerned that having duplicated content, even if hosted by scrapers, could be a bad signal to Google. What are the best ways to prevent this happening? I'd really appreciate any help as I can't find the answer online!
Intermediate & Advanced SEO | | RG_SEO0 -
Merge content pages together to get one deep high quality content page - good or not !?
Hi, I manage the SEO of a branded poker website that provides very good ongoing content around specific poker tournaments, but all this content is split into dozens of pages in different sections of the website (blog section, news section, tournament section, promotion section). It seems that today, having one deep piece of content on one page has a better chance of getting mentions / social signals / links, and therefore higher authority / ranking / traffic, than if that content were split into dozens of pages. But the poker website I work for, and many other websites, naturally generate good content targeting long-tail keywords around a specific topic in different sections of the website on an ongoing basis. Do we need to merge those content pages into one page once in a while? If yes, what technical implementation would you advise? (Copy and readjust/restructure all the content into one page, then 301 the old URLs to it?) Thanks Jeremy
Intermediate & Advanced SEO | | Tit0 -
Duplicate Content From Indexing of non- File Extension Page
Google has somehow indexed a page of mine without the .html extension, so they indexed www.samplepage.com/page, and I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google or Bing or whoever to only index and see the page including the .html extension? I know people say not to use the file extension on pages, but I want to, so please, anybody... HELP!!!
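One common .htaccess approach (a sketch assuming an Apache server - adjust for your setup) is to 301 the extensionless URL to its .html counterpart, so only one version remains indexable:

```apache
# Sketch: if /page is requested and /page.html exists on disk,
# 301-redirect the request to the .html version.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ /$1.html [R=301,L]
```

A rel=canonical tag on each .html page pointing at itself would also reinforce which version you prefer.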
Intermediate & Advanced SEO | | WebbyNabler0 -
Adding a huge new product range to eCommerce site and worried about Duplicate Content
Hey all, We currently run a large eCommerce site that has around 5,000 pages of content and ranks quite strongly for a lot of key search terms. We have just finalised a business agreement to incorporate a new product line that complements our existing catalogue, but I am concerned about dumping this huge amount of content (which is sourced via an API) onto our site, and the effect it might have dragging us down for our existing type of product. In regards to the best way to handle it, we are looking at a few ideas and wondered what SEOMoz thought was best. Some approaches we are tossing around include:
- making each page point to the original API the data comes from as the canonical source (not ideal, as I don't want to pass link juice from our site to theirs)
- adding "noindex" to all the new pages so Google simply ignores them, and hoping we get side sales onto our existing products instead of trying to rank, as the new range is highly competitive (again not ideal, as we would like to get whatever organic traffic we can)
- manually rewriting each and every new product page's descriptions, tags, etc. (a huge undertaking in terms of working hours, given around 4,400 new items will be added to our catalogue).
Currently the industry standard seems to be to just pull the text from the API and leave it, but exact text searches show that there are literally hundreds of other sites using the exact same duplicate content... I would like to persuade higher management to invest the time into rewriting each individual page, but it would be a huge task and difficult to maintain as changes continually happen. Sorry for the wordy post, but this is a big decision that potentially has drastic effects on our business, as the vast majority of it is conducted online. Thanks in advance for any helpful replies!
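For clarity, the "noindex" option mentioned above is a one-line robots meta tag in each new page's <head> (a sketch only; the "follow" directive keeps the page's links crawlable):

```html
<!-- Keep this page out of the index, but still let its links be followed -->
<meta name="robots" content="noindex, follow" />
```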
Intermediate & Advanced SEO | | ExperienceOz0 -
Diagnosing duplicate content issues
We recently made some updates to our site, one of which involved launching a bunch of new pages. Shortly afterwards we saw a significant drop in organic traffic. Some of the new pages list similar content to what previously existed on our site, but in different orders. So our question is: what's the best way to diagnose whether this was the cause of our ranking drop? My current thought is to block the new directories via robots.txt for a couple of days and see if traffic improves. Is this a good approach? Any other suggestions?
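If you do try the temporary block described above, it's only a couple of lines in robots.txt (the directory names below are hypothetical placeholders). Bear in mind that robots.txt stops crawling, not indexing, so already-indexed pages can linger in results, making it a blunt diagnostic at best:

```text
# Hypothetical: temporarily block the newly launched directories
User-agent: *
Disallow: /new-widgets/
Disallow: /new-gadgets/
```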
Intermediate & Advanced SEO | | jamesti0