Duplicate Content on Product Pages
-
I'm getting a lot of duplicate content errors on my ecommerce site www.outdoormegastore.co.uk, mainly centred around product pages.
The products are completely different in terms of titles, metadata, product descriptions and images (with alt tags), but SEOmoz is still identifying them as duplicates, and we've noticed a significant drop in Google ranking lately.
Admittedly the product descriptions are a little thin, but I don't understand why the pages would be viewed as duplicates and ranked lower as a result. The content is definitely unique.
As an example, these three pages have been identified as duplicates of each other:
http://www.outdoormegastore.co.uk/regatta-landtrek-25l-rucksack.html
http://www.outdoormegastore.co.uk/canyon-bryce-adult-cycling-helmet-9045.html
http://www.outdoormegastore.co.uk/outwell-minnesota-6-carpet-for-green-07-08-tent.html
-
No, I don't have an exact match domain. But I have added duplicate content to too many product pages over the last two weeks. I believe Google has crawled all the product pages containing the duplicate content, and that is why I am seeing the ranking issue.
-
It may be the duplicate content, but you say it's only been the last 3 days?
You don't have an exact match domain, do you?
-
Hi!
Actually you are doing well, as creating original product pages is the best option for avoiding duplicate content issues.
I want to discuss this sentence further. I am trying to create a user-friendly experience on my eCommerce website by adding the manufacturer's product descriptions to my site.
But my rankings have been dropping for the last 3 days. My major keywords (Office Chairs, Patio Umbrellas and Table Lamps) have gone down around 10 positions. I drilled down further and identified this issue.
I added the manufacturer's product descriptions to the products in these categories. I am 100% sure Google is dropping my rankings because of this.
-
Hi!
Actually you are doing well, as creating original product pages is the best option for avoiding duplicate content issues.
If I were you, I would also look at implementing a UGC review option, so that product pages become even more unique over time.
About Overstock... honestly, I can't tell you why it doesn't seem to suffer any consequences for its duplication. That would need a deeper investigation, for which I don't have the time right now.
-
This is a very good discussion on duplicate content. Rankings on my website have been going down for the last 3 days.
I drilled down into the latest SEO update and found duplicate content on my product pages.
My website contains true duplicate content on product pages, but I have also found the same duplicate content on a competitor's website (Overstock).
They are not having any ranking issues, yet certain category-level pages on my site are. I have added true duplicates from the manufacturer's website to more than 2,000 pages on my site.
I plan to recover by removing the duplicate content or adding unique content to the product pages. Do SEOmoz users have any additional input?
Competitor website:
My website:
Manufacturer website:
-
Hi Gavin,
Just to clarify, SEOmoz flags your content as duplicate if it finds 95% HTML similarity. You can use an online tool to compare pages yourself. I like this one:
http://www.webconfs.com/similar-page-checker.php
Google obviously uses a more sophisticated method than Moz, but it's still a good warning, because pages without much unique content, even if they aren't true duplicates, often have a difficult time ranking for their targeted keywords.
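As a rough illustration of what a similarity checker does (this is a simplified sketch, not Moz's or webconfs' actual algorithm, and the tag-stripping here is deliberately crude), you can score the visible text of two pages with Python's standard library:

```python
# Sketch of a page-similarity check: strip tags, then compare the
# remaining text. Illustration only; real tools parse HTML properly
# and may compare markup as well as visible text.
import re
from difflib import SequenceMatcher

def visible_text(html: str) -> str:
    """Strip tags and collapse whitespace (crude, for illustration only)."""
    text = re.sub(r"<[^>]+>", " ", html)
    return " ".join(text.split()).lower()

def similarity(html_a: str, html_b: str) -> float:
    """Return a 0.0-1.0 similarity score for the visible text of two pages."""
    return SequenceMatcher(None, visible_text(html_a), visible_text(html_b)).ratio()

# Hypothetical page snippets, not the real outdoormegastore pages.
page_a = "<html><body><h1>Regatta Landtrek 25L Rucksack</h1><p>A 25 litre daypack.</p></body></html>"
page_b = "<html><body><h1>Canyon Bryce Cycling Helmet</h1><p>An adult cycling helmet.</p></body></html>"

print(f"similarity: {similarity(page_a, page_b):.2f}")
```

The intuition is the same as the 95% threshold: when two pages share the same boilerplate and only a few words differ, the score approaches 1.0.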
-
Good catch!
-
Just body.
You need a product template; this will make things easier. If you visit any major eCommerce website you will see that every product has the same layout.
So something like...
Title of product > short description > spec > FAQs > etc.
This reply is just an answer to a question on a forum and it's already 100 or so words. You could add an FAQ section to your products: make the questions up yourself and answer them.
http://uk.answers.yahoo.com/question/index?qid=20090904093654AA2XDud
There are always ways of creating content; have a good think and something will come up.
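That layout could be sketched as a simple template. This is a hedged illustration only; the field names and HTML structure below are hypothetical and not tied to any particular eCommerce platform:

```python
# Minimal sketch of the suggested product-page layout:
# title > short description > spec > FAQs.
# Field names are hypothetical, for illustration only.
PRODUCT_TEMPLATE = """\
<h1>{title}</h1>
<p>{short_description}</p>
<h2>Specification</h2>
<ul>{spec_items}</ul>
<h2>FAQs</h2>
{faq_items}
"""

def render_product(title, short_description, spec, faqs):
    """Fill the template from a spec dict and a list of (question, answer) pairs."""
    spec_items = "".join(f"<li>{k}: {v}</li>" for k, v in spec.items())
    faq_items = "".join(f"<h3>{q}</h3><p>{a}</p>" for q, a in faqs)
    return PRODUCT_TEMPLATE.format(
        title=title,
        short_description=short_description,
        spec_items=spec_items,
        faq_items=faq_items,
    )

html = render_product(
    "Regatta Landtrek 25L Rucksack",
    "A lightweight 25 litre daypack for day walks and travel.",
    {"Capacity": "25L", "Weight": "540g"},
    [("Is it waterproof?", "It is water resistant; a rain cover is recommended.")],
)
print(html)
```

The point of the template is consistency of layout, while the description, spec values and FAQ answers stay unique per product.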
-
Thanks for the reply..
Would the 300 unique words be spread across just the body content, or would that include meta words too?
It's going to be difficult describing tent pegs in 300 words or more!
-
Hello
The reason these are coming up as duplicate content is the thin content. You need at least 300 unique words on each page to make good content, and as you don't have that many words on these pages they are being classed as duplicates.
If you add more to each description then this should change, and hopefully your rankings with it.
Good luck
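If you want to sanity-check a description against that 300-unique-words rule of thumb (the threshold is this answer's guideline, not a published Google limit), a quick count is easy to script:

```python
# Count distinct words in a product description. "Unique" here means
# distinct tokens; the 300-word figure is a rule of thumb from the
# answer above, not an official threshold.
import re

def unique_word_count(text: str) -> int:
    """Return the number of distinct lowercase word tokens in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    return len(set(words))

# Hypothetical tent-peg description, for illustration.
description = (
    "These steel tent pegs are 18cm long and suit hard or stony ground. "
    "Pack of ten with a storage bag."
)
print(unique_word_count(description))
```

Running this across your catalogue would quickly show which product pages fall well short of the guideline.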
-
The thin content you do have is repeated across multiple pages on your domain.