Duplicate Content on Product Pages
-
I'm getting a lot of duplicate content errors on my ecommerce site, www.outdoormegastore.co.uk, mainly centered on product pages.
The products are completely different in terms of their titles, meta data, product descriptions, and images (with alt tags), but SEOmoz is still identifying them as duplicates, and we've noticed a significant drop in Google rankings lately.
Admittedly, the product descriptions are a little thin, but I don't understand why the pages would be viewed as duplicates and ranked lower as a result. The content is definitely unique.
As an example, these three pages have been identified as duplicates of each other:
http://www.outdoormegastore.co.uk/regatta-landtrek-25l-rucksack.html
http://www.outdoormegastore.co.uk/canyon-bryce-adult-cycling-helmet-9045.html
http://www.outdoormegastore.co.uk/outwell-minnesota-6-carpet-for-green-07-08-tent.html
-
No, I don't have an exact match domain. But I have added duplicate content to a lot of product pages over the last two weeks. I believe Google has crawled all the product pages that contain duplicate content, and that's why I'm having ranking issues.
-
It may be the duplicate content, but you say the drop only started 3 days ago?
You don't have an exact match domain, do you?
-
Hi!
You wrote: "creating original product pages is the best option for avoiding duplicate content issues."
I'd like to discuss that sentence further. I am trying to create a user-friendly experience on my eCommerce website by adding specific product descriptions from the manufacturer's website to my site.
But my rankings have been suffering for the last 3 days. My major keywords (Office Chairs, Patio Umbrellas & Table Lamps) have each dropped around 10 positions. I drilled down into it and discovered this issue.
I added product descriptions from the manufacturer's website to the products in these categories, and I am 100% sure Google is dropping my rankings because of it.
-
Hi!
Actually, you are doing well; creating original product pages is the best option for avoiding duplicate content issues.
If I were you, I would also look at implementing a UGC review option, so that product pages become even more unique over time.
As for Overstock... honestly, I can't give you an answer as to why it doesn't seem to suffer any consequences for its duplication. That would need a deeper investigation, which I don't have time for right now.
-
This is a very good discussion on duplicate content. My website's rankings have been going down for the last 3 days.
I drilled down after the latest SEO update and found duplicate content on my product pages.
My website contains true duplicate content on its product pages, but I have also found the same duplicate content on a competitor's website (Overstock).
They are not having any ranking issues, yet certain category-level pages of mine are. I have added true duplicate content from the manufacturer's website to more than 2,000 pages on my site.
I am going to recover by removing the duplicate content or adding unique content to the product pages. Do SEOmoz users have any additional input?
Competitor website:
My website:
Manufacturer website:
-
Hi Gavin,
Just to clarify, SEOmoz flags your content as duplicate if it finds 95% HTML similarity. You can use an online tool to compare pages yourself. I like this one:
http://www.webconfs.com/similar-page-checker.php
Google obviously uses a more sophisticated method than Moz, but it's still a good warning because pages without much unique content - even if they aren't true duplicates - often have a difficult time ranking for their targeted keywords.
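If you want a rough feel for how this kind of check works, here is a minimal Python sketch that compares the raw HTML of two pages. It's purely illustrative, not Moz's actual algorithm (real tools normalise and weight the markup far more carefully), and the URLs are just the example product pages from the question:

```python
# Rough sketch of a raw HTML similarity check (illustrative only).
import difflib
import urllib.request

def html_similarity(url_a: str, url_b: str) -> float:
    """Fetch two pages and return their raw HTML similarity as a percentage."""
    html_a = urllib.request.urlopen(url_a).read().decode("utf-8", errors="ignore")
    html_b = urllib.request.urlopen(url_b).read().decode("utf-8", errors="ignore")
    # SequenceMatcher.ratio() returns a value between 0 and 1.
    return difflib.SequenceMatcher(None, html_a, html_b).ratio() * 100

if __name__ == "__main__":
    a = "http://www.outdoormegastore.co.uk/regatta-landtrek-25l-rucksack.html"
    b = "http://www.outdoormegastore.co.uk/canyon-bryce-adult-cycling-helmet-9045.html"
    print(f"HTML similarity: {html_similarity(a, b):.1f}%")
```

If two "different" products score close to 95% here, it's usually because the shared template dwarfs the handful of unique words in each description.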
-
Good catch!
-
Just body content.
You need a product template; this will make things easier. If you visit any major eCommerce website, you'll see that every product has the same layout.
So something like...
Title of product > Short description > Spec > FAQs > etc.
This is just an answer to a question on a forum, and it's already 100 or so words right here. You could have an FAQ section on the products: just make up the questions and answer them.
http://uk.answers.yahoo.com/question/index?qid=20090904093654AA2XDud
There are always ways of creating content; you just need to have a good think and something will come up.
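As a purely illustrative sketch of the template idea (the product details below are made up), each page is assembled from the same fixed sections, while the per-product copy, specs, and FAQs supply the unique words:

```python
# Hypothetical sketch of a fixed product-page layout; the section order
# mirrors the "Title > Short description > Spec > FAQs" template above.
def render_product_page(title, short_description, spec, faqs):
    """Assemble the body copy for one product from a fixed section layout."""
    parts = [title, short_description, "Specification", spec, "FAQs"]
    for question, answer in faqs:
        parts.append(f"Q: {question}")
        parts.append(f"A: {answer}")
    return "\n\n".join(parts)

# Example with invented details, just to show the shape of the output.
page = render_product_page(
    title="Regatta Landtrek 25L Rucksack",
    short_description="A lightweight 25-litre daypack for walking and travel.",
    spec="Capacity: 25L. Fabric: polyester. Hydration compatible.",
    faqs=[("Is it waterproof?", "It is shower resistant; use a rain cover in heavy rain.")],
)
print(page)
```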
-
Thanks for the reply.
Would the 300 unique words be spread across just the body content, or would they include meta data too?
It's going to be difficult describing tent pegs in 300 words or more!
-
Hello
The reason these are coming up as duplicate content is the thin content. You need at least 300 unique words on each page to make good content, and because these pages don't have that many words, they are being classed as duplicates.
If you add more to each description then this should change, and hopefully your rankings will too.
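As a quick illustrative sketch (the 300-word figure is this thread's rule of thumb, not an official Google threshold), you could flag thin descriptions like this, assuming the plain text has already been extracted from the HTML:

```python
# Sketch: flag product descriptions that fall under a word-count threshold.
# The 300-word cutoff is a rule of thumb from this thread, not a Google rule.
THIN_CONTENT_THRESHOLD = 300

def is_thin(body_text: str, threshold: int = THIN_CONTENT_THRESHOLD) -> bool:
    """Return True if the body copy has fewer words than the threshold."""
    return len(body_text.split()) < threshold

description = "The Regatta Landtrek 25L is a versatile rucksack for day walks."
if is_thin(description):
    print("Thin content: this description needs more unique copy.")
```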
Good luck
-
And the thin content you do have appears on multiple pages across your domain.