Is My Boilerplate Product Description Causing Duplicate Content Issues?
-
I have an e-commerce store with 20,000+ one-of-a-kind products. We only have one of each product, and once a product is sold we will never restock it, so I have no real need for these product pages to show up in SERPs. Each product has a boilerplate description into which the product's unique attributes (style, color, size) are plugged. A few sentences of the description are therefore exactly the same across all products.
Google Webmaster Tools doesn't report any duplicate content, and my Moz Crawl Report shows only 29 of these products as having duplicate content. But a Google search using the site: operator and some text from the boilerplate description turns up 16,400 product pages from my site.
Could this duplicate content be hurting the rankings of the other pages on the site that I am trying to rank? As I said, I'm not concerned about ranking these product pages. Should I add rel=canonical tags pointing to their respective product categories? Or use "noindex, follow" on every product? Or should I not worry about it?
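For reference, the two approaches I'm weighing would look something like this in each product page's head (the category URL below is just a placeholder):

```html
<!-- Option A: point the canonical at the product's category page.
     Note: Google treats rel=canonical as a hint and may ignore it
     when the two pages are not near-duplicates of each other. -->
<link rel="canonical" href="https://www.example.com/shop/category-name/" />

<!-- Option B: keep the product page out of the index while still
     letting crawlers follow the links on it. -->
<meta name="robots" content="noindex, follow" />
```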
-
My ranking for a competitive term I felt I was underperforming for dropped about 10 spots overnight after I added "noindex, follow" to the product pages. It went from the 3rd page to the 4th page, so it's not like I had a lot to lose. My rankings for less competitive long-tail keywords, which is where I'm getting most of my traffic, have dropped slightly or stayed the same.
Should I cross my fingers and hope for a recovery? Revert the product pages back to "index, follow"? Any thoughts?
-
Hi Tom,
Thanks so much for the thorough response.
Based on several comparative metrics with sites that are outranking me significantly, I do feel the site is underperforming. Because our traffic is ridiculously seasonal the Panguin Tool doesn't provide any clues.
I just added "noindex, follow" to all my products using Yoast's WordPress SEO plugin. We'll see what happens.
Thanks,
Zach -
Hi Zachary
I really can't be sure if it's having an adverse effect, but I wouldn't be surprised if it was.
Having looked at just 3 of the product pages, there is a problem with content being repeated, and I think it is compounded by there being no other content on the page to make it look unique.
Both are hallmarks of a potential Panda penalty, which could affect the performance of the pages themselves and/or the whole domain. So if you're seeing subpar performance (and even if you're performing well, it's worth reading on), I would look at the following solution.
For every product that you do not intend to restock or reuse, I would either add a noindex tag, add a 301 redirect, or simply remove the page and serve a 404. If we're talking tens of thousands, then having that many redirects might bloat your .htaccess file (making it larger and slower to load/process), and an instant drop of 20k URLs returning 404 errors might look a bit odd to Google as well. However, adding 20k noindex tags is a bit of a nightmare too.
You might want to try a combination of all 3 - a few 404 errors are nothing to worry about - but the logic is that you will be removing a number of pages that have this duplicate content on them, thus improving the quality of the domain. For your remaining 'live' pages, I'd highly recommend taking the time to add 200+ words of unique content about each product in order to avoid this happening again.
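As a rough sketch, the per-product 301s could be plain mod_alias rules in .htaccess (the product and category paths here are hypothetical); for thousands of URLs, a RewriteMap or a redirect handled at the application level would scale better than literal rules like these:

```apache
# .htaccess - send a few sold one-off products to their category page
Redirect 301 /shop/antique-lamp-1234 /shop/lighting/
Redirect 301 /shop/oak-desk-5678 /shop/furniture/
```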
An alternative solution would be to block bots from accessing the /shop/ subfolder in your robots.txt file, and then set up the shop and the currently active product listings on a different subdomain. You'd lose the ability to use the /shop/ folder, but it would be quicker than manually adding noindex tags or 301 redirects.
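That robots.txt rule is a one-liner (the subdomain below is a placeholder). One caveat worth knowing: robots.txt only blocks crawling, so URLs that are already indexed or well linked can still appear in results without a snippet; a noindex tag is the surer way to remove a page from the index.

```text
# robots.txt on www.example.com - keep crawlers out of the old shop folder
User-agent: *
Disallow: /shop/

# Active listings would then live at e.g. https://store.example.com/
```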
That's the method I would use if I wanted to address the issue. However, this **may not be necessary** if your site is not performing badly. You can check this to a degree with the Panguin Tool, which overlays your organic traffic in Analytics with Google updates - if a drop in traffic coincides with a Panda update, you may be under the effect of a penalty, in which case you should take action ASAP.
Hope this helps.
-
Hi Zachary
29 pages showing up in your Moz crawl report out of 16,400 indexed pages on your site is such a small percentage (0.18%, to be precise) that it is not worth worrying about. Also, if GWT is not reporting any issues, I think you should be fine.
Don't worry, be happy!
Peter