Is My Boilerplate Product Description Causing Duplicate Content Issues?
-
I have an e-commerce store with 20,000+ one-of-a-kind products. We only have one of each product, and once a product is sold we will never restock it, so I have no real need for these product pages to show up in SERPs. Each product has a boilerplate description into which the product's unique attributes (style, color, size) are plugged. But a few sentences of the description are exactly the same across all products.
Google Webmaster Tools doesn't report any duplicate content. My Moz crawl report shows 29 of these products as having duplicate content. But a Google search using the site: operator and some text from the boilerplate description turns up 16,400 product pages from my site.
Could this duplicate content be hurting my rankings for the other pages on the site that I am trying to rank? As I said, I'm not concerned about ranking these product pages. Should I add a rel=canonical tag pointing each one to its product category? Or use "noindex, follow" on every product? Or should I not worry about it?
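For clarity, here is roughly what each of those two options would look like in the `<head>` of a product page (the URLs are hypothetical placeholders, not my actual site):

```html
<!-- Option 1: point the product page at its category (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/shop/vintage-lamps/" />

<!-- Option 2: keep the page out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

As I understand it, Google treats a cross-page canonical as a hint rather than a directive, while the noindex tag is a directive, which is part of why I'm unsure which to choose.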
-
My ranking for a competitive term I felt I was underperforming for dropped about 10 spots overnight after I added "noindex, follow" to the product pages. It went from the 3rd page to the 4th, so it's not like I had a lot to lose. My rankings for less competitive long-tail keywords, which is where I'm getting most of my traffic, have dropped slightly or stayed the same.
Should I cross my fingers and hope for a recovery? Revert the product pages back to "index, follow"? Any thoughts?
-
Hi Tom,
Thanks so much for the thorough response.
Based on several comparative metrics with sites that are outranking me significantly, I do feel the site is underperforming. Because our traffic is ridiculously seasonal the Panguin Tool doesn't provide any clues.
I just added the noindex tag to all my products using Yoast's WordPress SEO plugin. We'll see what happens.
Thanks,
Zach -
Hi Zachary
I really can't be sure if it's having an adverse effect, but I wouldn't be surprised if it was.
Having looked at just 3 of the product pages, there is clearly a problem with content being repeated, and I think it is compounded by there being no other content on the page to make it look unique.
Both are hallmarks of a potential Panda penalty, which could affect the performance of those pages and/or the whole domain. So if you're seeing subpar performance (and even if you're performing well, it's worth reading on), I would look at the following solution.
For every product that you do not intend to restock or reuse, I would either add a noindex tag, add a 301 redirect, or simply remove the page and serve a 404. If we're talking tens of thousands, then having that many redirects might bloat your .htaccess file (making it larger and slower to load/process), and an instant drop of 20k URLs returning 404 errors might look a bit odd to Google as well. However, adding 20k noindex tags is a bit of a nightmare too.
You might want to try a combination of all 3 - a few 404 errors is nothing to worry about - but the logic is that you will be removing a number of pages that have this duplicate content on them, thus improving the quality of the domain. For your remaining 'live' pages, I'd highly recommend taking the time to add 200+ words of unique content about each product in order to avoid this happening again.
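As a rough sketch of the redirect option (all paths here are hypothetical), individual 301s in .htaccess would look like the first two lines below. If you can group sold products under a common URL pattern, a single RedirectMatch rule avoids listing 20k redirects one by one:

```apache
# One-off 301s for individual sold products (hypothetical paths)
Redirect 301 /shop/antique-oak-desk/ /shop/desks/
Redirect 301 /shop/red-velvet-chair/ /shop/chairs/

# If sold items share a common path, one pattern rule covers them all
# and keeps the .htaccess file small
RedirectMatch 301 ^/shop/sold/.* /shop/
```

Both directives are part of Apache's mod_alias, so no extra modules are needed; the pattern approach only works if your URL structure lets you distinguish sold products.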
An alternative solution would be to block the bots from accessing the /shop/ subfolder in your robots.txt file, and then set up the shop and the currently active product listings on a different subdomain. You'd lose the ability to use the /shop/ folder, but it would be quicker than manually adding tags or 301 redirects.
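A minimal robots.txt along those lines (assuming the old shop lives under /shop/ on the main domain) would be:

```text
# robots.txt on www.example.com - keep all crawlers out of the old shop folder
User-agent: *
Disallow: /shop/
```

One caveat worth knowing: robots.txt blocks crawling, not indexing, so URLs that are already indexed can linger in the results for a while even after the block is in place.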
That's the method I would use if I wanted to address the issue. However, this **may not be necessary** if your site is not performing badly. You can check this to a degree with the Panguin Tool, which overlays your organic traffic in Analytics with Google updates. If a drop in traffic coincides with a Panda update, you may be under the effect of a penalty, in which case you should take action ASAP.
Hope this helps.
-
Hi Zachery
29 pages showing up in your Moz crawl report out of 16,400 indexed pages on your site is such a small percentage (0.18%, to be accurate) that it is not worth worrying about. Also, if GWT is not reporting any issues, I think you should be fine.
Don't worry, be happy!
Peter
Related Questions
-
Duplicate content issues... en-gb V en-us
Hi everyone, I have a global client with lots of duplicate page issues, mainly because they have duplicate pages for the US, UK and AUS. They do this because they don't offer all services in all markets and, of course, want to show local contact details for each version. What is the best way to handle this for SEO, as clearly I want to rank the local pages for each country? Cheers
Technical SEO
-
Product Variations (rel=canonical or 301) & Duplicate Product Descriptions
Hi All, Hoping for a bit of advice here please. I've been tasked with building an e-commerce store and all is going well so far. We decided to use WordPress with WooCommerce as our shop plugin. I've been testing the CSV import option for uploading all our products and I'm a little concerned on two fronts: product variations, and duplicate content within the product descriptions.
**Product Variations:** We are selling furniture that has multiple variations (see list below), and as a result it creates c.50 product variations, all with their own URLs. Facing = Left, Right; Leg style = Round, Straight, Queen Anne; Leg colour = Black, White, Brown, Wood; Matching cushion = Yes, No. So my question is: should I 301 redirect the variation URLs to the main product URL, since from a user perspective they aren't used (we don't have images for each variation that would trigger the URL change, simply drop-down options for the user to select), or should I add a rel=canonical tag to each variation pointing back to the main product URL?
**Duplicate Content:** We will be selling similar products, e.g. a chair which comes in different fabrics and finishes but is basically the same product. Most, if not all, of the 'long' product descriptions are identical, with only the 'short' product descriptions being unique. The 'long' product descriptions contain all the manufacturing information, leg option/colour information, graphics, dimensions, weight, etc. I'm concerned that having 300+ products, all with identical 'long' descriptions, is going to be seen negatively by Google and affect the site's SEO. My question is: will this be viewed as duplicate content? If so, are there any best practices I should be following for handling this, other than writing completely unique descriptions for each product, which would be extremely difficult given it's basically the same products re-hashed? Many thanks in advance for any advice.
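For the variation question, the rel=canonical approach would mean each variation URL carries a tag pointing at the parent product, roughly like this (the URLs and query parameter are hypothetical examples, not actual WooCommerce output):

```html
<!-- On a variation URL such as /product/armchair/?leg-style=round (hypothetical) -->
<link rel="canonical" href="https://www.example.com/product/armchair/" />
```

The canonical route keeps the variation URLs resolving for users who land on them, whereas a 301 forces everyone to the parent page; that difference is what I'm weighing up.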
Technical SEO
-
Duplicate content and rel canonicals?
Hi. I have a question relating to 2 sites that I manage with regards to duplicate content. These are 2 separate companies, but the content comes from the same database (in other words, it is the same). In terms of rel=canonical, how would we do this so that Google does not penalise either site but can still crawl the content for both, or is this just a dream?
Technical SEO
-
Duplicate Content Due to Pagination
Recently our newly designed website has been suffering from a rankings loss. While I am sure there are a number of factors involved, I'd like to know if this scenario could be harmful... Google is showing a number of duplicate content issues within Webmaster Tools. Some of what I am seeing is duplicate meta titles and meta descriptions for page 1 and page 2 of some of my product category pages. So if a category has many products across 4 pages, it is effectively showing the same page title and meta description on all 4 pages. I am wondering if I should let my site show, say, 150 products per page to get them all on one page instead of the current 36 per page. I use the BigCommerce platform. Thank you for taking the time to read my question!
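One commonly suggested alternative to collapsing everything onto one page is to give each paginated page a unique title and mark the sequence up with rel="prev"/"next" links, sketched here with hypothetical URLs:

```html
<!-- In the <head> of page 2 of a paginated category (hypothetical URLs) -->
<title>Chairs - Page 2 | Example Store</title>
<link rel="prev" href="https://www.example.com/chairs/" />
<link rel="next" href="https://www.example.com/chairs/page/3/" />
```

Whether BigCommerce exposes these tags in its templates is something I'd have to check; the point is that differentiating the paginated pages may address the duplicate-title warnings without putting 150 products on one page.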
Technical SEO
-
Duplicate page content
Hello, My site is being checked for errors by the PRO dashboard here, and some odd duplicate content errors have appeared. Every page has a duplicate because you can see it at both the normal URL and at page/~username, so... www.short-hairstyles.com is the same as www.short-hairstyles.com/~wwwshor. I don't know if this is a problem or how the crawler found this (I'm sure I have never linked to it), but I'd like to know how to prevent it in case it is a problem, if anyone knows. Ian
Technical SEO
-
Duplicate Content on Navigation Structures
Hello SEOMoz Team, My organization is making a push to have seamless navigation across all of its domains. Each of the domains publishes distinctly different content about various subjects, and we want each domain to have its own separate identity as viewed by Google. It has been suggested internally that we keep the exact same navigation structure (40-50 links in the header) across each of our 15 domains to ensure "unity" among all of the sites. Will this create a duplicate content problem in the form of the menu structure, and will it cause Google to not consider the domains as separate from each other? Thanks, Richard Robbins
Technical SEO
-
Thin/Duplicate Content
Hi Guys, So here's the deal: my team and I just acquired a new site built using some questionable tactics. Only about 5% of the entire site is actually written by humans; the rest of the 40k+ pages (increasing by 1-2k auto-generated pages a day) are all auto-generated, thin content. I'm trying to convince the powers that be that we cannot continue to do this. Now I'm aware of the issue, but my question is: what is the best way to deal with it? Should I noindex these pages at the directory level? Should I 301 them to the most relevant section where actual valuable content exists? So far it doesn't seem like Google has caught on to this yet, and I want to fix the issue while not raising any more red flags in the process. Thanks!
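For the directory-level noindex option, one way to avoid editing thousands of templates is to send the directive as an HTTP header. A minimal sketch, assuming Apache with mod_headers enabled and a hypothetical directory holding the auto-generated pages:

```apache
# .htaccess placed inside the auto-generated directory (hypothetical path)
# Requires mod_headers; sends a noindex header for every file served from here
Header set X-Robots-Tag "noindex, follow"
```

The X-Robots-Tag header is honored by Google the same way as the meta robots tag, and applying it per directory means the fix tracks the pages as they keep being generated.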
Technical SEO