Is My Boilerplate Product Description Causing Duplicate Content Issues?
-
I have an e-commerce store with 20,000+ one-of-a-kind products. We only have one of each product, and once a product is sold we will never restock it, so I have no real need for these product pages to show up in SERPs. Each product has a boilerplate description that the product's unique attributes (style, color, size) are plugged into, but a few sentences of the description are exactly the same across all products.
Google Webmaster Tools doesn't report any duplicate content. My Moz crawl report shows 29 of these products as having duplicate content. But a Google search using the site: operator and some text from the boilerplate description turns up 16,400 product pages from my site.
Could this duplicate content be hurting the rankings of other pages on the site that I am trying to rank? As I said, I'm not concerned about ranking these product pages. Should I add rel=canonical tags pointing to their respective product categories? Or use "noindex, follow" on every product? Or should I not worry about it?
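For reference, the two options being weighed would look something like this in each product page's head section (the URLs here are placeholders, not the actual site's):

```html
<!-- Option 1: point the product page at its category page (hypothetical URLs) -->
<link rel="canonical" href="https://www.example.com/shop/lamps/" />

<!-- Option 2: keep the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```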
-
My ranking for a competitive term I felt I was underperforming for dropped about 10 spots overnight after I added "noindex, follow" to the product pages. From the 3rd page to the 4th page, so it's not like I had a lot to lose. My rankings for less competitive long-tail keywords, which is where I'm getting most of my traffic, have dropped slightly or stayed the same.
Should I cross my fingers and hope for a recovery? Revert the product pages back to "index, follow"? Any thoughts?
-
Hi Tom,
Thanks so much for the thorough response.
Based on several comparative metrics with sites that are outranking me significantly, I do feel the site is underperforming. Because our traffic is ridiculously seasonal the Panguin Tool doesn't provide any clues.
I just added "noindex, follow" to all my products using Yoast's WordPress SEO plugin. We'll see what happens.
Thanks,
Zach -
Hi Zachary
I really can't be sure if it's having an adverse effect, but I wouldn't be surprised if it was.
Having looked at just 3 of the product pages, I can see there is a problem with repeated content, and I think it is compounded by the fact that there is no other content on the page to make it look unique.
Both are hallmarks of a potential Panda penalty, which could affect the performance of those pages themselves and/or the whole domain. So if you're seeing subpar performance (and even if you're performing well, it's worth reading on), I would look at the following solution.
For every product that you do not intend to restock or reuse, I would either add a noindex tag, add a 301 redirect, or simply remove the page and serve a 404. If we're talking tens of thousands, then having that many redirects might bloat your .htaccess file (making it larger and slower to load/process), and an instant drop of 20k URLs and 404 errors might look a bit odd to Google as well. However, adding 20k noindex tags is a bit of a nightmare too.
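As a sketch of the redirect option, each sold-out product would get its own line in .htaccess pointing at its category page (product and category paths here are hypothetical), which is why the file bloats at tens of thousands of entries:

```apache
# Redirect individual sold-out product URLs to their category pages
Redirect 301 /shop/red-vintage-lamp /shop/lamps/
Redirect 301 /shop/blue-art-deco-vase /shop/vases/
```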
You might want to try a combination of all 3 - a few 404 errors are nothing to worry about - but the logic is that you will be removing a number of pages that have this duplicate content on them, thus improving the quality of the domain. For your remaining 'live' pages, I'd highly recommend taking the time to add 200+ words of unique content about each product in order to avoid this happening again.
An alternative solution would be to block bots from accessing the /shop/ subfolder in your robots.txt file, and then set up the shop and the currently active product listings on a different subdomain. You'd lose the ability to use the /shop/ folder, but it would be quicker than manually adding noindex tags or 301 redirects.
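Assuming the shop lives under /shop/ as described, the robots.txt rule would be just two lines (note that this blocks crawling, not indexing, so URLs that are already indexed can linger in results for a while):

```text
User-agent: *
Disallow: /shop/
```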
That's the method I would use if I wanted to address the issue. However, this **may not be necessary** if your site is not performing badly. You can check this to a degree with the Panguin Tool, which overlays your organic traffic in Analytics with Google updates - if a drop in traffic coincides with a Panda update, you may be under a penalty, in which case you should take action ASAP.
Hope this helps.
-
Hi Zachary
29 pages showing up in your Moz crawl report out of 16,400 indexed pages on your site is such a small percentage (0.18%, to be exact) that it is not worth worrying about. Also, if GWT is not reporting any issues, I think you should be fine.
Don't worry, be happy!
Peter