Why is this store getting hurt in SERPs when they removed duplicate content?
-
I work with an e-commerce client who got hit hard by Panda. They are very cautious, and want small-scale tests to prove each hypothesis before committing to larger changes.
Recently, we reworked content on 30 product detail pages. Before, these product pages featured some original content mixed with some manufacturer content. The change we made was to remove the manufacturer content completely from the product page, leaving about 300 words of high-quality, original content--all of which was written by subject matter experts.
I assumed that Google viewed this manufacturer text as duplicate content. However, when these 30 modified pages were compared to the control, they performed significantly worse.
Question 1: Does anyone have any idea why these pages would perform worse than the control?
Question 2: Do you have any tips for convincing this client to try another test, or for getting the buy-in to make the larger changes that--in theory--need to happen? FWIW, this client has about 10,000 product detail pages--the vast majority of which contain just manufacturer content.
I appreciate your thoughts.
-
I would be curious to look at each form of the pages using the LDA tool.
The main reason to use the tool is to understand if you are significantly reducing the page's relevancy to your target keyword(s).
-
Hi Ryan,
Thank you for your thoughtful reply.
We didn't change any of the titles, nor did we make any overall site changes. We measured the changes using internal tracking tools to get the daily traffic by search engine to each product detail page. We also used Google Webmaster Tools to estimate SERP positional data.
The updated pages do have about half as many total words as the control, so that could definitely explain it.
For followup testing, I'm thinking of trying:
- test versus control, same word count, test is 100% original, control is 0% original
- test versus control, same word count, test is 50% original, control is 0% original
- test versus control, same word count, test is 25% original, control is 0% original
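For what it's worth, when you run those follow-up tests, a difference-in-differences comparison helps separate the effect of the content change from sitewide or seasonal swings that hit test and control alike. Here's a minimal Python sketch of that calculation; all the traffic numbers are made up for illustration:

```python
# Hypothetical sketch of a difference-in-differences check for a content test.
# All traffic figures below are invented for illustration.

def pct_change(before, after):
    """Percent change from the pre-period baseline to the post-period."""
    return (after - before) / before * 100

def diff_in_diff(test_before, test_after, control_before, control_after):
    """Test group's change minus the control group's change, so that
    sitewide or seasonal movements (which hit both groups) cancel out."""
    return (pct_change(test_before, test_after)
            - pct_change(control_before, control_after))

# Daily organic visits summed across each group of product pages,
# before and after the content change (made-up numbers).
test_before, test_after = 1200, 900        # test group fell 25%
control_before, control_after = 1150, 1100 # control fell ~4.3%

effect = diff_in_diff(test_before, test_after,
                      control_before, control_after)
print(round(effect, 1))  # -20.7 -> test underperformed control by ~21 points
```

A clearly negative number like this says the test pages lost ground relative to the control, even after accounting for whatever happened to the site as a whole during the test window.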
-
Hi Sarah.
I can't blame the client for wanting small tests. It's actually a prudent strategy.
I think having some manufacturer content mixed in with other quality content is fine. Normally 300 words would be a bit thin, but for product pages that is not bad at all.
What changes to the pages were made other than the content? Did the titles change at all?
How exactly are you measuring the performance of these pages? Please provide as much detail as possible.
Were any other site-wide changes being made at the same time?
Does anyone have any idea why these pages would perform worse than the control?
Assuming there are no other factors involved, which is a big assumption, the only logical answer I can offer is that Google views the revised pages as weaker. By removing the manufacturer's content, the pages became thinner and less relevant to the keywords involved.
Do you have any tips for convincing this client to try another test or get the buy-in to make the larger changes that--in theory--need to happen?
How do you restore a client's confidence after an unexpected result?
Assure the client you realize they are highly dependent upon their rankings for sales. You are proceeding based on your experience and knowledge, which have been successful in the past. Panda is new, and Google is still rolling out changes; the industry is still learning and adjusting to them. You are consulting with others regarding the unexpected results, and you are committed to ensuring this project is properly completed.