Duplicate Content - Deleting Pages
-
The Penguin update in April 2012 caused my website to lose about 70% of its traffic overnight and, as a consequence, the same proportion of sales. Almost a year later I am still trying to figure out what the problem is with my site.
As with many ecommerce sites, a large number of the product pages are quite similar. My first crawl with SEOmoz identified a large number of pages that are very similar; the majority of these are in a category that doesn't sell well anyway, so to help with the problem I am thinking of removing one of my categories (about 1,000 products).
My question is: would removing all these pages boost the overall SEO of the site, since I am removing a large chunk of near-duplicate content? Also, if I do remove all these pages, would I have to put in place a 301 redirect for every single one, and if so, what's the quickest way of doing that?
My site is www.modern-canvas-art.com
Robin
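On the "quickest way" part of the question, a common approach is to generate the redirect rules rather than write 1,000 of them by hand. The sketch below assumes an Apache host where redirects live in .htaccess, and assumes you can export an old-URL/new-URL mapping as a two-column CSV; the file names are hypothetical.

```python
import csv

def build_redirects(mapping_csv, htaccess_out):
    """Read old-path,new-path pairs from a CSV and emit one
    Apache 'Redirect 301' rule per pair."""
    rules = []
    with open(mapping_csv, newline="") as f:
        for row in csv.reader(f):
            if len(row) != 2:
                continue  # skip blank or malformed rows
            old, new = row
            rules.append(f"Redirect 301 {old.strip()} {new.strip()}")
    with open(htaccess_out, "w") as f:
        f.write("\n".join(rules) + "\n")
    return rules
```

The generated lines can then be pasted into (or included from) the site's .htaccess. Redirecting each removed product to its parent category page is generally considered better than redirecting everything to the home page.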
-
Thanks for your reply Ryan. I am going to war on the bad links and have got over half of them removed already. There are one or two quite shocking examples of the manipulative linking you describe, and when I dug deeper into those sites I was utterly appalled.
-
Hi Robin,
It is important to understand the difference between Penguin and Panda.
Penguin focuses solely on a single issue: manipulative links to your site. By far the most effective solution is to remove the bad links. Penguin does not care about whether your content is duplicated or any other aspect of your site's quality.
Panda is a different update which focuses on content quality. While it is possible your site has multiple issues, you have diagnosed your issue as Penguin and stated you experienced a major traffic drop in April 2012. If that drop was on or immediately after April 24th, and you have built manipulative links to the site, it is highly likely the issue is Penguin.
A quick review of your site's backlinks in Open Site Explorer shows a large number of links which Penguin is designed to penalize: directory links and other low-quality links.
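For links you cannot get removed manually, Google's disavow tool (released in October 2012) accepts a plain-text file in the shape sketched below. The domains shown are placeholders, not sites from your actual link profile:

```
# Low-quality directory links we requested removal of but could not get taken down
# (example domains only)
domain:spammy-directory.example
domain:link-farm.example
http://another-site.example/page-with-bad-link.html
```

A `domain:` line disavows every link from that domain; a bare URL disavows only that page. Disavowing is generally treated as a last resort after genuine removal attempts.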
I hope this information is helpful.
Related Questions
-
Duplicate content in Shopify reported by Moz
According to the Moz crawl report, there are hundreds of duplicate pages in our Shopify store, ewatchsale.com. The main duplicate pages are:
https://ewatchsale.com/collections/seiko-watches?page=2
https://ewatchsale.com/collections/all/brand_seiko
(the canonical page should be https://ewatchsale.com/collections/seiko-watches)
https://ewatchsale.com/collections/seiko-watches/gender_mens
(the canonical page should be https://ewatchsale.com/collections/seiko-watches/mens-watches)
Also, I want to exclude indexing of page URLs with filter parameters like https://ewatchsale.com/collections/seiko-watches/color_black+mens-watches+price_us-100-200. Shopify advised that we can't access our robots.txt file.
How can we exclude search engine crawling of the page URLs with filter names?
How can we access the robots.txt file?
How can we add canonical code to the preferred collection pages? Which templates and what code should we add? Thanks for your advice in advance!
On-Page Optimization | ycnetpro101
-
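Since Shopify themes can't edit robots.txt directly, a common workaround is handling both canonicals and the filtered (tag) pages in the theme's `theme.liquid` head. The sketch below relies on `canonical_url` and `current_tags`, which are standard Shopify Liquid objects, but verify them against your theme version before shipping:

```liquid
<!-- In the <head> of theme.liquid -->
<link rel="canonical" href="{{ canonical_url }}" />
{% if current_tags %}
  <!-- Tag-filtered collection URLs (the ones with '+' filters):
       keep them out of the index but let bots follow links -->
  <meta name="robots" content="noindex, follow" />
{% endif %}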
Duplicate Content
When I say duplicate content, I don't mean that the content on a client's site is displayed on another site on the web, or was taken from another site. A client has a few product pages, and each product page has 4-5 paragraphs of content at the bottom describing the product. Now, this content also appears on other pages, but re-worded so it's not 100% duplicate. Some pages show a duplicate content percentage ranging from 12% to 35%, maybe 40%. Just curious if I should suggest keeping each product page less than 10% duplicated. Thanks for your help.
On-Page Optimization | Kdruckenbrod
-
Web designer doesn't see duplicate pages or issues??? Help!
I have a website that has duplicate pages, and I want to fix them. I told my web guy and his response was: "we do not see any issues on our end. It is not hurting any ranking in the 3 major searches. We have no duplicate pages." Here's the site: http://www.wilkersoninsuranceagency.com/ Here's the home page again: http://www.wilkersoninsuranceagency.com/index.php Here's what Moz says: please see the attached image. I'm not sure what to tell him, as I'm not a web person, but Moz is telling me we have issues on the site and I feel like they should be fixed.
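For what it's worth, the classic fix for the `/` vs `/index.php` duplicate is a single 301 rule. A sketch, assuming the site runs on Apache with mod_rewrite available (both assumptions worth confirming with the host):

```
# .htaccess: collapse /index.php onto / with a 301 redirect
RewriteEngine On
# Only fire on direct requests for index.php, not internal rewrites
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
RewriteRule ^index\.php$ / [R=301,L]
```

After this, both URLs resolve to a single indexable address and the crawl warning should clear on the next crawl.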
On-Page Optimization | MissThumann
-
Duplicate Page Content
Hi, I am new to the Moz Pro community. I got the message below for many of my pages. We have a video site, so all content on the page except the video link would be different. How can I handle such pages? Can we place AdSense ads on these pages? "Duplicate Page Content: Code and content on this page looks similar or identical to code and content on other pages on your site. Search engines may not know which pages are best to include in their index and rankings. Common fixes for this issue include 301 redirects, using the rel=canonical tag, and using the parameter handling tool in Google Webmaster Central. For more information on duplicate content, visit http://moz.com/learn/seo/duplicate-content." Please help me understand how to handle this. Regards
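Of the fixes the warning lists, rel=canonical is usually the lightest-touch option when the pages must stay live. A sketch (the URL is a placeholder, not one of the actual pages):

```html
<!-- In the <head> of each near-duplicate page, pointing at the one
     version you want indexed and ranked -->
<link rel="canonical" href="https://www.example.com/videos/preferred-page/" />
```

The canonical tag consolidates indexing signals onto the preferred URL; it does not affect whether ads can be shown on the page.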
On-Page Optimization | Nettv
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
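One caution before the sketch: robots.txt blocks crawling, not indexing. URLs Google already knows about can stay in the index (without snippets) even when disallowed, so a noindex meta tag or a cross-domain rel=canonical pointing at the primary site is often the safer fix. If you do go the robots.txt route, a blanket block on each secondary domain looks like:

```
# robots.txt at the root of each secondary domain
# Blocks crawling of the whole site for all well-behaved bots
User-agent: *
Disallow: /
```

Note that a page blocked this way can never be crawled, so Google will also never see a noindex or canonical tag placed on it; choose one mechanism, not both.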
On-Page Optimization | JohnHuynh
-
What is the best way to resolve duplicate content issue
Hi, I have a client whose site content has been scraped and used on numerous other sites. This is detrimental to ranking: one term we wish to rank for is nowhere. My question is this: what's the quickest way to resolve a duplicate content issue when other sites have stolen your content? I understand that maybe I should first contact these site owners and 'appeal to their better nature'. This will take time, and they may not even comply. I've also considered rewriting our content; again, this takes time. Has anybody experienced this issue before? If so, how did you come to a solution? Thanks in advance.
On-Page Optimization | sicseo
-
What's the best practice for handling duplicate content of product descriptions with a drop-shipper?
We write our own product descriptions for the merchandise we sell on our website. However, we also work with drop-shippers, and some of them simply take our content and post it on their sites (same photos, exact ad copy, etc.). I'm concerned that we'll lose the value of our content because Google will consider it duplicated. We don't want the value of our content undermined. What's the best practice for avoiding any problems with Google? Thanks, Adam
On-Page Optimization | Adam-Perlman
-
Duplicate Content - Meta Data for International Site Roll Out
Hi All, We have a site targeting Ireland, so all on-page SEO is completed and launched on the Irish site. We are now rolling out this site to the UK. How much of this content and SEO meta data has to be changed for Google not to recognise it as duplicate content? Site structure is as follows: http://www.domain.com/ie-en/ - Irish site http://www.domain.com/uk-en/ - UK site Or will it even be considered duplicate content, given that we have the UK and Irish signals in the subfolders, will be using geo-targeting in Webmaster Tools, and will have UK-specific addresses and phone numbers? We will be rolling this site out to many more countries, so it would be great to get this straight from the start so we don't waste time creating many versions of the meta data unnecessarily! Many thanks Emma
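One option worth considering alongside geo-targeting is hreflang annotations, which tell Google the two versions are regional alternates rather than duplicates. A sketch using the poster's placeholder domain; each page should list itself and every alternate, and the language-region codes must match the actual audience:

```html
<!-- In the <head> of BOTH the /ie-en/ and /uk-en/ version of each page -->
<link rel="alternate" hreflang="en-ie" href="http://www.domain.com/ie-en/" />
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/uk-en/" />
```

With reciprocal hreflang tags in place, largely identical English content for Ireland and the UK is normally served to the right audience rather than filtered as duplicate.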
On-Page Optimization | john_Digino