How can scraper sites be successful post-Panda?
-
I read this article on SEJ:
http://www.searchenginejournal.com/scrapers-and-the-panda-update/34192/
And I'm a bit confused as to how a scraper site can be successful post-Panda. Didn't Panda specifically target sites with duplicate content? Shouldn't scraper sites actually be suffering?
-
I have filed several DMCA reports against scraper sites that are scraping my original content (mainly original images that my community creates and submits). Those scraper sites are still getting a TON of traffic (Alexa rank under 2,000), and nothing has happened. Any suggestions as to what else I could do?
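If you want to check programmatically whether a scraper is rehosting your images byte-for-byte before filing a report, here is a minimal Python sketch. It is just an illustration (the helper names are mine, not part of any tool mentioned here), and it only catches exact copies; a scraper that re-encodes or resizes your images would need a perceptual hash (e.g. dHash) instead:

```python
import hashlib

def sha256_bytes(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def is_exact_copy(original: bytes, candidate: bytes) -> bool:
    """True only for byte-identical copies; re-encoded or resized
    images will not match and need perceptual hashing instead."""
    return sha256_bytes(original) == sha256_bytes(candidate)

# In practice you would read these from disk or download them;
# in-memory stand-ins keep the sketch self-contained.
mine = b"\x89PNG...original image bytes..."
theirs = b"\x89PNG...original image bytes..."
print(is_exact_copy(mine, theirs))  # True for a verbatim rip
```

A matching hash is solid evidence to attach to a DMCA complaint, since it shows the file is an exact copy rather than a coincidental similarity.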
-
Yes, scrapers should be suffering, but apparently Panda does not catch them all right away. The various incarnations of Panda (it is a constantly evolving part of Google's ranking systems, not just the "big one" back in February) do target poor-quality and, one would assume, unoriginal content.
I happen to have faith that scrapers and other spam sites will eventually be struck down, but if you are less of a believer, you can file a DMCA report for specific instances of scraped content: http://www.google.com/support/bin/static.py?page=ts.cs&ts=1114905
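For scraped text (as opposed to images), a common way to quantify "this page copied mine" is word-shingle Jaccard similarity. The sketch below is an illustration of that general technique, not anything Google or Moz exposes; a score near 1.0 means near-verbatim copying, which is the kind of instance worth a DMCA filing:

```python
def shingles(text: str, k: int = 5) -> set:
    """Lowercased word k-grams ('shingles') of the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of the two texts' shingle sets, in [0, 1].
    1.0 means identical shingle sets (near-verbatim copy)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Shingling is more robust than comparing whole pages, because scrapers often wrap your content in their own navigation and ads; the copied passages still produce a high overlap.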