How to handle a pure spam penalty (from GWT) as a blogging platform
-
Hello,
I have a blogging platform which spammers unfortunately used a few years ago to create spam blogs. Since then, we've added spam filters, and while I can't guarantee there are no spam blogs left, I can say that most of the blogs are clean.
The problem is that in Google Webmaster Tools we have a "Pure spam" message on the Manual Actions page (https://support.google.com/webmasters/answer/2604777?hl=en), with a list of 1,000 blog links.
All of these blogs have been marked as spam in our system for at least a year; technically, that means they return a 410 status code and display something like "this blog doesn't meet our quality requirements".
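A minimal sketch of the setup described above (the helper and field names are hypothetical, not the platform's real code): blogs flagged as spam return HTTP 410 Gone with a short notice, and everything else is served normally.

```python
def blog_response(blog: dict) -> tuple[int, str]:
    """Return (status_code, body) for a blog record."""
    if blog.get("flagged_spam"):
        # 410 tells crawlers the content is gone permanently,
        # which is a stronger signal than a plain 404.
        return 410, "This blog doesn't meet our quality requirements."
    # Normal blogs are served as usual.
    return 200, blog.get("content", "")
```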
When I first saw the manual action message in GWT, I filed a reconsideration request. Google answered within a week, saying they had checked our website again, but when I went back to the Manual Actions page, there was still a "pure spam" message, with a different list of blogs, all of which had also been marked as spam for at least a year.
What should I do? Keep filing reconsideration requests for as long as Google answers?
Thank you in advance,
-
Hi Imran,
Instead of adding to this thread, I think it would be better to start a new question about how to check a site regarding duplicate content. Thanks!
-
Hello Marie, their URLs still exist, but the spam content isn't displayed.
Here is what happens when you visit a blog flagged as spam:
- it returns a 410 status code
- then a 301 redirect to a best-practices page which explains why the blog has been disabled
- that page is disallowed in robots.txt and carries noindex, nofollow meta tags, with no link to the original website
Is this good?
Thanks
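The robots.txt rule described above can be sanity-checked with the Python standard library (a sketch; the `/disabled-blog` path is a stand-in for the real notice page):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse robots.txt rules directly instead of fetching them over HTTP.
rp.parse([
    "User-agent: *",
    "Disallow: /disabled-blog",
])

# A compliant crawler may not fetch the disallowed notice page...
allowed_disabled = rp.can_fetch("*", "/disabled-blog")
# ...while other paths remain crawlable.
allowed_other = rp.can_fetch("*", "/healthy-blog")
```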
-
These blogs are marked as spam, but do they still exist at all? I mean, if you type in the URL, are the pages live? If so, they're still passing PageRank. Is there a way to remove the pages completely? Somehow Google is still seeing them.
-
404 them. Remove them from existence.
Why would you keep content that is pure spam on the site? If it is spam, delete it.
-
Throw that spam in the 410 can. It lets the crawlers know it's gone 'for good'.
-
Hello Federico,
Thank you for your quick reply.
When you say "Clean ALL the blogs, remove any trace of spam":
- what is the best way to do this: 410 or 404?
- if a spam blog has a meta noindex tag, will Google still consider it?
I will keep you updated on future developments.
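On the noindex question, one well-known caveat applies to the setup described earlier in the thread: a robots.txt disallow and a meta noindex tag work against each other, because a crawler that honors robots.txt never downloads the page and therefore never sees the tag. A toy illustration (purely illustrative function, not a real crawler):

```python
def crawler_observes_noindex(disallowed_in_robots: bool, page_has_noindex: bool) -> bool:
    """Model whether a compliant crawler ever sees a page's noindex tag."""
    if disallowed_in_robots:
        # The page is never fetched, so its meta tags are invisible.
        return False
    return page_has_noindex
```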
-
Steps you should follow:
- Clean ALL the blogs, remove any trace of spam (document everything in the process)
- Go back to the first point and make sure you have NO SPAM left (again, if anything comes up, document the changes you make)
- Once you are completely certain that there's no spam left, you can send another reconsideration request, make sure you show them the work you have done to clean the site.
- Wait for their response, and if you still get a rejection, repeat the process, as most likely you still have spam on your site.
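The documentation step above can be sketched as a quick status-code audit of the flagged URLs (function names are hypothetical; in practice you would feed in the statuses from a crawl of the URLs listed in the manual action):

```python
def classify_status(status: int) -> str:
    """Bucket an HTTP status code by cleanup state."""
    if status in (404, 410):
        return "removed"      # gone; nothing left for Google to index
    if 300 <= status < 400:
        return "redirect"     # still passing signals somewhere
    if status == 200:
        return "live"         # spam may still be visible
    return "other"

def audit(statuses: dict[str, int]) -> dict[str, str]:
    """Map each URL to its cleanup state for the reconsideration write-up."""
    return {url: classify_status(code) for url, code in statuses.items()}
```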
Hope that helps!