How to handle a pure spam penalty (from GWT) as a blogging platform
-
Hello,
I have a blogging platform which spammers unfortunately used a few years ago to create spam blogs. Since then, we've added spam filters, and while I can't guarantee there isn't a single spam blog left, I can say that most of the blogs are clean.
The problem is that in Google Webmaster Tools we have a "Pure spam" message on the Manual Actions page (https://support.google.com/webmasters/answer/2604777?hl=en), with a list of 1000 blog links.
All of these blogs have been marked as spam in our system for at least a year. In practice that means they return a 410 header and display something like "this blog doesn't meet our quality requirements".
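To give a bit more detail, here is roughly how the disabled blogs are served. This is only a simplified sketch (Flask is used purely for illustration; our real code differs, and is_flagged_as_spam is a stand-in for our actual spam-filter lookup):
```python
# Simplified sketch of how flagged blogs are served. Flask is used purely for
# illustration; is_flagged_as_spam() stands in for our real moderation lookup.
from flask import Flask, abort

app = Flask(__name__)

def is_flagged_as_spam(blog_slug: str) -> bool:
    # Placeholder: in production this queries our moderation database.
    return blog_slug in {"example-spam-blog"}

@app.route("/<blog_slug>/")
def serve_blog(blog_slug):
    if is_flagged_as_spam(blog_slug):
        # 410 Gone: the blog has been removed and will not come back.
        abort(410, description="This blog doesn't meet our quality requirements.")
    return f"Normal blog content for {blog_slug}"
```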
When I first saw the manual action message in GWT, I filed a reconsideration request. Google answered within a week saying that they had checked our website again, but when I went back to the Manual Actions page there was still a "Pure spam" message, with a different list of blogs, which had also been marked as spam for at least a year.
What should I do? Keep filing reconsideration requests for as long as Google keeps answering?
Thank you in advance,
-
Hi Imran,
Instead of adding to this thread, I think it would be better to start a new question about how to check a site regarding duplicate content. Thanks!
-
Hello Marie, their URLs still exist but the spam content isn't displayed.
Here is what happens when you go to a blog flagged as spam (a quick check script follows the list):
- a 410 header
- a 301 redirect to a best-practices page which explains why the blog has been disabled
- that page is disallowed in robots.txt and carries noindex, nofollow, with no link back to the original website
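If it helps, this is the kind of quick check I run to see what a crawler actually gets back from a flagged blog (a rough sketch; the URL below is just a placeholder):
```python
# Rough sketch for checking what a crawler receives from a flagged blog URL.
# The example URL is a placeholder.
import requests

url = "https://example-blogging-platform.com/some-flagged-blog/"

# Don't follow redirects, so we see the very first status code a crawler gets.
response = requests.get(url, allow_redirects=False, timeout=10)

print("Status code:", response.status_code)
print("Location:", response.headers.get("Location"))
print("X-Robots-Tag:", response.headers.get("X-Robots-Tag"))
```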
Is this good?
Thanks
-
These blogs are marked as spam, but do they still exist at all? I mean, if you type in the URL, are the pages live? If so, they're still passing PageRank. Is there a way to completely remove the pages? Somehow Google is still seeing them.
-
404s. Remove them from existence.
Why would you keep content that is pure spam on the site? If it is spam, delete it.
-
Throw that spam in the 410 can. It lets the crawlers know it's gone 'for good'.
-
Hello Federico,
Thank you for your quick reply.
When you say "Clean ALL the blogs, remove any trace of spam":
- what is the best way to do this: a 410 or a 404?
- if a spam blog has a meta noindex tag, will Google still consider it?
I will keep you updated as things develop.
-
Steps you should follow:
- Clean ALL the blogs, remove any trace of spam (document everything in the process; a rough sketch of one way to do that follows at the end of this post)
- Go back to the first point and make sure you have NO SPAM left (again, if anything comes up, document the changes you make)
- Once you are completely certain that there's no spam left, you can send another reconsideration request; make sure you show them the work you have done to clean the site.
- Wait for their response, and if you still get a negative answer, repeat the process, as most likely there is still spam on your site.
Hope that helps!
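For the "document everything" part, something like this could work (a rough sketch only, assuming you export the flagged URLs from GWT into a text file; the filenames are placeholders):
```python
# Rough sketch: fetch every URL listed in the manual action and record what it
# now returns, as evidence for the reconsideration request. Filenames are
# placeholders.
import csv
import requests

with open("gwt_flagged_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("cleanup_evidence.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "status_code", "redirect_target"])
    for url in urls:
        try:
            r = requests.get(url, allow_redirects=False, timeout=10)
            writer.writerow([url, r.status_code, r.headers.get("Location", "")])
        except requests.RequestException as exc:
            writer.writerow([url, "error", str(exc)])
```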