Recovering from Black Hat/Negative SEO with a twist
-
Hey everyone,
This is a first for me; I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.
Scenario
- In the process of designing a new site for a client, we discovered that his previous site, although it had decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely there was a vulnerability somewhere that allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104, and so on. These dynamic pages turned out to be malware with a trojan horse that our servers recognized and subsequently blocked access to.
We have since helped them remedy the vulnerability and remove the malware that was creating these crappy dynamic pages.
- Another automated program appears to have been recently blasting spam links (mostly comment spam and directory links) to these dynamically created pages at an incredibly rapid rate, and is still actively doing so. Right now we're looking at a small business website with a touch over 500k low-quality spammy links pointing to malware pages from the previously compromised site.
Important: As of right now, there's been no manual penalty on the site, nor has a "This Site May Have Been Compromised" marker appeared in the organic search results for it. We were able to discover this before things got too bad for them.
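One cleanup detail worth double-checking in a situation like this: the removed ?id= pages should now return a hard 404/410 rather than quietly resolving or redirecting. A minimal sketch, assuming an Apache host with a .htaccess at the web root (the rule is illustrative, not the actual fix applied to this site, and the pattern assumes the spam URLs were all homepage query strings):

```
# Hypothetical .htaccess rule: return 410 Gone for the old ?id=NNN spam URLs
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=\d+$
RewriteRule ^$ - [G]
```

A 410 tells crawlers the pages are gone for good, which should help them drop out of the index faster than a plain 404.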
Next Steps?
The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and then potentially slap a penalty on the site. The main questions I have are:
- Should we report this proactively to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1)
- Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795)
- Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?
Would love some input or examples from anyone who can help, thanks in advance!
-
I never mentioned anything about Pigeon?
-
Um....IQ? Did you miss the Pigeon update of a couple of months ago?
Tons of talk on same, my own fav from Mike here -
http://blumenthals.com/blog/2014/10/05/post-pigeon-geo-assessment-how-did-traffic-change-by-city/
-
**Should we report this proactively to the web spam team using the guidelines here?**
No
**Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results?**
No
**Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?**
Yes
This sounds to me like you need to be thinking 'damage limitation', and by submitting a disavow now, you will be doing just that. Don't worry about the number of domains in there; that is exactly what the tool is for. However, Penguin hasn't had a refresh in some time (around 12 months), so while it may feel like you have time on your side to fix this, a refresh could be around the corner - so hop on it.
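For reference, a disavow file is just a plain-text list of `domain:` rules and/or full URLs, with `#` lines as comments, and Google documents size limits on uploads, so collapsing half a million link URLs down to domain-level rules matters at this scale. A minimal sketch of that consolidation step (the helper name and example URLs are hypothetical):

```python
from urllib.parse import urlparse

def build_disavow(spam_urls, note="Negative SEO attack - domains disavowed proactively"):
    """Collapse a list of spammy backlink URLs into deduplicated,
    domain-level disavow rules in Google's plain-text format."""
    domains = set()
    for url in spam_urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # disavowing example.com also covers www.example.com
        if host:
            domains.add(host)
    lines = [f"# {note}"]  # '#' lines are comments in the disavow format
    lines += [f"domain:{d}" for d in sorted(domains)]
    return "\n".join(lines) + "\n"

# Example: three spam links across two domains collapse to two rules
print(build_disavow([
    "http://spam-directory.example/links?id=102",
    "http://spam-directory.example/links?id=103",
    "http://www.comment-spam.example/post",
]))
```

Feeding it a referring-domains export from your link tool of choice would give you the file to upload in one pass.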
-Andy
-
Sounds like fun!
I did write a lovely answer which unfortunately got lost, so I'll summarise a bit below:
1. I wouldn't recommend telling Google; you might not have a penalty now, but you could be tempting Google's wrath.
2. As you've not been flagged for malware and you've removed it, you should be fine, but you can always request a review if you want to sleep better.
3. Disavowing proactively is a great idea; Google likes this approach too. It also means that rather than hoping Google might ignore the links, the disavow list ensures it definitely will. Further to this, I've got two more options for you: you can block the wildcard/dynamic pages in your robots.txt, which will help stop Google even reaching them and finding the bad links (assuming you don't need those pages for your site), and you can check your referring domains weekly and update the disavow list if you're still "under attack".
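On the robots.txt suggestion above, a minimal sketch, assuming the spam pages only ever appear as ?id= query strings (Googlebot honours `*` wildcards in Disallow rules):

```
User-agent: *
Disallow: /*?id=
```

One caveat: robots.txt blocks crawling, not indexing; if the removed pages now correctly return 404/410, it may be better to let Googlebot recrawl them and see they're gone.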
Just a quick heads up: after disavowing the links you may drop in rankings, since you're removing links, but there's also a chance you'll go up if you're under an algorithmic penalty.
You can find some good tips here too - http://www.searchenginejournal.com/combat-recover-negative-seo-attack-survival-guide/114507/
Hope some of that helps, and I wish I could have posted my original reply, but I don't have the time to rewrite it, I'm afraid. Good luck to you!
-
I have a lot going on right now, but if you PM the domain, I can take a look in a week or so.