Recovering from Black Hat/Negative SEO with a twist
-
Hey everyone,
This is a first for me. I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.
Scenario
- In the process of designing a new site for a client, we discovered that his previous site, although it had decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely there was a vulnerability somewhere that allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104 and so on. These dynamic pages ended up serving malware with a trojan horse that our servers recognized and subsequently blocked access to.
We have since helped them remedy the vulnerability and remove the malware that was creating these crappy dynamic pages.
- Another automated program appears to have been recently blasting spam links (mostly comment spam and directory links) at these dynamically created pages at an incredibly rapid rate, and it is still actively doing so. Right now we're looking at a small business website with a touch over 500k low-quality, spammy links pointing to the malware pages on the previously compromised site.
Important: As of right now, there's been no manual penalty on the site, nor has a "This Site May Have Been Compromised" marker appeared in the organic search results for the site. We were able to discover this before things got too bad for them.
Next Steps?
The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and then potentially slap a penalty on the site. The main questions I have are:
- Should we report this proactively to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1)
- Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795)
- Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?
Would love some input or examples from anyone who can help, thanks in advance!
-
I never mentioned anything about Pigeon?
-
Um....IQ? Did you miss the Pigeon update of a couple of months ago?
Tons of talk on same, my own fav from Mike here -
http://blumenthals.com/blog/2014/10/05/post-pigeon-geo-assessment-how-did-traffic-change-by-city/
-
**Should we report this proactively to the web spam team using the guidelines here?**
No
**Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results?**
No
**Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?**
Yes
This sounds to me like you need to be thinking 'damage limitation', and by submitting a disavow now, you will be doing just that. Don't worry about the fact that there are so many domains in the file; that is exactly what the tool is for. Penguin hasn't had a refresh in some time (around 12 months), so while you have time on your side to fix this, a refresh could be around the corner - so hop on it.
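For anyone unfamiliar with the format, the disavow file itself is just a plain-text list you upload through Google's Disavow Links tool. A minimal sketch is below; the domain names are made up purely for illustration, and with roughly 490k domains you'd generate the list from your link export rather than typing it by hand:

```
# Negative-SEO cleanup: spam links pointing at the hacked /?id= pages
# Example entries only - replace with the actual referring domains
domain:spammy-directory-example.com
domain:comment-spam-example.net
domain:blog-comment-farm-example.org
# Individual URLs can also be listed when only specific pages should be disavowed
http://example-spam-source.com/spammy-comment-page.html
```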
-Andy
-
Sounds like fun!
I did write a lovely answer which unfortunately got lost, so I'll summarise a bit below:
1. I wouldn't recommend telling Google; you might not have a penalty now, but you might be tempting Google's wrath.
2. As you've not been flagged for malware and you've removed it, you should be fine, but you can always request a review if it helps you sleep better.
3. Disavowing proactively is a great idea, and Google likes this approach too. It also means that rather than hoping Google might ignore the links, it will definitely ignore them once they're in the disavow list. Further to this, I've got two more options for you: you can block the wildcard/dynamic pages in your robots.txt (see the sketch below), which will help stop Google from even crawling them and finding the bad links, assuming you don't need those pages for your site. Also, check your referring domains weekly and update the disavow list if you're still "under attack".
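A minimal robots.txt sketch for that, assuming the junk pages all use the ?id= parameter described in the question and nothing legitimate on the site relies on it:

```
User-agent: *
# Block the dynamically generated spam/malware pages (e.g. /?id=102, /?id=103)
Disallow: /*?id=
```

Bear in mind robots.txt only stops crawling; if any of those pages are already indexed, making sure they now return a 404/410 (which cleaning out the malware should already do) is the cleaner long-term fix.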
Just a quick heads-up: after disavowing the links, you may drop in rankings since you're effectively removing them, but there's also a chance you'll go up if you're currently under an algorithmic penalty.
You can find some good tips here too - http://www.searchenginejournal.com/combat-recover-negative-seo-attack-survival-guide/114507/
Hope some of that helps. I wish I could have posted my original reply, but I don't have the time to rewrite it, I'm afraid. Good luck to you!
-
I have a lot going on right now, but if you PM the domain, I can take a look in a week or so.