Recovering from Black Hat/Negative SEO with a twist
-
Hey everyone,
This is a first for me. I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.
Scenario
- While designing a new site for a client, we discovered that his previous site, despite having decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely a vulnerability somewhere allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104, and so on. These dynamic pages turned out to host malware with a trojan horse that our servers recognized and subsequently blocked access to.
We have since helped them remedy the vulnerability and remove the malware that was creating these crappy dynamic pages.
- Another automated program appears to have been blasting spam links (mostly comment spam and directory links) at these dynamically created pages at an incredibly rapid rate, and it's still actively doing so. Right now we're looking at a small-business website with a touch over 500k low-quality, spammy links pointing to the malware pages on the previously compromised site.
Important: As of right now, there's been no manual penalty on the site, nor has a "This Site May Have Been Compromised" marker appeared in the organic search results for the site. We were able to discover this before things got too bad for them.
Next Steps?
The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and then potentially slap a penalty on the site. The main questions I have are:
- Should we report this proactively to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1)
- Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795)
- Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?
Would love some input or examples from anyone who can help, thanks in advance!
-
I never mentioned anything about Pigeon?
-
Um... IQ? Did you miss the Pigeon update from a couple of months ago?
Tons of talk on it; my own favorite is from Mike here -
http://blumenthals.com/blog/2014/10/05/post-pigeon-geo-assessment-how-did-traffic-change-by-city/
-
**Should we report this proactively to the web spam team using the guidelines here?**
No
**Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results?**
No
**Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?**
Yes
This sounds to me like you need to be thinking 'damage limitation', and by submitting a disavow now, you will be doing just that. Don't worry about the number of domains in the file; that is what the tool is built for. However, Penguin hasn't had a refresh in some time (around 12 months), so while you may have time on your side to fix this, a refresh could also be just around the corner, so hop on it.
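If it helps, here's a rough Python sketch (file names and the keep-list are just placeholders) of turning a referring-domains export from your backlink tool into a disavow file; each `domain:` line tells Google to ignore every link from that domain:

```python
# Rough sketch: build a Google disavow file from a plain-text export of
# referring domains (one domain per line, e.g. from your backlink tool).
# File names and the KEEP list below are hypothetical placeholders.

KEEP = {"example-partner.com"}  # legitimate referrers you do NOT want to disavow

def build_disavow(export_path="referring_domains.txt", out_path="disavow.txt"):
    domains = set()
    with open(export_path, encoding="utf-8") as f:
        for line in f:
            d = line.strip().lower()
            if d.startswith("www."):
                d = d[4:]
            if d and d not in KEEP:
                domains.add(d)

    with open(out_path, "w", encoding="utf-8") as out:
        out.write("# Spam domains pointing at the hacked ?id= pages (negative SEO cleanup)\n")
        for d in sorted(domains):
            out.write(f"domain:{d}\n")  # domain: disavows every link from that domain
    return len(domains)

if __name__ == "__main__":
    print(f"Wrote {build_disavow()} domains to disavow.txt")
```

Bear in mind that uploading a new disavow file replaces the previous one, so keep one master file and re-upload it whenever you add domains.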
-Andy
-
Sounds like fun!
I did write a lovely answer which unfortunately got lost, so I'll summarize a bit below:
1. I wouldn't recommend telling Google; you might not have a penalty now, but you might be tempting Google's wrath.
2. As you've not been flagged for malware and you've removed it, you should be fine, but you can always request a review if it helps you sleep better.
3. Disavowing proactively is a great idea, and Google likes this approach too. It also means that, rather than hoping Google might ignore the links, it will definitely ignore them thanks to the disavow list. Further to this, I've got two more options for you. You can block the wildcard/dynamic pages in your robots.txt, which will help stop Google from even reaching them to discover the bad links (assuming you don't need those pages for your site). Also, check your referring domains weekly and update the disavow list if you're still "under attack"; there's a rough sketch of that workflow below.
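For the robots.txt idea, a wildcard rule along the lines of `Disallow: /*?id=` should keep Google away from those dynamic URLs (Google honors `*` in robots.txt). And for the weekly check, here's a rough Python sketch, with hypothetical file names, that diffs a fresh referring-domains export against what's already in the disavow file and appends anything new:

```python
# Rough sketch of the weekly check: compare this week's referring-domains
# export against the domains already disavowed and append anything new.
# File names are hypothetical placeholders.

def normalise(domain):
    d = domain.strip().lower()
    return d[4:] if d.startswith("www.") else d

def read_disavowed(path="disavow.txt"):
    domains = set()
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line.startswith("domain:"):
                domains.add(normalise(line[len("domain:"):]))
    return domains

def append_new(export_path="referring_domains_this_week.txt", disavow_path="disavow.txt"):
    known = read_disavowed(disavow_path)
    with open(export_path, encoding="utf-8") as f:
        fresh = {normalise(line) for line in f if line.strip()}
    new = sorted(fresh - known)
    with open(disavow_path, "a", encoding="utf-8") as out:
        for d in new:
            out.write(f"domain:{d}\n")
    return new

if __name__ == "__main__":
    added = append_new()
    print(f"Added {len(added)} new domains; re-upload disavow.txt in Webmaster Tools.")
```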
Just a quick heads-up: after disavowing the links you may drop in rankings, since you're effectively removing them, but there's also a chance you'll go up if you're currently under an algorithmic penalty.
You can find some good tips here too - http://www.searchenginejournal.com/combat-recover-negative-seo-attack-survival-guide/114507/
Hope some of that helps. I wish I could have posted my original reply, but I don't have the time to rewrite it, I'm afraid. Good luck to you!
-
I have a lot going on right now, but if you PM me the domain, I can take a look in a week or so.