Recovering from Black Hat/Negative SEO with a twist
-
Hey everyone,
This is a first for me; I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.
Scenario
- In the process of designing a new site for a client, we discovered that his previous site, although it had decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely a vulnerability somewhere allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104, and so on. These dynamic pages ended up serving malware with a trojan horse that our servers recognized and subsequently blocked access to.
We have since helped them remedy the vulnerability and remove the malware that was creating these crappy dynamic pages.
- Another automated program appears to have recently been blasting spam links (mostly comment spam and directory links) at these dynamically created pages at an incredibly rapid rate, and is still actively doing so. Right now we're looking at a small-business website with a touch over 500k low-quality, spammy links pointing to malware pages on the previously compromised site.
Important: As of right now, there's been no manual penalty on the site, nor has a "This Site May Have Been Compromised" marker appeared in the organic search results for the site. We were able to discover this before things got too bad for them.
Next Steps?
The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and then potentially slap a penalty on the site. The main questions I have are:
- Should we report this proactively to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1)
- Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795)
- Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?
Would love some input or examples from anyone who can help, thanks in advance!
-
I never mentioned anything about Pigeon?
-
Um....IQ? Did you miss the Pigeon update of a couple of months ago?
There's been tons of talk about it; my own favorite is this post from Mike:
http://blumenthals.com/blog/2014/10/05/post-pigeon-geo-assessment-how-did-traffic-change-by-city/
-
**Should we report this proactively to the web spam team using the guidelines here?**
No
**Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results?**
No
**Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered? Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?**
Yes
This sounds to me like you need to be thinking 'damage limitation', and by submitting a disavow now, you will be doing just that. Don't worry about the fact that there are so many domains; that is exactly what the tool is for. However, Penguin hasn't had a refresh in some time (around 12 months), so while you may have time on your side to fix this, a refresh could be around the corner, so hop on it.
-Andy
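For reference, the disavow file Andy recommends is just a plain text file uploaded via Search Console's Disavow Links tool. As a rough sketch (the domains below are made up for illustration), disavowing at the domain level keeps the file far smaller than listing all 500k spam URLs individually:

```text
# Negative SEO cleanup - spam links pointing at hacked /?id= pages
# One entry per line; "domain:" disavows every link from that domain
domain:spammy-directory-1.example
domain:comment-spam-network.example
# A single URL can also be disavowed directly:
http://some-spam-blog.example/comments/page-12
```

Google's documented format: UTF-8 plain text, one entry per line, `domain:` prefix for whole domains, lines starting with `#` treated as comments.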
-
Sounds like fun!
I did write a lovely answer which unfortunately got lost, so I'll summarize briefly below:
1. I wouldn't recommend telling Google. You might not have a penalty now, but you might be tempting Google's wrath.
2. As you've not been flagged for malware and you've removed it, you should be fine, but you can always request a review if it helps you sleep better.
3. Disavowing proactively is a great idea, and Google likes this approach too. It also means that rather than hoping Google might ignore the links, it will definitely ignore them via the disavow list. Further to this, I've got two more suggestions for you. First, you can block the wildcard/dynamic pages in your robots.txt, which helps stop Google from even reaching them to discover the bad links (assuming you don't need those pages on your site). Second, check your referring domains weekly and update the disavow list if you're still "under attack".
Just a quick heads up: after disavowing the links you may drop in rankings, since you're removing links; however, there's also a chance you'll go up if you're currently under an algorithmic penalty.
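The robots.txt blocking suggested above could look like the following, assuming the spam targets are all query-string URLs on the root such as /?id=102 (the pattern here is a guess; adjust it to match the actual hacked URLs):

```text
User-agent: *
Disallow: /*?id=
```

One caveat: robots.txt only blocks crawling, so heavily linked URLs can still appear in the index. Since the malware pages have already been removed, letting them return 404/410 so Google drops them on recrawl is another option worth weighing.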
You can find some good tips here too - http://www.searchenginejournal.com/combat-recover-negative-seo-attack-survival-guide/114507/
Hope some of that helps, and I wish I could have posted my original reply, but I don't have the time to rewrite it, I'm afraid. Good luck to you!
-
I have a lot going on right now, but if you PM the domain, I can take a look in a week or so.