Recovering From Black Hat SEO Tactics
-
A client recently engaged my services to deliver foundational white hat SEO. During the site audit, I discovered a tremendous number of black hat tactics employed by their former SEO company. I'm concerned that the old company's efforts, including forum spamming, irrelevant backlink development, exploiting code vulnerabilities on BBs, and other messy practices, could negatively influence the target site's campaigns for years to come. The site owner handed over hundreds of pages of paperwork from the old company detailing its black hat SEO efforts. The sheer volume of data is overwhelming. I took just one week of reports and tracked back the links, finding that 10% of the accounts were banned, 20% were tagged as abusive, some of the sites had been shut down completely, and there were WOT reports of abusive practices and blacklisting notices for the site on BB control programs.
My question is simple. How does one mitigate the negative effects of old black hat SEO efforts and move forward with white hat solutions when faced with hundreds of hours of black gunk to clean up? Is there a clean way to eliminate the old efforts without contacting every site administrator and requesting removal of content/profiles? This seems daunting, but my client is a wonderful person who got in over her head, paying for a service that she did not understand. I'd really like to help her succeed.
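To get a handle on the volume, I've started keying the paper reports into a spreadsheet and tallying link statuses with a small script. A minimal sketch - the (url, status) format and the status labels are my own assumptions about how the data gets entered, not the old company's format:

```python
# Tally the status of old backlink placements from the paper reports,
# so the cleanup can be prioritized by category.
from collections import Counter

def summarize_link_report(rows):
    """rows: iterable of (url, status) pairs. Returns percent per status."""
    counts = Counter(status for _url, status in rows)
    total = sum(counts.values())
    return {status: round(100 * n / total, 1) for status, n in counts.items()}

week_one = [
    ("http://forum-a.example/profile/123", "banned"),
    ("http://forum-b.example/thread/456", "abusive"),
    ("http://forum-c.example/post/789", "live"),
    ("http://forum-d.example/user/321", "site_down"),
]
print(summarize_link_report(week_one))  # each status appears once, so 25.0 each
```

That at least turns "insurmountable paperwork" into percentages I can act on.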
Craig Cook
http://seoptimization.pro
info@seoptimization.pro -
Thanks for the reply, Irving. Unfortunately, 99% of the links lead to the home page.
I'm in the process of building quality links from respected directories. This includes filling out full profile information, map info, services, etc. I'm using original content on everything - rewriting the core keyword-heavy company profile with subtle changes on every site. I've registered the site for Bing Pro, Google Places, Merchant Circle, Manta.com, etc., and I'm making sure to tweet and FB the posts to encourage SM participation.
As far as on-page optimization goes, I'm waiting for the developer to roll out the new Joomla 1.7 structure. I've done all of the OPO, like selecting proper h-tags, generating all-new meta descriptions and titles, and reconstructing the information architecture in an Excel worksheet. I've rewritten a good majority of the site content to better reflect the business goals to both users and search engines. I can't wait to implement all of the behind-the-scenes work and watch the progress when it goes live.
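Before the new titles and meta descriptions go live, I've been running them through a quick sanity check. A rough sketch - the length cutoffs are commonly cited display-truncation rules of thumb, not official limits:

```python
# Flag missing or over-length titles and meta descriptions before launch.
# title_max/desc_max are rough rules of thumb, not official search engine limits.
def audit_meta(pages, title_max=60, desc_max=155):
    """pages: list of dicts with 'url', 'title', 'description' keys.
    Returns a list of (url, problem) tuples."""
    issues = []
    for page in pages:
        title = page.get("title", "")
        desc = page.get("description", "")
        if not title:
            issues.append((page["url"], "missing title"))
        elif len(title) > title_max:
            issues.append((page["url"], "title over %d chars" % title_max))
        if not desc:
            issues.append((page["url"], "missing meta description"))
        elif len(desc) > desc_max:
            issues.append((page["url"], "description over %d chars" % desc_max))
    return issues

print(audit_meta([{"url": "/services", "title": "", "description": "x" * 200}]))
```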
I guess my biggest concern is the long-term TrustRank implications of the domain being identified as a forum spammer and security exploiter. I've started the process of writing letters to the forum masters, apologizing for past transgressions and asking for profiles and links to be removed. I've got a 1% success rate on everything I've sent out so far. I think the best I can hope for at this point is that these very old forums upgrade their tech, purge non-participating users, and/or cease operations over time. Moving forward, I'm going to spend less time worrying about these old links and more time creating a healthy SM presence. Let's hope the search engines don't penalize the site too much for past mistakes...
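To make the letter-writing less painful, I've been generating the removal requests from the report spreadsheet instead of writing each one by hand. A minimal mail-merge sketch - the wording and field names are placeholders, not my actual letter:

```python
# Build one removal-request letter per spam placement from a list of
# (admin_name, link_url) pairs. Template wording is a placeholder.
from string import Template

REMOVAL_REQUEST = Template(
    "Hello $admin,\n\n"
    "A previous SEO vendor created the profile/link at $link for $target "
    "without the owner's understanding of the tactics involved. Could you "
    "please remove it? Apologies for the trouble, and thank you.\n"
)

def build_requests(target, placements):
    """placements: list of (admin_name, link_url) pairs; one letter per link."""
    return [REMOVAL_REQUEST.substitute(target=target, admin=admin, link=link)
            for admin, link in placements]

for letter in build_requests("example.com",
                             [("forum admin", "http://forum-a.example/profile/123")]):
    print(letter)
```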
-
Good advice,
-
You say "the service was basically a backlink building farm."
If links are your major concern, I ask you this - are these links pointing to your homepage or to internal pages? If internal pages, you can simply change the URL structure and 404 all of the old URLs; the links will no longer be pointing to your site, problem solved. If they did link building to the homepage, you would need to go through them and contact the webmasters to ask for removal, which is very time-consuming.
That being said, if you don't have a penalty from these links you might be OK, and 404'ing these pages would cut off your PR and could cause a major drop in rankings. Like Alan said, make sure your on page optimization is squeaky clean.
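To see how much of the link profile could be neutralized by a URL-structure change, you can partition the link targets first. A quick sketch, assuming you've exported the raw target URLs from a backlink report:

```python
# Partition backlink targets: deep links die with a URL-structure change
# (they 404), while homepage links have to be handled by webmaster outreach.
from urllib.parse import urlparse

def split_by_target(backlink_targets):
    """Returns (homepage_links, deep_links)."""
    home, deep = [], []
    for url in backlink_targets:
        if urlparse(url).path in ("", "/"):
            home.append(url)
        else:
            deep.append(url)
    return home, deep

home, deep = split_by_target([
    "http://www.example.com/",
    "http://www.example.com/old-landing-page",
    "http://www.example.com",
])
print(len(home), "homepage links;", len(deep), "could be neutralized by a 404")
```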
-
Thanks Alan,
The site structure is quite good as is, but we are in the process of updating from CMSMS to Joomla 1.7. In the process, we are addressing all on-page needs, w3c compliance, CSS issues, meta data, converting image text to text and removing all flash.
There were no on-page black hat tactics. The service was basically a backlink building farm. They were very successful in getting certain keywords to 1st-page rankings, but left a wake of destruction in their path. We found out that the company was actually sending the work to the Philippines, because the Filipino service provider started calling customers directly when the relationship went south. Turns out the mark-up on the Filipino services was over 1000%. I regret to say that the company providing these black hat services is an Inc. contributor and has been featured on many news programs, including CNN. I won't mention the name of the company, but it turns out the owner is more of a PR guy than a solutions guy, and a few of my current clients have suffered from their fly-by-night operations. Gives all of us a bad name...
-Craig
-
Is there any sign of penalty - a huge drop in rankings? If not, then I would not worry too much about the links; they may be worthless, but it is doubtful they are doing harm.
You can ask Google for reconsideration from GWMT, but I doubt that is necessary unless your rankings have shown a huge drop. It does not hurt to ask, though. I would be more worried about on-page stuff. Make sure your site is not hiding keywords and the like.
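A crude first pass for hidden text is to scan your templates for elements styled invisible inline. A sketch - this only catches inline styles, not rules buried in external stylesheets:

```python
# Crude hidden-text check: flag elements hidden with inline CSS.
# Won't catch external-stylesheet rules - a first pass only.
import re

HIDDEN_STYLE = re.compile(
    r'style\s*=\s*"[^"]*(?:display\s*:\s*none'
    r'|visibility\s*:\s*hidden'
    r'|font-size\s*:\s*0)',
    re.IGNORECASE,
)

def has_hidden_text(html):
    """True if any element carries an inline style that hides its content."""
    return HIDDEN_STYLE.search(html) is not None

print(has_hidden_text('<div style="display:none">stuffed keywords</div>'))  # True
print(has_hidden_text('<p>Visible, honest content</p>'))  # False
```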