Advanced Outside Perspective Requested to Combat Negative SEO
-
**Situation:** We are a digital marketing agency that has been doing SEO for 6 years. For many years, we maintained exceptional rankings and online visibility. However, I suppose with great rankings comes great vulnerability.
Last year, we became the target of a pretty aggressive and malicious negative SEO campaign from another SEO (or SEOs) in our industry - I'm assuming they're competitors.
Overnight, there were 10,000+ links built on various spam domains using the anchor text:
- negative marketing services
- poor seo
- butt crack
- kickass
- ... and more (see attached image)
The issues we face are:
- Time Investment - It takes an enormous amount of time and energy to contact each web admin for link removal.
- Hard to Keep Up - When we think we're getting somewhere, new links come out of the woodwork.
- Disavow Doesn't Work - Though we've tried to generally avoid the disavow tool, we've had to use it for a few domains. However, it's difficult to say how much effect, if any, it's had on the negative links.
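For what it's worth, the triage step (flagging links by spam anchor text from a backlink export) can be scripted rather than done by hand. A minimal sketch, assuming a CSV export with `url` and `anchor` columns (the column names are hypothetical; adjust them to whatever your backlink tool actually exports):

```python
import csv
from urllib.parse import urlparse

# Anchor texts from the attack that we treat as spam signals.
SPAM_ANCHORS = {"negative marketing services", "poor seo", "butt crack", "kickass"}

def disavow_candidates(csv_path):
    """Return the set of referring domains whose links use a known spam anchor."""
    domains = set()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["anchor"].strip().lower() in SPAM_ANCHORS:
                domains.add(urlparse(row["url"]).netloc)
    return domains
```

Each flagged domain can then become a `domain:spamsite.example` line in the disavow file, which covers every link from that domain rather than one URL at a time.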
As you can imagine, we've seen an enormous drop in organic traffic since this all started.
It's unfortunate that SEO has come to this point, but I still see a lot of value in what we do and hope that spammers don't completely ruin it for us one day.
Moz Community - I come to you seeking some new insight, advice, similar experiences or anything else that may help!
- Are there any other agencies that have experienced the same issue?
- Any new ways to combat really aggressive negative SEO link building?
Thanks everyone!
-
Felip3,
Thanks for the response. I agree - it's too easy for spammers to attack and destroy brands. I especially agree with, and have always been a proponent of, diversifying traffic sources. Of course, when the majority of website traffic is coming from Organic (as it was for us) and then drops off, it can still leave a bruise.
Thanks,
Mike
-
Chris,
Thanks for the response. We've heard about Link Detox but have never used it because we've never had this issue with our clients. However, the short-term investment for a single account ($299/mo) is nothing compared to the number of hours we've already put into trying to (somewhat) manually clean up our link profile. I think we'll give it a try and provide an update in a few months.
Thanks!
Mike
-
Unfortunately, Google created an easy way for competitors or people with bad intentions to destroy other people's businesses. If website "Y" is being outranked by website "Z", website Y, instead of improving its own link profile, has the option to buy bad links for its competitor. And if you offend someone, even someone with nothing to do with your industry, that person can still drop your rankings for less than $100.
Matt Cutts said that a simple disavow would work: http://www.youtube.com/watch?v=znOqJu3le2g
But from what you and many others are saying, it's not working.
When the update was released, Google knew these problems would happen, yet they didn't do anything to avoid the issue. In my humble opinion, Google just wants to sell more PPC.
I think the only way to build a solid web business is to have your traffic coming from many sources; otherwise you create a single point of failure. That's a complex task when you consider how big Google's market share is, but as web marketers we have to figure out a way to do it.
-
Bit of a short response, but have you considered automating this?
We've used Link Detox (http://www.linkdetox.com) for a few of our clients in the same situation, and the results have been good.
You can also specify all of your current disavowed links to avoid duplicating work.
Using that tool, you could automate the scan, and it would generate a disavow list each time.
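If you go the automated route, the "don't duplicate work" step is also easy to script yourself. A rough sketch (the filename is hypothetical) that merges newly flagged domains into an existing disavow file, skipping anything already disavowed:

```python
import os

def merge_disavow(disavow_path, new_domains):
    """Append only genuinely new domain: entries to a disavow file.

    Returns the list of entries actually added on this run.
    """
    current = set()
    if os.path.exists(disavow_path):
        with open(disavow_path) as f:
            # Ignore blank lines and "#" comments, which Google's format allows.
            current = {line.strip() for line in f
                       if line.strip() and not line.startswith("#")}
    additions = sorted({f"domain:{d}" for d in new_domains} - current)
    with open(disavow_path, "a") as f:
        for entry in additions:
            f.write(entry + "\n")
    return additions
```

Run it after each scan and the file only grows by the new offenders, so re-uploading it to Search Console never repeats work you've already done.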