Advanced Outside Perspective Requested to Combat Negative SEO
-
**Situation:** We are a digital marketing agency that has been doing SEO for 6 years. For many years, we maintained exceptional rankings and online visibility. However, I suppose with great rankings comes great vulnerability.
Last year, we became the target of a pretty aggressive and malicious negative SEO campaign from other SEOs in our industry - I'm assuming they're competitors.
Overnight, there were 10,000+ links built on various spam domains using the anchor text:
- negative marketing services
- poor seo
- butt crack
- kickass
- ... and more (see attached image)
The issues we face are:
- Time Investment - Enormous investment of time and energy to contact each web admin for link removal.
- Hard to Keep Up - When we think we're getting somewhere, new links come out of the woodwork.
- Disavow Doesn't Work - Though we've tried to generally avoid the disavow tool, we've had to use it for a few domains. However, it's difficult to say how much effect, if any, it's had on the negative links. (A sample disavow file is sketched below for reference.)
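For anyone following along, the disavow file itself is just a plain-text list uploaded through Google Webmaster Tools: one URL or `domain:` entry per line, with `#` marking comments. A minimal sample (the domains here are placeholders, not the actual attacking sites):

```text
# Disavow file - uploaded via Google Webmaster Tools.
# The domain: prefix disavows every link from that host,
# which is usually what you want for pure spam domains.
domain:spam-directory-example.info
domain:cheap-links-example.ru
# A single URL can also be disavowed on its own:
http://blog-network-example.com/spammy-guest-post.html
```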
As you can imagine, we've seen an enormous drop in organic traffic since this all started.
It's unfortunate that SEO has come to this point, but I still see a lot of value in what we do and hope that spammers don't completely ruin it for us one day.
Moz Community - I come to you seeking some new insight, advice, similar experiences or anything else that may help!
- Are there any other agencies that have experienced the same issue?
- Any new ways to combat really aggressive negative SEO link building?
Thanks everyone!
-
Felip3,
Thanks for the response. I agree - it's too easy for spammers to attack and destroy brands. I especially agree with, and have always been a proponent of, diversifying traffic sources. Of course, when the majority of website traffic is coming from organic search (as it was for us) and then drops off, it can still leave a bruise.
Thanks,
Mike
-
Chris,
Thanks for the response. We've heard about Link Detox but have never used it because we've never had this issue with our clients. However, the short-term investment for a single account ($299/mo) is nothing compared to the number of hours we've already put into trying to (somewhat) manually clean up our link profile. I think we'll give it a try and provide an update in a few months.
Thanks!
Mike
-
Unfortunately, Google created an easy way for competitors, or anyone with bad intentions, to destroy other people's businesses. If website "Y" is being outranked by website "Z", website Y, instead of improving its own link profile, has the option to buy bad links for its competitor. And if you offend someone who has nothing to do with your industry, that person can still drop your rankings for less than $100.
Matt Cutts said that a simple disavow would work: http://www.youtube.com/watch?v=znOqJu3le2g
But from what you and many others are saying, it's not working.
When the update was released, Google knew these problems would happen, yet they didn't do anything to avoid the issue. In my humble opinion, Google just wants to sell more PPC.
I think the only way to build a solid web business is to have your traffic coming from many sources; otherwise you create a single point of failure. That's a complex task when you consider how big Google's market share is, but as web marketers we have to figure out a way to do it.
-
Bit of a short response, but have you considered automating this?
We've used Link Detox (http://www.linkdetox.com) for a few of our clients in the same situation, and the results have been good.
You can also specify all of your current disavowed links to avoid duplicating work.
Using that tool, you could automate the scan, and it will generate a disavow list for you each time; a sketch of the merge step is below.
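To expand on the deduplication point: the merge itself is easy to script. Below is a minimal sketch in TypeScript (Node.js), assuming you have your current disavow file and a plain-text export of new candidate entries in the same format; the file names are made up for illustration.

```typescript
import { readFileSync, writeFileSync } from "fs";

// Parse a disavow-style file into a set of entries,
// skipping comments and blank lines.
function parseEntries(path: string): Set<string> {
  const entries = new Set<string>();
  for (const raw of readFileSync(path, "utf8").split(/\r?\n/)) {
    const line = raw.trim();
    if (line && !line.startsWith("#")) entries.add(line.toLowerCase());
  }
  return entries;
}

const existing = parseEntries("disavow.txt");       // already submitted to Google
const candidates = parseEntries("scan-export.txt"); // latest toxic-link export

// Keep only entries we haven't already disavowed.
const fresh = [...candidates].filter((e) => !existing.has(e));

if (fresh.length === 0) {
  console.log("Nothing new - the current disavow file already covers the scan.");
} else {
  const merged = [...existing, ...fresh].sort().join("\n") + "\n";
  writeFileSync("disavow-merged.txt", merged, "utf8");
  console.log(`${fresh.length} new entries added; upload disavow-merged.txt.`);
}
```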
Related Questions
-
WordPress Tags and SEO
Good morning everyone,

I am trying to decide how I am going to handle an issue on two WordPress websites. I recently acquired 2 new clients that used to do business with the same SEO company. Neither of the clients knows of the other, but both had the same story about said SEO company - the usual complaints, I won't get into details.

My issue is that the old SEO company was basically spamming keywords and utilized tags to do this. For each of these clients, they had very thin, spammy blog posts written and then used a multitude of spammy tags as keywords. Here is an example: https://captainjacksboatingschool.com/middlesex-county-boating-safety-class/ Each one of these tags is creating duplicate content.

How do I properly handle these tags? Do I delete them? Do I need to redirect them to one main page after deletion? I would much rather use plain-English, authority-based categories. In fact, I never use tags, only categories - they do not seem to have much SEO value. Both clients who were with this company have the same tag setup.

Any advice would be greatly appreciated, as I do not want to lose the clients' current rankings because I want to do things my way.

Thanks,
Don Silvernail
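If the tag archives are deleted, one common approach is a pattern redirect so the old tag URLs 301 to a single relevant page instead of 404ing. A minimal sketch for Apache's .htaccess, assuming the default WordPress /tag/ permalink base; the target page is a hypothetical placeholder:

```apache
# 301-redirect every old tag archive to a single relevant page.
# Assumes the default /tag/ base; the target page is a placeholder.
RedirectMatch 301 ^/tag/ /boating-safety-classes/
```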
-
How do I deal with Negative SEO (Spammy Links)?
For the past 12 months, our website has been hit by spammy links with annoying anchor text. We suspect one of our competitors is deploying negative SEO against us. The attached image shows an example of the sites and anchor text we have been spammed with. The frequency is about 1-2 spammy links a day. I have a few questions from here onwards:
- Do those links affect our SEO? (Most are mainly nofollow.)
- Other than disavowing, what else can I do?
- How will Google and other search engines view this incident?
-
How authentic is a dynamic footer from bots' perspective?
I have a very meta-level question. I was working on a dynamic footer for the website http://www.askme.com/ - you can check it in the footer. If you refresh the page and check the content, you'll see a different combination of links in every section. I'm calling it a dynamic footer here, as the values are absolutely dynamic in this case.

**Why are we doing this?** For every section in the footer, we have X number of links, but we can show only 25 links in each section. Here, the value of X can be greater than 25 as well (let's say X = 50). So, I'm randomizing the list of entries I have for a section and then picking 25 elements from it, i.e., 25 random elements from the list of entries every time you refresh the page.

**Benefits from an SEO perspective?** This will help me expose all the URLs to bots (across multiple crawls) and will add a page-freshness element as well.

**What's the problem, if there is one?** I'm wondering how bots will treat this, as at any time a bot might see us showing different content to bots and something else to users. Will bots consider this cloaking (a black-hat technique)? Or will bots not consider it a black-hat technique, since I'm refreshing the data every single time - even if it's a bot hitting me twice consecutively to check what I'm doing?
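For concreteness, the selection step described above boils down to a uniform random sample without replacement on each render. A minimal TypeScript sketch of that behaviour (the link list is hypothetical):

```typescript
// Fisher-Yates shuffle, then take the first `count` items:
// a uniform random sample without replacement.
function sampleLinks<T>(links: T[], count: number): T[] {
  const pool = [...links];
  for (let i = pool.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, Math.min(count, pool.length));
}

// e.g. a footer section with X = 50 URLs, 25 shown per page render:
const sectionUrls = Array.from({ length: 50 }, (_, i) => `/section/page-${i + 1}`);
const footerLinks = sampleLinks(sectionUrls, 25); // a different 25 on every refresh
```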
-
What do we need to consider for SEO when changing a WordPress theme?
Hi guys! We have a site that has been using a theme for a year now, and we decided to change to a new one. The question here is: does it affect SEO, or is it possible to keep the SEO 100% intact? What caution tips can you guys share for changing the theme? Is keeping the same URLs enough?
-
Sitelinks Search Box impact for SEO
I am wondering how the relatively new sitelinks search box impacts SEO rankings for a specific site or keyword combination - do you guys have any experience or benchmarks on this? Obviously it should help you get more real estate on the SERP (due to the added search box), but do you also get extra goodwill and an improved SERP position from adding it? Also, is the impact different for different types of terms - say, a single brand or category term such as "Bestbuy" (or "coupon") versus a combination term like "Bestbuy Apple" (or "Dixons coupon")? Thanks in advance!
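As a side note, for anyone who wants the search box alongside their sitelinks: Google's documented way to opt in is schema.org WebSite/SearchAction markup on the homepage (eligibility remains at Google's discretion). The standard JSON-LD snippet looks roughly like this, with example.com as a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "http://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "http://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```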
-
Are CDNs good or bad for SEO? - Edmonton Web
Hello Moz folks,

We just launched a new website: www.edmontonweb.ca. It is now ranking on page 2 in our city. The website is built on WordPress, and we have made every effort to make it load faster. We have enabled the right caching and we have reduced file sizes. Still, some of our local competitors have lower load times and, more importantly, lower TTFBs.

Is a CDN the right answer? I've read articles demonstrating that Cloudflare decreased a website's rankings. Is there a better CDN to use, or a proper way to implement Cloudflare?

Thank you very much for your help!

Anton,
LAUNCH Edmonton
-
Negative SEO attack working amazingly on Google.ca
We have a client, www.atvandtrailersales.com, who recently (in March) fell out of the rankings. We checked their backlink file and found over 100 spam links pointing at their website with terms like "uggboots" and "headwear" etc. I submitted a disavow file, as this was obviously an attack on the website.

Since the recent Panda update, the client has dropped out of the rankings again for a majority of keyword phrases. The disavow file that was submitted back in March contains 90% of the same links that are still spamming the website now. I've sent a spam report to Google and nothing has happened. I could submit a new disavow file, but I'm not sure if this is worth the time.

Thanks!
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js.

Dust.js is a JavaScript client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO-friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".

Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more

Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."

Basically, what would be happening on the backend of our site is that we would detect the user-agent of all traffic, and whenever we found a search bot, serve our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on.

But is this technique cloaking? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."

Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines?

Thank you in advance for ANY help with this!
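To make the mechanics concrete, here is a minimal sketch (TypeScript with Express; all names are hypothetical stand-ins, not the Dust.js API) of the user-agent branch described above. The cloaking question hinges on the two branches producing identical content, not on the branching itself:

```typescript
import express, { Request, Response } from "express";

const app = express();

// Rough bot detection by User-Agent substring (an illustrative list only).
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot/i;

function isBot(req: Request): boolean {
  return BOT_PATTERN.test(req.get("user-agent") ?? "");
}

// Stand-ins for the app's real data and template layers.
function loadProduct(id: string) {
  return { id, name: `Product ${id}` };
}

// Bots get the template rendered to full HTML on the server...
function renderServerSide(data: { id: string; name: string }): string {
  return `<html><body><h1>${data.name}</h1></body></html>`;
}

// ...browsers get the same data as JSON plus the client-side template bundle.
function renderClientShell(data: { id: string; name: string }): string {
  return `<html><body><div id="app"></div>
<script>window.__DATA__ = ${JSON.stringify(data)};</script>
<script src="/js/templates.js"></script></body></html>`;
}

app.get("/product/:id", (req: Request, res: Response) => {
  const data = loadProduct(req.params.id); // identical data either way
  res.send(isBot(req) ? renderServerSide(data) : renderClientShell(data));
});

app.listen(3000);
```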