Identifying a Negative SEO Campaign
-
Hi
A friend/client's site has recently dropped 2-3 pages (from an average #2-#3 position on page 1 over the last few months) for a primary target keyword. He suspects a negative SEO campaign and has asked me to look into it.
I checked on Removeem and the KW does not generate a red (or even a pink) result.
I looked at Ahrefs and Majestic SEO: backlinks and referring domains have dropped over the period the keyword dropped, so I presume I can be fairly sure it's not a negative campaign, since an attack would show the opposite pattern (as per articles like this: http://moz.com/blog/to-catch-a-spammer-uncovering-negative-seo). Also, the site has very few site-wide backlinks.
The keyword is a three-word phrase with two of those words in the domain and brand name, so I presume such keywords are relatively safe from negative SEO campaigns anyway.
I would have presumed the backlink/referring-domain drop may well explain the ranking drop, but the site is still in the first field of view of page 1 for the other keyphrases, two of whose three words are the same as the affected keyphrase (and also in the domain/brand name), so I would have thought these would have dropped too if it were a negative campaign. Also, many of the anchor texts in the disappeared backlinks are for one of the other partial-match variant keyphrases, which are still at the top of page 1.
Anchor text is at 4.35% for the affected keyword, according to Majestic SEO.
I'm pretty confident from the above that I can conclude no negative SEO campaign has occurred, nor any other type of penalty, and that this is probably just a 'wobble' at Google that may well right itself shortly.
I would appreciate feedback from others that I'm concluding correctly, though, just for confirmation.
Many Thanks
Dan
-
I would have thought that if that were the case it would affect the other partial-match variation rankings too, which it hasn't, since all the others are still on page 1!
The keyword in question (along with the other keywords still ranking on page 1) is a partial match with the domain.
-
Is it possible that there are so few links that losing a few causes a big change in ranking values? Was the keyword in question an exact or partial match with the domain name?
As far as accuracy, it can go either way. Majestic often reports links that existed but are now dead, but it can sometimes be more thorough. The only way to know for sure is to check the links and see if they're still live.
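If you want to automate that spot-check, a minimal sketch along these lines could work (the file name, column name, and domain below are placeholders, not your actual data; it assumes a CSV backlink export):

```python
# Minimal sketch: spot-check whether exported backlinks are still live.
# "majestic_export.csv", the "SourceURL" column and "example.com" are placeholders.
import csv
import urllib.request

TARGET_DOMAIN = "example.com"  # the client's domain (placeholder)

def link_still_live(source_url):
    """Fetch the linking page and check whether it still mentions the target domain."""
    try:
        req = urllib.request.Request(source_url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="ignore")
        return TARGET_DOMAIN in html
    except Exception:
        return False  # dead page, timeout, DNS failure, etc.

with open("majestic_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        url = row["SourceURL"]
        print(url, "LIVE" if link_still_live(url) else "DEAD/MISSING")
```

Even running that over a sample of the export will usually tell you whether one tool is reporting a lot of links that no longer exist.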
Panda is often site-wide, but it can apply to one or several page/keyword combinations, or even to all the pages with the same layout in a section.
-
Thanks for taking the time to comment, Carson.
One of the reasons the friend/client has hired me to look into this is that he can't retrieve any link data from GWT. I have asked him to check messages and the new manual-action notifications too.
If it were Panda, surely it would probably be site-wide, not keyword/page specific? It's the homepage that's dropped for a primary target keyword, while all other similar keyword rankings for the homepage are still on page 1.
Also, interestingly, there is a link growth spike of approx. 1,200 backlinks (but only an increase of 4 domains) over 3 days according to Majestic SEO, although Ahrefs reports a link and referring-domain drop over the same period. Any idea which is the more accurate/trustworthy data source? The data is directly contradictory, so one of those sources must be WAY off the mark!
cheers
dan
-
Panda would be site-wide, not keyword specific!
It was the homepage.
There is link growth of approx. 1,200 backlinks (but only an increase of 4 domains) over 3 days according to Majestic SEO (although Ahrefs reports a link and referring-domain drop over the same period).
-
Most claims of negative link efforts are either not link-based penalties, or something self-inflicted. I'd be fascinated to see one, but it doesn't sound like that's what's happening. This is the process I follow in diagnosing penalties:
The very first step is always to check webmaster tools for messages - or now you can check for manual penalties. Second, look at the date of drop and compare it with http://moz.com/google-algorithm-change. Try to figure out when the drop was and whether Google was making any updates at the time.
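If it helps to make that concrete, here's a minimal sketch of that date comparison (the drop date and the update entries below are made-up placeholders; fill them in from the Moz change-history page):

```python
# Minimal sketch: flag known algorithm updates near the ranking-drop date.
# The dates and update names below are placeholders, not real data.
from datetime import date, timedelta

drop_date = date(2013, 7, 5)      # approximate date the rankings fell (placeholder)
window = timedelta(days=7)        # how close an update must be to be worth investigating

known_updates = [                 # copy these from moz.com/google-algorithm-change
    (date(2013, 6, 27), "Example update A"),
    (date(2013, 7, 4), "Example update B"),
]

for update_date, name in known_updates:
    if abs(update_date - drop_date) <= window:
        print(name, "on", update_date, "falls within", window.days, "days of the drop")
```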
Link profile: look first for overly targeted, unbranded anchor text, as you did. Don't forget to pull the link info from Webmaster Tools and check for newer links via Moz's newly discovered links or Ahrefs.
Panda or other site/page quality updates: If you couldn't tie it to an announced Panda release, we just have to guess. Is there a heavy template? Are there a lot of pages targeting very similar terms? Is there any "form-letter-like content"? Is the organic bounce rate/time on site very bad?
Link profile part 2: Look through the linking sites. Check for links that are clearly ads but lack the nofollow attribute. Start with sites that have been knocked down to PR 0 despite having plenty of links, and look for paid links, especially of the site-wide or overly targeted variety.
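A rough sketch of checking a single linking page for followed (non-nofollow) anchors to your domain, using only the Python standard library, might look like this (the page URL and domain are placeholders):

```python
# Minimal sketch: fetch one linking page and list anchors pointing at your domain
# that lack rel="nofollow". The URLs and domain below are placeholders.
import urllib.request
from html.parser import HTMLParser

TARGET_DOMAIN = "example.com"  # the client's domain (placeholder)

class FollowedLinkFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = (attrs.get("rel") or "").lower()
        if TARGET_DOMAIN in href and "nofollow" not in rel:
            self.followed.append(href)

page_url = "http://linking-site.example/some-page"  # one URL from the backlink export (placeholder)
req = urllib.request.Request(page_url, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")

finder = FollowedLinkFinder()
finder.feed(html)
print("Followed links to the target domain:", finder.followed)
```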
Finally, remember that rankings can fluctuate without it being a penalty.
Google might suddenly "realize" that "Cavendish" is a biker in addition to a type of banana, and might also refer to a philosopher. They'll then push more diversity into the SERP for disambiguation ("query deserves diversity", or QDD), which will cause rankings to fluctuate wildly. Sometimes Google just devalues links that were helping you to rank - it's not a penalty, but it has the same effect. Sometimes we just don't know; the rankings might pop back, and they might not.
If you come up empty after all that diagnosis, you have only one choice: carry on building great content, optimize the design and structure for users, and work on building awareness and authority throughout the industry. It will pass eventually, and you'll come back stronger for having built value and done real marketing.
-
Could it be something else, like a Panda update?
I agree; a typical negative SEO campaign, in my mind, is a ton of easy-to-acquire links. I doubt anyone is going to take the time to email webmasters and have links pulled.
I would look at your content stats in GA for YTD and see if you can see any trends for the pages that lost rank (or was it the homepage?).
Unless the negative campaign is targeting individual pages, I would assume the whole site would be affected.
-
Also, does anyone know which is the more accurate data source, Majestic SEO or Ahrefs? I'm getting wildly conflicting data from both. Majestic SEO is now showing 845 links added in early July (which would indicate a negative link campaign), but Ahrefs shows 345 links lost over the same period!
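One thing I'm tempted to try is exporting the raw backlink lists from both tools and diffing them to see exactly where they disagree. A quick sketch of what I mean (the file names and column headers below are just guesses at what the CSV exports look like, not the tools' actual formats):

```python
# Minimal sketch: diff the Majestic and Ahrefs backlink exports to see where they disagree.
# File names and column headers are guesses, not the tools' actual export formats.
import csv

def load_urls(path, column):
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip() for row in csv.DictReader(f) if row.get(column)}

majestic = load_urls("majestic_export.csv", "SourceURL")
ahrefs = load_urls("ahrefs_export.csv", "Referring Page URL")

print("Only in Majestic:", len(majestic - ahrefs))
print("Only in Ahrefs:  ", len(ahrefs - majestic))
print("In both:         ", len(majestic & ahrefs))
```

Spot-checking a handful of the "only in Majestic" URLs should show whether they are genuinely new links or just dead pages still sitting in its index.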
-
Thanks, Irving.
Do you know of any free versions of Link Detox?
How will doing a link: www.yourdomain.com search help, since the results won't highlight site quality, will they?
All Best
Dan
-
That's a great answer, Robert; I really appreciate you taking the time to comment so helpfully.
I should have added that there was a big rise in backlinks at the beginning of May, which peaked and levelled off throughout June, then dropped from the beginning of July to date (according to Ahrefs data). So, in an otherwise natural-looking link growth curve from November last year to date, there is a huge hump or wave in the graph as links rise in May and then drop over July.
So if I had been looking into this in June it would, initially at least, have looked like it could well be a negative campaign, but the ranking drop has only occurred recently, correlating with the drop, not the rise, in links. If it were a negative campaign, I would have thought the rank drop would occur soon after the spike in link growth, not after a drop in links. Also, the link growth period is spread over a month (as is the period of the link drop), not a few days, which is what the article suggests one should look out for in a negative campaign, so I'm pretty confident that it's not one (which is why I didn't mention it originally, but thought it best to now, just in case).
When you say look at CTR, do you mean purely in regard to traffic from the affected keyword in the run-up to the rank drop? What kind of time period do you recommend - a week or more?
Cheers
Dan
-
Main keyword anchor text is sometimes not exploited in a negative SEO attack; it's just a massive volume of links from bad sites that harm the site in general. This can easily be detected with Link Detox, for example, or you can do a "link: www.yourdomain.com" search (with a space after the colon) in Google.
-
Dan,
First, good job on the linking evaluation; it sounds pretty thorough.
Without a complete picture it is hard to say, but based solely on what you have here, it doesn't appear to be something nefarious. I would add, however, that if I had a site with keywords ranking in spots 2-3 that then dropped to pages 3-4, and it lasted for more than a couple of days, I would not put it down to a Google 'wobble.'
I would suggest looking deeper to see what else is going on. Look into analytics and WMT for any trends, like falling CTR, etc. Look at changes in query or landing-page patterns. Look at the content, and then at the cached content for the same pages, to satisfy yourself that nothing changed on the page; then I would look at competitors who have been consistent during this period for differences between them and your client.
We have all seen a page move up or down in Google for 'no reason' from time to time, but if it lasts for more than a week, I would certainly be digging into everything. Waiting loses too much time if there is a problem.
Best,
Robert