A client/Spam penalty issue
-
Wondering if I could pick the brains of those with more wisdom than me...
Firstly, sorry, but I'm unable to give the client's URL in this topic. I know that won't help people give answers, but the client would prefer that this thread didn't appear when people type their name into Google.
Right, to cut a long story short: I gained a new client a few months back and did the usual things when starting the project, reviewing the backlinks using OSE and Majestic. There were a few iffy links, but I got most of those removed. In the last couple of months I have been building backlinks via guest blogging, using BloggerLinkUp and MyBlogGuest (and some industry-specific directories found using the Link Prospector tool).
All was going well; the client was getting about 2.5k hits a day on about 13k impressions. Then came the last Google update. The client was hit, but not massively: they seemed to drop from the top 3 for a lot of keywords to an average position of 5-8, so still first page. Traffic went down after this. All the sites which replaced the client were the big-name brands in the niche (home improvement, sites such as B&Q and Homebase, for the fellow UK'ers). This was annoying but understandable.
However, on 27th June we got the following message in WMT: "Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google's Webmaster Guidelines. As a result, Google has applied a manual spam action to xxxx.co.uk/. There may be other actions on your site or parts of your site."
This was a shock, to say the least. A few days later the traffic on the site went down further and impressions dropped to about 10k a day (oddly, the rankings seem to be where they were after the Google update, so perhaps it was a delayed message).
To bring things up to date: after digging around more, it appears there are a lot of SENuke-type links to the site - links on poor wiki sites and a lot of blog-commenting links, mostly from irrelevant sites. I enclose a couple of examples below. I have broken the links so they don't get any link benefit from this site. They are all safe for work:
http:// jonnyhetherington. com/2012/02/i-need-a-new-bbq/?replytocom=984
http:// www.acgworld. cn/archives/529/comment-page-3
In addition to this there is a lot of forum spam, links from porn sites and links from sites with Malware warnings. To be honest, it is almost perfect negative seo!!
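For anyone triaging a large backlink export for this kind of comment spam, a quick pattern check on the URLs saves a lot of manual review. Below is a minimal Python sketch; the footprint patterns and sample link list are illustrative assumptions, not an exhaustive rule set:

```python
import re

# Common automated comment-spam footprints in backlink URLs
# (illustrative, not exhaustive).
SPAM_FOOTPRINTS = [
    re.compile(r"[?&]replytocom=\d+"),   # WordPress reply-to-comment links
    re.compile(r"/comment-page-\d+"),    # paginated blog-comment archives
]

def looks_like_comment_spam(url: str) -> bool:
    """Return True if the URL matches a known comment-spam footprint."""
    return any(p.search(url) for p in SPAM_FOOTPRINTS)

# Hypothetical slice of a backlink export, echoing the examples above.
backlinks = [
    "http://jonnyhetherington.com/2012/02/i-need-a-new-bbq/?replytocom=984",
    "http://www.acgworld.cn/archives/529/comment-page-3",
    "http://example.co.uk/door-handles-guide/",
]

suspect = [u for u in backlinks if looks_like_comment_spam(u)]
print(suspect)  # the first two URLs are flagged; the editorial link is not
```

Anything flagged this way still deserves a manual look before you add it to a removal or disavow list.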
I contacted several of the sites in question (about 450) and requested they remove the links, but the vast majority of the sites have no contact details on them, so I cannot get the links removed. I did a disavow on these links and then a reconsideration request, but was told that this was unsuccessful as the site was still in violation.
Given that I can neither remove the links myself nor get Google to ignore them, my options for lifting this penalty are limited.
What course of action would others take, please?
Thanks, and sorry for the overly long post.
-
Thanks for the replies everyone, now comes the fun part where I have to crack on and work my way through 48,000 backlinks!
-
Yeh, tbh, you don't need to worry too much about nofollow links. The only thing I would do is check through some of the nofollow links to see if they are all blog comments made with an automated system. If so, there could be duplicate content issues that leave a footprint back to your site (i.e. within the spun comment). This isn't a major concern, but it's worth a little look - as a general rule, though, you don't need to worry.
Matt
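Spot-checking whether a link is actually nofollow doesn't have to be done by eye. A rough Python sketch using only the standard library, which parses a linking page's HTML you've already fetched (the sample HTML and `client-site.co.uk` domain are hypothetical):

```python
from html.parser import HTMLParser

class NofollowAuditor(HTMLParser):
    """Collect anchors pointing at a target domain and whether each is nofollow."""

    def __init__(self, target_domain: str):
        super().__init__()
        self.target = target_domain
        self.results = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        if self.target in href:
            rels = (attrs.get("rel") or "").lower().split()
            self.results.append((href, "nofollow" in rels))

# Hypothetical snippet of a linking page's HTML (fetch it however you like).
html = '<p><a href="http://client-site.co.uk/" rel="nofollow">door handles</a></p>'
auditor = NofollowAuditor("client-site.co.uk")
auditor.feed(html)
print(auditor.results)  # [('http://client-site.co.uk/', True)]
```

Run the same auditor over each linking page in your list and you get a quick follow/nofollow breakdown to prioritise removals.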
-
I agree with that, Carl. It's one thing if you're worried that Google might penalize you; maybe then you don't worry about the nofollows. However, once Google has already placed a manual penalty on the site, it's all about showing Google that you're not trying to game their system and that you're working hard to correct the situation. A bunch of links on spammy sites will still look bad to a reviewer even if they are nofollow. I'd try to get them removed as well, though I may not put as high a priority on them.
-Kurt
-
Thanks for the replies everyone, they are most welcome.
If I could trouble you with one sub-question before I mark this as solved: when cleaning up a dodgy backlink profile, what is the general view on nofollow links? Going through the client's links, they seem to have a fair few nofollow links from generic directories. Even though these shouldn't count toward the site's rankings, I have been asking people to remove them too. My view is that if I remove all the bad links, regardless of their follow status, it will show Google that I know what is right and what is wrong regarding the site.
Thanks, Carl
-
Yeah, Google definitely wants to see that you've put some effort into removing the links and that you aren't doing it anymore. It's also not uncommon for it to take several requests and several months.
-
No, sorry, I may have worded that poorly... the client used an SEO agency until a couple of months back, and it seems that although a lot of the spam links were posted between Dec and Feb, they are only now impacting the site. When I referred to negative SEO, I meant it more as a joke that the links look like the perfect example of a negative SEO campaign. I found some forum spam earlier on an Arsenal FC forum and a forum about psychological issues faced by transgender people. Both seemed fine sites in their own right, but one would have to question their value when linking to a door handle website!!
The initial (and thus far only) request was a very basic one, saying that we had received this penalty, we had hired a poor SEO company to look at our site, and it seems they spammed our domain. I told them I had disavowed several hundred domains, but I think it failed owing to the lack of proof of manual work. So, as suggested by Matthew (above), this time I will include a document showing who we contacted, when, the reply, and the current link status.
-
Yeh, I would recommend using BuzzStream for the data gathering, as it saves a heap of time - I also outsource it to freelancers on oDesk. You can do this on a very low budget and it speeds the whole process up.
With the Link Detox tool, importing all of your other link data is vital to getting a good reflection of the links. Good luck!
If you get really stuck, email me (my email is on my Moz profile) and I'll help you out more where I can.
Matt
-
In your request to Google, did you explain that you were not building these links and that it appears to be someone performing negative SEO on your client?
-Kurt Steinbrueck
OurChurch.Com -
Matthew,
Many thanks for the detailed reply. A short while ago I used the Link Detox tool; I didn't realise you can upload files to it, so I used their built-in bad link identifier. It has given me about 1,800 bad links which I am working through. Helpfully, a few of them are Blogger sites with no contact details!! I am managing to contact about 30% so far, so that's better than nothing.
I have read about using BuzzStream.com to try and pull the contact information for the other domains, and I will employ this once I have finished going through the list. So far I have documented the URLs and contact times in a spreadsheet. I must admit I didn't know you could link to a Google Doc in the reconsideration request, so the spreadsheet I am working through will provide a good start, especially if the 'removed' column starts to fill up!!
Thanks again
-
Hi Carl,
First step is to identify all of the links. Pull the full backlink data from OSE, Majestic SEO, Ahrefs and WMT. Compile all of them into one master spreadsheet and then upload it to the Link Detox tool (http://www.linkdetox.com/). This will give you a starting point for finding all of the toxic links - bear in mind that this is just a guide and you will still need to go over the links manually.
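Compiling the four exports into one deduplicated master list can be scripted. A rough sketch, assuming each export is a CSV with the linking URL in its first column (the real OSE/Majestic/Ahrefs/WMT exports use different headers, so adjust the column index per file):

```python
import csv

def merge_backlink_exports(paths, url_column=0):
    """Merge several backlink CSV exports into one deduplicated URL list.

    Assumes each export has the linking URL at index `url_column` and a
    single header row; map by header name instead if your exports differ.
    """
    seen = set()
    master = []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            next(reader, None)  # skip the header row
            for row in reader:
                if not row:
                    continue
                url = row[url_column].strip()
                if url and url not in seen:
                    seen.add(url)
                    master.append(url)
    return master

# Example usage (filenames are placeholders):
# master = merge_backlink_exports(["ose.csv", "majestic.csv", "ahrefs.csv", "wmt.csv"])
```

The deduplicated list is what you'd paste into the master spreadsheet or upload to Link Detox.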
Start gathering webmaster details and record EVERYTHING in a Google Docs spreadsheet. Record the webmasters' contact details, URL, date you contacted them, the date of the response, any action taken, etc. Spend a good month on link removal to get as many removed manually as possible.
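If you'd rather seed that tracking spreadsheet programmatically, a minimal sketch is below. The column names, filename, and the example row are all my own assumptions, not a required format - reviewers just need to see URL, contact, dates, and outcome:

```python
import csv
from datetime import date

# Hypothetical column layout for the removal-outreach log.
FIELDS = ["link_url", "webmaster_contact", "date_contacted",
          "date_of_response", "action_taken", "link_removed"]

# One row per spammy link you chase; this entry is illustrative.
rows = [{
    "link_url": "http://www.acgworld.cn/archives/529/comment-page-3",
    "webmaster_contact": "contact form on site",
    "date_contacted": date(2013, 7, 1).isoformat(),
    "date_of_response": "",
    "action_taken": "removal requested",
    "link_removed": "no",
}]

with open("removal_log.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

Import the resulting CSV into Google Docs so you can link it in the reconsideration request.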
Once this stage is complete you will need to disavow the rest of the links. Be careful here not to disavow genuinely good links. When it comes to the likes of SENuke links, you will want to disavow them at the domain level, e.g.:
domain:jonnyhetherington. com
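With hundreds of domains, it's easier to generate the disavow file from your spreadsheet data than to type it by hand. The file format is one entry per line - a full URL, or `domain:example.com` to disavow every link from that domain - with `#` lines as comments. A minimal sketch (the domain and URL lists are placeholders from your own audit):

```python
# Build a Google disavow file from audited spam domains and one-off URLs.
spam_domains = ["jonnyhetherington.com", "acgworld.cn"]   # placeholder audit output
spam_urls = ["http://someblog.example/spammy-page"]        # hypothetical one-off URLs

lines = ["# Disavow file - spam links we could not get removed manually"]
lines += [f"domain:{d}" for d in sorted(set(spam_domains))]   # domain-level entries
lines += sorted(set(spam_urls))                               # URL-level entries

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```

Upload the resulting `disavow.txt` through the disavow links tool for the affected site.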
After you have submitted a disavow file, submit a reconsideration request and let Google know all of the bad links that were pointing to the site, why they were there, and what you have done to rectify it - be explicit. Also, link to the Google Docs spreadsheet with all the details in it.
If you get a negative response back, then dig a little deeper into the links to disavow - most reconsideration requests get knocked back the first time, but ignore anyone who says "you can't recover", because you can. Just make sure that your client understands the implications of everything: they will see further dips in rankings and traffic before it gets better.
Hope this gives you a good starting point.