Does anyone have any suggestions on removing spammy links?
-
My plan is to put all the root domains into http://netpeak.net/software/netpeak-checker/ to check for homepage PR, status code, index status, PA, and DA. Then I'll put them into Buzzstream, which should go out and find the contact info for you. Then I'll grab all the links from each spammy domain and include them in the email to the webmaster to make them easier to remove. Hopefully this will make the process a little more efficient.
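The first step, collapsing a backlink export down to unique root domains before feeding the list into a checker, is easy to script. A rough sketch (the URLs are invented, and the naive "www." stripping does not handle multi-part hosts like sub.example.co.uk):

```python
from urllib.parse import urlparse

def root_domains(backlink_urls):
    """Collapse a list of backlink URLs to a sorted list of unique root domains."""
    domains = set()
    for url in backlink_urls:
        # netloc may include a port; drop it, lowercase, strip a leading "www."
        host = urlparse(url).netloc.lower().split(":")[0]
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return sorted(domains)

links = [
    "http://www.example-directory.com/links/page1.html",
    "http://example-directory.com/links/page2.html",
    "http://spammy-blog.net/post/123",
]
print(root_domains(links))
# ['example-directory.com', 'spammy-blog.net']
```

The deduped list is what you'd paste into the checker; note the two `example-directory.com` URLs collapse to one domain.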
-
I'm just using the free bit myself.
It's pretty new, but it seems to work well enough. It may well pull some wrong info (or maybe it just pulls the first info it finds).
**For the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.**
I doubt it's that clever; it's just aggregating data.
S
-
Thanks for sharing this tool, Stephen. I watched the video, but the site does not share any info about the mechanics of the tool. Some questions:
-
How is the contact info pulled? I am wondering if it sometimes misses info or pulls the wrong info.
-
For the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.
-
Any idea what the Agency pricing is?
I am just asking in case you happen to know some of this info. Otherwise I will reach out to the author.
Thanks again Stephen!
-
-
I've been using http://www.outreachr.com/bulk-domain-checker/ to pull data out of batches of URLs for this. It grabs link data from SEOmoz and then has a go at getting contact details, including Twitter, etc.
(Hope I don't kill his server while he's on holiday by posting this here.)
-
Yes.
In the first case I shared, the client actually performed all the website contacts. I offered guidance on what was required and the client ran with it.
If my team was going to perform the work, I would request a mailbox be set up on the client's domain which we could use for this process.
-
Ryan, are you using the client's email address? It seems it may get a better response rate.
-
I wouldn't bother doing anything based on PR; I would chase all backlinks that may appear inorganic.
-
We are left working with educated guesses. I would recommend a cleanup of spammy links for any client. If the client is not currently penalized, my judgment would be to focus only on sites listed in WMT which also have over 100 links pointing to the site.
Once the links have been cleaned up, I would check all client sites again after 30 days. Any client who exceeds 90% spam links clearly requires further effort. No one knows where the threshold lies, but it's a pretty good guess that if 90% of your links are spammy, you are not in a good place.
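The 90% figure is just the share of total backlinks coming from flagged domains, which is a one-liner to compute from per-domain link counts. A minimal sketch (the domain names and counts are invented for illustration):

```python
def spam_ratio(link_counts, spammy_domains):
    """Percent of total backlinks that come from domains flagged as spammy."""
    total = sum(link_counts.values())
    spam = sum(n for domain, n in link_counts.items() if domain in spammy_domains)
    return 100.0 * spam / total if total else 0.0

counts = {"goodblog.com": 10, "spamdir1.net": 450, "spamdir2.biz": 440}
print(spam_ratio(counts, {"spamdir1.net", "spamdir2.biz"}))  # ~98.9
```

Re-running this after each 30-day check gives a simple number to track cleanup progress against whatever threshold you settle on.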
-
Thanks Ryan, you've given me a lot to work with. Hell, if I get good at this I might just create a whole new service for my agency, lol.
Oh, one more question and then I'll leave you alone. What about sites that haven't been hit yet but have used similar tactics? Would you start this process for them, or cross your fingers?
-
Your process seems sound. A bit of additional feedback:
-
I would complete a Reconsideration Request, but then proceed without delay to removing the links. You know the site has spammy links, and they should be removed.
-
I have no familiarity with Netpeak Checker but I'll take a look at the tool. Otherwise I cannot comment on it.
-
The "resubmit to Google" step is not necessary. If they confirm the site has been manually penalized, they expect you to remove all the spammy links. I have talked with others in this situation, and Google is quite firm in their desire for you to address 100% of the problem. I would not bother submitting another Reconsideration Request until you have either removed all the manipulative links or can show solid documentation of your efforts to do so.
Good Luck.
-
-
Yeah, this all came right around Penguin, so I'm fairly certain it's related. They do have a lot of exact anchor text too, but for a wide variety of terms. They were also using blog networks and have spammy links, so it's really hard to pinpoint which of these, or whether all of them, are the problem.
At any rate should this be my process?
- Resubmit to Google
- See if they answer back and with what
- If no answer proceed with removal
- Get links from webmaster tools
- Parse out Root linking Domains
- Run through Netpeak Checker (awesome tool if you haven't used it); it finds PR, SEOmoz stats, Google index status, status code, etc.
- First, remove links from all live PR 0 pages
- Resubmit to Google
- Second, remove links from all deindexed PR 0 pages
- Resubmit to Google
- Get other link source data (Majestic SEO, Opensite Explorer)
- Remove PR 0 links
- Resubmit to Google
Hopefully that will do it. What do you think of this process? Oh, and thank you very much for your help. You're awesome.
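The pass ordering above (live PR 0 pages first, then deindexed PR 0 pages) is easy to automate once the checker data is exported. A sketch, assuming each checked domain is a dict with `pr`, `status_code`, and `indexed` fields (the field names are invented for this example, not Netpeak Checker's actual export format):

```python
def removal_passes(checked_domains):
    """Split checker results into the two PR 0 removal passes."""
    # Pass 1: PR 0 pages that are still live and indexed
    live_pr0 = [d for d in checked_domains
                if d["pr"] == 0 and d["indexed"] and d["status_code"] == 200]
    # Pass 2: PR 0 pages Google has dropped from the index
    deindexed_pr0 = [d for d in checked_domains
                     if d["pr"] == 0 and not d["indexed"]]
    return live_pr0, deindexed_pr0

sample = [
    {"domain": "spamdir1.net", "pr": 0, "status_code": 200, "indexed": True},
    {"domain": "spamdir2.biz", "pr": 0, "status_code": 200, "indexed": False},
    {"domain": "goodblog.com", "pr": 4, "status_code": 200, "indexed": True},
]
live, deindexed = removal_passes(sample)
print([d["domain"] for d in live])       # ['spamdir1.net']
print([d["domain"] for d in deindexed])  # ['spamdir2.biz']
```

Each pass then becomes its own outreach batch, with a resubmission to Google in between as the list describes.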
-
You can complete a Reconsideration Request. In the initial case, Google confirmed there was manual action taken. After the 100+ duplicate sites were taken down, Google then confirmed the remaining issue was due to the manipulative links.
With the recent Penguin update, Google may have automated part of this process into their algorithm.
-
Wow! I just have to give an expanded thanks (we don't have much room in the Endorsement area) for this detailed response. It's great to get some solid information about what it took to get a partial lifting of this penalty. It's certainly one I'll be sending other people to as an example of what to do.
-
**So Ryan in your opinion if they saw some major drops in rankings you would think it would be a safe bet that the site was penalized?**
Not necessarily. There are numerous issues which can cause ranking changes. A page could accidentally be blocked via "noindex" or robots.txt.
Diagnosing a problem normally requires the highest level of skill. When you go to see a doctor with a problem and he or she can't figure out the cause, you are stuck until another doctor comes up with the correct diagnosis. The pharmacy has all the right meds, but a diagnosis is required first. The same holds true for SEO. When your business or health is on the line, you don't want to play guessing games.
-
In my opinion, whether Google chooses to index a page or not is not a consideration. You should remove all spammy links. Google could choose to reindex the page at any time and either way, they can still see the page with your link on it.
If anyone else has any solid information on this topic I would love to hear it. Otherwise I vote to play it safe, especially in a penalty situation.
-
Got another question for you. Do we even bother trying to get links from deindexed sites taken down, or do you think Google takes those into account with the penalty?
-
So Ryan in your opinion if they saw some major drops in rankings you would think it would be a safe bet that the site was penalized?
They were also using blog networks that got shut down, so those links have obviously been deindexed and therefore have no value, which would drop the rankings anyway. That's the tricky part: is the drop in rankings because the blog networks are gone, or because the site is penalized?
-
Hi Ryan,
Great information.
We have had a tug of war with our SEO company, who built "unnatural links". They claim it is impossible to do the job.
I wonder if you can explain your line "...if you build links on disposable sites which are not monitored, you clearly won't find help having them removed" so that I can assess how possible it is to get our bad links removed.
-
Thanks for the feedback Robert.
The main site to which I refer had a manual action placed in November 2011. Looking back, I would say it was a prelude to Penguin. Over 99% of that site's links were manipulative, so it is pretty clear any reasonable threshold would have been triggered.
What surprised me was how determined Google was that all the links be removed, and the level of documentation required. It is possible I simply drew a hard-nosed Google employee, but I really believe Google's manual team is highly calibrated in these cases. I think back to the leaked Google Panda notes and the tests to become a Google tester: they are extremely calibration focused. That's my two cents. It's just speculation, but that would be my best guess.
-
Ryan,
This is impressive from the effort point of view alone; what sets it apart is your understanding of the need for documentation if you were to achieve success. So many sites had "SEO" firms do poor linking in the past and there was money to be made by just linking your junk to others. Unfortunately, many of these people went away or are of the type who would never take the time or energy to respond.
It would be interesting to know at what percentage of removal the manual overseer will deem the site sufficiently rehabilitated, on two levels:
The first is the obvious one: if a site can rehab to 35%, for example, the likelihood is Google will lift the manual action.
-
The second is that, even at the example percentage of 35%, is it fair to the sites that did not go down that road that the "rehabilitated" site still has 65% of its inorganic links?
A question arises as to what caused the manual action?
Is the action taken as the result of some fixed ratio of organic to inorganic links?
Or, is it at least a varying percentage based on a given industry?
My guess is it is subjective on the part of those attempting to manually validate a huge piece of real estate.
Thanks for the excellent detail, you are truly a champ.
Robert
-
-
Wow, great info Ryan. Is there a way to know for sure that a website has been penalized by Google and that this process needs to be started?
-
I have gained a lot of experience cleaning up spammy links over the past six months. This is the most time-consuming and unrewarding task in SEO. It is also necessary if your site has been manually penalized for "inorganic" links.
Does anyone have any suggestions on getting these removed?
I worked with Google on this topic for a client. My client's site was manually penalized specifically for "inorganic links". The client is an industry leader in their niche doing about $20 million in sales per year. They had millions of manipulative links pointed to their site from over 1000 linking root domains.
Step one: Google was absolutely firm in their expectation that the links be removed prior to the penalty being lifted. This client had over 100 websites which were various forms of their main site (i.e. duplicate content). All the duplicate-content sites were removed except the legitimate, human-translated language variations of the site. We reported these efforts to Google, which resulted in about 97% of the links being removed. Google responded that it was not enough and that they required the links from the other external sites to be removed.
Step two: we created an Excel spreadsheet to contact the sites, giving priority to the sites with the most links. We tracked the following information: date of contact, initials of the employee who performed the contact, URL of the domain, method of contact (e-mail / phone / contact us page), a link to a copy of each e-mail we sent (see notes below), the date a response was received (if any), and a confirmation of whether the link was still visible.
Regarding the e-mails which were sent, they were very polite, customized letters for each site. The letter format was as follows: introduction, description of the problem (i.e. our website has been penalized by Google...), the request to remove the links, the location (URL) of all known links within their domain, and thanks for their efforts.
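The tracking sheet and letter format described here are easy to mirror in a small script. A hedged sketch (the column names and letter text are illustrative, not the actual spreadsheet or e-mail that was used):

```python
import csv
from datetime import date
from string import Template

# Illustrative letter body; real outreach letters were customized per site.
LETTER = Template(
    "Hello,\n\n"
    "Our website has been penalized by Google for inorganic links.\n"
    "Could you please remove the following links found on $domain?\n\n"
    "$link_urls\n\n"
    "Thank you for your time and effort."
)

# Columns mirror the tracking fields described above.
COLUMNS = ["date_of_contact", "employee_initials", "domain",
           "contact_method", "email_body", "response_date", "link_still_visible"]

def log_contact(writer, initials, domain, method, link_urls):
    """Render the removal-request letter and append a tracking row."""
    body = LETTER.substitute(domain=domain, link_urls="\n".join(link_urls))
    writer.writerow({
        "date_of_contact": date.today().isoformat(),
        "employee_initials": initials,
        "domain": domain,
        "contact_method": method,
        "email_body": body,
        "response_date": "",          # filled in when/if they reply
        "link_still_visible": "",     # filled in on the follow-up check
    })
    return body

with open("outreach_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    body = log_contact(writer, "RK", "spamdir1.net", "e-mail",
                       ["http://spamdir1.net/links/42"])
```

The point is the audit trail: every named link maps back to a dated, attributed contact attempt, which is exactly the documentation Google asked for.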
In the end, we contacted hundreds of domains. The response rate was 14%. In this case, the company had these links built by another "SEO company", mostly between 2007 and 2009.
We reported our results to Google, shared the documentation and their response was:
"Thank you for your request and all of the follow up analysis. We've reviewed your case again, and unfortunately there are still many inorganic links pointing to the site. For example:..."
That led to step three: we went back to the original list of linking sites. For each and every site we covered four methods of contact: e-mail (if an address could be located), phone call (if a number could be located), the contact us page (if the site offered one), and the WHOIS record, using that contact information if it was different from what was previously available.
Additionally, we went ahead and contacted EVERY site that showed a link in Google WMT, even the hundreds of sites with only a single link. We knew most of these efforts would fail (14% rate of success) before starting, so our focus was on providing solid documentation. If Google named a link, we could present a copy of the e-mail sent requesting its removal, along with the date/time it was sent and who sent it. That was the goal.
After submitting this final information to Google, they "partially" removed the manual penalty. The site seemed to rank normally but not as well as before. Google's response:
"Hello Ryan,
Thank you for your follow up email and all of the information provided. The documentation you provided was very helpful in processing and understanding this case.
After re-evaluating your site’s backlinks we are able to partially revoke a manual action. There are still inorganic links pointing to your site that we have taken action on. Once you’ve been able to make further progress in getting these links removed, feel free to reply to this email with the details of your clean-up effort"
Another client was also penalized, but they had a single SEO company build most of their inorganic links. In this instance, the SEO company was able to remove almost all the links directly. They had control over many of the linking sites and had retained their usernames / passwords to forums, etc. The success rate of link removal clearly depends on how long ago the links were built, how spammy the sites are (i.e. if you build links on disposable sites which are not monitored, you clearly won't find help having them removed), and how the links were built.
Good Luck,
-Ryan