Does anyone have any suggestions on removing spammy links?
-
My plan is to put all the root domains into http://netpeak.net/software/netpeak-checker/ to check PR, status code, index status, PA and DA. Then put them into Buzzstream, which should go out and find the contact info for you. Then grab all the links from each spammy domain and include them in the email to the webmaster to make them easier to remove. Hopefully this will make the process a little more efficient.
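Before any of the tools come in, the link export has to be boiled down to unique root domains. A rough Python sketch of that step; the two-label heuristic below is an assumption and misparses multi-part TLDs like .co.uk, so a real tool would use the Public Suffix List:

```python
from urllib.parse import urlparse

def root_domain(url):
    """Reduce a full URL to its registrable-looking root domain.

    Naive heuristic: keep the last two host labels. This misparses
    multi-part TLDs like .co.uk; a real tool would consult the
    Public Suffix List instead.
    """
    host = urlparse(url).netloc.lower()
    host = host.split(":")[0]          # drop any port
    if host.startswith("www."):
        host = host[4:]
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def unique_root_domains(links):
    """Deduplicate a raw link export down to sorted root domains."""
    return sorted({root_domain(u) for u in links})
```

The deduplicated list is what gets pasted into the checker and Buzzstream.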
-
I'm just using the free bit myself.
It's pretty new, but it seems to work well enough. It may well pull some wrong info (or maybe it just pulls the first info it finds).
- for the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.
I doubt it's that clever; it's just aggregating data.
S
-
Thanks for sharing this tool Stephen. I watched the video but the site does not share any info about the mechanics of the tool. Some questions:
-
how is the contact info pulled? I am wondering if it sometimes misses info or pulls the wrong info
-
for the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.
-
any idea what the Agency pricing is?
I am just asking in case you happen to know some of this info. Otherwise I will reach out to the author.
Thanks again Stephen!
-
-
I've been using http://www.outreachr.com/bulk-domain-checker/ to pull data out of batches of URLs for this. It grabs link data from SEOmoz and then has a go at getting contact details, including Twitter etc.
(Hope I don't kill his server while he's on holiday by posting this here)
-
Yes.
In the first case I shared, the client actually performed all the website contacts. I offered guidance on what was required and the client ran with it.
If my team was going to perform the work, I would request a mailbox be set up on the client's domain which we could use for this process.
-
Ryan, are you using the client's email address? It seems that may get a better response rate.
-
I wouldn't bother doing anything based on PR; I would chase all backlinks that appear inorganic.
-
We are left working with educated guesses. I would recommend a cleanup of spammy links for any client. If the client is currently not penalized, my judgment would focus only on sites listed in WMT which also have over 100 links pointing to the site.
Once the links have been cleaned up, I would check all client sites again after 30 days. Any client who exceeds 90% spam links clearly requires further effort. No one knows where the threshold lies, but it's a pretty good guess that if 90% of your links are spammy you are not in a good place.
-
Thanks Ryan, you've given me a lot to work with. Hell, if I get good at this I might just create a whole new service for my agency, lol.
Oh, one more question and then I'll leave you alone. What about sites that haven't been hit yet, but have used similar tactics? Would you start this process for them, or cross your fingers?
-
Your process seems sound. A bit of additional feedback:
-
I would complete a Reconsideration Request but then proceed without delay to removing the links. You know the site has spammy links, and they should be removed.
-
I have no familiarity with Netpeak Checker but I'll take a look at the tool. Otherwise I cannot comment on it.
-
The "resubmit to Google" step is not necessary. If they confirm the site has been manually penalized, they expect you to remove all the spammy links. I have talked with others in this situation and Google is quite firm in their desire for you to address 100% of the problem. I would not bother submitting another Reconsideration Request until you have either removed all the manipulative links, or can show solid documentation of your efforts to do so.
Good Luck.
-
-
Yeah, this all came right around "Penguin" so I'm fairly certain it's related. They do have a lot of exact anchor text too, but for a wide variety of terms. They were also using blog networks, and have spammy links, so it's really hard to pinpoint which of these, or whether all of them, are the problem.
At any rate should this be my process?
- Resubmit to Google
- See if they answer back and with what
- If no answer proceed with removal
- Get links from webmaster tools
- Parse out Root linking Domains
- Run through Netpeak Checker (an awesome tool if you haven't used it), which finds PR, SEOmoz stats, Google index status, status code, etc.
- First remove links from all PR 0 pages that are still live
- Resubmit to Google
- Second remove all deindexed PR 0
- Resubmit to Google
- Get other link source data (Majestic SEO, Opensite Explorer)
- Remove PR 0 links
- Resubmit to Google
Hopefully that will do it. What do you think of this process? Oh, and thank you very much for your help. You're awesome.
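The staged passes above (live PR 0 first, deindexed PR 0 second) can be sketched in a few lines; a rough Python sketch, assuming the checker output has been parsed into dicts whose field names (`pr`, `indexed`) are made up for this example:

```python
def removal_passes(domains):
    """Split checker output into the staged removal passes described above.

    Each record is assumed to look like:
    {"domain": "spam.example", "pr": 0, "indexed": True, "status": 200}
    (field names are illustrative, not from an actual tool export).
    """
    pass_one = [d for d in domains if d["pr"] == 0 and d["indexed"]]       # live PR 0
    pass_two = [d for d in domains if d["pr"] == 0 and not d["indexed"]]   # deindexed PR 0
    remaining = [d for d in domains if d["pr"] > 0]                        # everything else
    return pass_one, pass_two, remaining
```

Each returned list maps to one outreach-and-resubmit cycle in the plan above.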
-
You can complete a Reconsideration Request. In the initial case, Google confirmed there was manual action taken. After the 100+ duplicate sites were taken down, Google then confirmed the remaining issue was due to the manipulative links.
With the recent Penguin update, Google may have automated part of this process into their algorithm.
-
Wow! I just have to give an expanded thanks (we don't have much room in the Endorsement area) for this detailed response. It's great to get some solid information about what it took to get a partial lifting of this penalty. It's certainly one I'll be sending other people to as an example of what to do.
-
**So Ryan, in your opinion, if they saw some major drops in rankings would it be a safe bet that the site was penalized?**
Not necessarily. There are numerous issues which can cause ranking changes. A page could accidentally be blocked via "noindex" or robots.txt.
Diagnosis of a problem normally requires the highest level of skill. When you go to see a doctor with a problem and he or she can't figure out the cause, you are stuck until another doctor comes up with the correct diagnosis. The pharmacy has all the right meds, but a diagnosis is required. The same holds true for SEO. When your business or health is on the line, you don't want to play guessing games.
-
In my opinion, whether Google chooses to index a page or not is not a consideration. You should remove all spammy links. Google could choose to reindex the page at any time and either way, they can still see the page with your link on it.
If anyone else has any solid information on this topic I would love to hear it. Otherwise I vote to play it safe, especially in a penalty situation.
-
Got another question for you. Do we even bother trying to get links from deindexed sites taken off or do you think Google takes those into account with the penalty?
-
So Ryan in your opinion if they saw some major drops in rankings you would think it would be a safe bet that the site was penalized?
They were also using blog networks that got shut down, so those links have obviously been deindexed and therefore have no value, which would drop the rankings anyway. That's the tricky part: is the drop in rankings because the blog networks are gone, or because the site is penalized?
-
Hi Ryan,
Great information.
We have had a tug of war with our SEO company, who built "unnatural links". They claim it is impossible to do the job.
I wonder if you can explain your line "...if you build links on disposable sites which are not monitored, you clearly won't find help having them removed) and how the links were built" so that I can assess how possible it is to get our bad links removed.
-
Thanks for the feedback Robert.
The main site to which I refer had a manual action placed in November 2011. Looking back, I would say it was a prelude to Penguin. Over 99% of the site's links were manipulative, so it is pretty clear any reasonable threshold would have been triggered.
What surprised me was how determined Google was that all the links be removed, and the level of documentation required. It is possible I simply dealt with a hard-nosed Google employee, but I really trust that Google's manual team has a high degree of calibration in these cases. I think back to the leaked Google Panda notes and the tests to become a Google tester: they are extremely calibration-focused. That's my two cents; it's just speculation, but that would be my best guess.
-
Ryan,
This is impressive from the effort point of view alone; what sets it apart is your understanding of the need for documentation if you were to achieve success. So many sites had "SEO" firms do poor linking in the past and there was money to be made by just linking your junk to others. Unfortunately, many of these people went away or are of the type who would never take the time or energy to respond.
It would be interesting to know at what percentage of removal the manual overseer will deem the site to be sufficiently rehabilitated, on two levels:
The first is the obvious one: if a site can rehab to 35%, for example, the likelihood is Google will lift the manual action.
-
The second being that, even at the example percentage of 35%, is it fair to the sites that did not go down that road that the "rehabilitated" site still has 65% of the inorganic links?
A question arises as to what caused the manual action?
Is the action taken as the result of some fixed ratio of organic to inorganic links?
Or, is it at least a varying percentage based on a given industry?
My guess is it is subjective on the part of those attempting to manually validate a huge piece of real estate.
Thanks for the excellent detail, you are truly a champ.
Robert
-
-
Wow great info Ryan. Is there a way to know for sure that a website has been penalized by Google and if this process needs to be started?
-
I have gained a lot of experience cleaning up spammy links over the past 6 months. This is the most time-consuming and unrewarding task in SEO. It is also necessary if your site has been manually penalized for "inorganic" links.
Does anyone have any suggestions on getting these removed?
I worked with Google on this topic for a client. My client's site was manually penalized specifically for "inorganic links". The client is an industry leader in their niche doing about $20 million in sales per year. They had millions of manipulative links pointed to their site from over 1000 linking root domains.
Step one: Google was absolutely firm in their expectation that the links be removed prior to the penalty being lifted. This client had over 100 websites which were various forms of their main site (i.e. duplicate content). All the duplicate content sites were removed except the legitimate, human-translated language variations of the site. We reported these efforts to Google, which resulted in about 97% of the links being removed. Google responded that it was not enough and required the links from the other external sites to be removed.
Step two of the process: we created an Excel spreadsheet to contact the sites, giving priority to the sites with the most links. We tracked the following information: date of contact, initials of the employee who performed the contact, URL of the domain, method of contact (e-mail / phone / contact-us page), a link to a copy of each e-mail we sent (see notes below), the date a response was received (if any), and a confirmation of whether the link was still visible.
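That tracking log doesn't need Excel; a minimal sketch of the same record-keeping as an append-only CSV, with column names that are illustrative rather than taken from the actual spreadsheet:

```python
import csv

# Column set mirrors the tracking fields listed above;
# the exact names here are made up for this sketch.
FIELDS = [
    "contact_date", "employee_initials", "domain",
    "contact_method", "email_copy_url",
    "response_date", "link_still_visible",
]

def log_contact(path, row):
    """Append one outreach attempt to the tracking CSV."""
    with open(path, "a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=FIELDS)
        if fh.tell() == 0:          # brand-new file: write the header first
            writer.writeheader()
        writer.writerow(row)
```

An append-only log like this is exactly the kind of dated, attributable record Google asked to see.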
Regarding the e-mails which were sent, they were very polite, customized letters for each site. The letter format was as follows: introduction, description of the problem (i.e. our website has been penalized by Google...), the request to remove links, the location (URL) of all known links within their domain, and a thank-you for their efforts.
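That letter format could be templated so each e-mail stays customized per site without being rewritten from scratch. A sketch with illustrative wording (this is not the actual letter the poster sent):

```python
from string import Template

# Hedged sketch of the letter structure described above:
# introduction, problem, request, link locations, thanks.
REMOVAL_REQUEST = Template("""\
Hello $site_name,

Our website has been penalized by Google for inorganic links,
and we are contacting every site that links to us.

Could you please remove the following link(s) to $our_domain:
$link_urls

Thank you very much for your time and help.
""")

def build_request(site_name, our_domain, link_urls):
    """Fill the template for one linking site."""
    return REMOVAL_REQUEST.substitute(
        site_name=site_name,
        our_domain=our_domain,
        link_urls="\n".join("  - " + u for u in link_urls),
    )
```

Listing every known link URL in the body, as the poster did, is what makes the request easy for the webmaster to act on.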
The results: we contacted hundreds of domains, and the response rate was 14%. In this case, the company had these links built by another "SEO company", mostly between 2007 and 2009.
We reported our results to Google, shared the documentation and their response was:
"Thank you for your request and all of the follow up analysis. We've reviewed your case again, and unfortunately there are still many inorganic links pointing to the site. For example:..."
That led to step three: we went back to the original list of linking sites. For each and every site we covered four methods of contact: e-mail (if an address could be located), phone call (if a phone number could be located), the contact-us page (if the site offered one), and we looked up their WHOIS information and used that method of contact if the information was different from what was previously available.
Additionally, we went ahead and contacted EVERY site which showed a link in Google WMT, even the hundreds of sites with only a single link. We knew most of our efforts would fail (14% success rate) prior to starting, so our focus was providing solid documentation. If Google named a link, we could present a copy of the e-mail request sent to remove it, the date/time it was sent, and who sent it. That was the goal.
After submitting this final information to Google, they "partially" removed the manual penalty. The site seemed to rank normally but not as well as before. Google's response:
"Hello Ryan,
Thank you for your follow up email and all of the information provided. The documentation you provided was very helpful in processing and understanding this case.
After re-evaluating your site’s backlinks we are able to partially revoke a manual action. There are still inorganic links pointing to your site that we have taken action on. Once you’ve been able to make further progress in getting these links removed, feel free to reply to this email with the details of your clean-up effort"
Another client was also penalized, but they had a single SEO company build most of their inorganic links. In this instance, the SEO company was able to remove almost all the links directly: they had control over many of the linking sites, and they had retained their usernames/passwords for forums, etc. The success rate of link removal clearly depends on how long ago the links were built, how spammy the sites are (i.e. if you build links on disposable sites which are not monitored, you clearly won't find help having them removed) and how the links were built.
Good Luck,
-Ryan