Does anyone have any suggestions on removing spammy links?
-
My plan is to put all the root domains into http://netpeak.net/software/netpeak-checker/ to check main-page PR, status code, index status, PA, and DA. Then I'll put them into BuzzStream, which should go out and find the contact info for you. Then I'll grab all the links from each spammy domain and include them in the email to the webmaster to make them easier to remove. Hopefully this will make the process a little more efficient.
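Here's a rough Python sketch of that grouping step, just to illustrate the idea; the "links.txt" input file (one backlink URL per line) and the liveness check are assumptions, not part of any particular tool:

```python
# Rough sketch: group individual spammy backlink URLs by root domain and
# check whether each domain still responds, so the per-domain link list can
# be pasted into the outreach e-mail. The "links.txt" input (one backlink
# URL per line) is an assumed format, not a real export.
from collections import defaultdict
from urllib.parse import urlparse

import requests

links_by_domain = defaultdict(list)
with open("links.txt") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        links_by_domain[domain].append(url)

# Largest offenders first, with a quick liveness check per domain.
for domain, urls in sorted(links_by_domain.items(), key=lambda kv: -len(kv[1])):
    try:
        status = requests.head(f"http://{domain}", timeout=10,
                               allow_redirects=True).status_code
    except requests.RequestException:
        status = None  # site unreachable or dead
    print(domain, status, f"{len(urls)} links to remove")
```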
-
I'm just using the free bit myself.
It's pretty new, but it seems to work well enough. It may well pull some wrong info (or maybe it pulls whatever info it gets to first).
- for the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.
I doubt it's that clever; it's just aggregating data.
S
-
Thanks for sharing this tool, Stephen. I watched the video, but the site does not share any info about the mechanics of the tool. Some questions:
-
How is the contact info pulled? I am wondering if it sometimes misses info or pulls the wrong info.
-
for the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.
-
Any idea what the Agency pricing is?
I am just asking in case you happen to know some of this info. Otherwise I will reach out to the author.
Thanks again Stephen!
-
-
I've been using http://www.outreachr.com/bulk-domain-checker/ to pull data out of batches of URLs for this. It goes and grabs link data from SEOmoz and then has a go at getting contact details, including Twitter, etc.
(Hope I don't kill his server while he's on holiday by posting this here.)
-
Yes.
In the first case I shared, the client actually performed all the website contacts. I offered guidance on what was required and the client ran with it.
If my team were going to perform the work, I would request that a mailbox be set up on the client's domain which we could use for this process.
-
Ryan, are you using the client's email address? It seems that might get a better response rate.
-
I wouldn't bother doing anything based on PR; I would chase all backlinks that may appear inorganic.
-
We are left working with educated guesses. I would recommend a cleanup of spammy links for any client. If the client is currently not penalized, my judgment would focus only on sites listed in WMT which also have over 100 links pointing to the site.
Once the links have been cleaned up, I would check all client sites again after 30 days. Any client who exceeds 90% spam links clearly requires further effort. No one knows where the threshold lies, but it's a pretty good guess that if 90% of your links are spammy you are not in a good place.
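To illustrate those two rules of thumb (the 100-link cutoff for unpenalized clients and the 90% spam share), here is a minimal sketch; the data structure and example domains are entirely hypothetical:

```python
# Minimal sketch of the two rules of thumb above. It assumes you already
# have a per-domain link count from WMT plus a manual spammy/clean call for
# each linking domain; the data below is entirely hypothetical.
domains = [
    {"domain": "example-directory.info", "links": 450, "spammy": True},
    {"domain": "industry-blog.com", "links": 12, "spammy": False},
    {"domain": "bookmark-farm.example", "links": 130, "spammy": True},
]

# Unpenalized client: only worry about spammy domains with 100+ links.
worth_contacting = [d for d in domains if d["spammy"] and d["links"] >= 100]

# Rough health check: what share of the link profile looks spammy?
total_links = sum(d["links"] for d in domains)
spam_links = sum(d["links"] for d in domains if d["spammy"])
print(f"{len(worth_contacting)} domains worth contacting, "
      f"{spam_links / total_links:.0%} of links look spammy")
```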
-
Thanks Ryan, you've given me a lot to work with. Hell, if I get good at this I might just create a whole new service for my agency, lol.
Oh, one more question and then I'll leave you alone. What about sites that haven't been hit yet but have used similar tactics? Would you start this process for them, or cross your fingers?
-
Your process seems sound. A bit of additional feedback:
-
I would complete a Reconsideration Request but then proceed without delay to removing the links. You know the site has spammy links, and they should be removed.
-
I have no familiarity with Netpeak Checker but I'll take a look at the tool. Otherwise I cannot comment on it.
-
The "resubmit to Google" is not necessary. If they confirm the site has been manually penalized, they are seeking for you to remove all the spammy links. I have talked with others in this situation and Google is quite firm on their desire for you to address 100% of the problem. I would not bother submitting another Reconsideration Request until you have either removed all the manipulative links, or can show solid documentation of your efforts to do so.
Good Luck.
-
-
Yeah, this all came right around Penguin, so I'm fairly certain it's related. They do have a lot of exact-match anchor text too, but for a wide variety of terms. They were also using blog networks and have spammy links, so it's really hard to pinpoint which of these (or whether all of them) is the problem.
At any rate, should this be my process?
- Resubmit to Google
- See if they answer back and with what
- If no answer proceed with removal
- Get links from webmaster tools
- Parse out Root linking Domains
- Run through Netpeak Checker (awesome tool if you haven't used it), which finds PR, SEOmoz stats, Google index status, status codes, etc.
- First, remove links from all pages that are PR 0 and still live (a rough filtering sketch follows this list)
- Resubmit to Google
- Second, remove links from all deindexed PR 0 pages
- Resubmit to Google
- Get other link source data (Majestic SEO, Opensite Explorer)
- Remove PR 0 links
- Resubmit to Google
Hopefully that will do it. What do you think of this process? Oh, and thank you very much for your help. You're awesome.
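For illustration, here is a rough sketch of the staged filtering in that list, assuming a CSV export from Netpeak Checker with hypothetical column names ("domain", "pr", "indexed", "status_code"); adjust to whatever the tool actually exports:

```python
# Rough sketch of the staged filtering in the list above. The CSV format and
# column names are assumptions made for illustration only.
import csv

first_pass, second_pass = [], []
with open("netpeak_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        pr_zero = row["pr"].strip() == "0"
        live = row["status_code"].strip() == "200"
        indexed = row["indexed"].strip().lower() in ("yes", "true", "1")
        if pr_zero and live and indexed:
            first_pass.append(row["domain"])   # live, indexed PR 0 pages: first round
        elif pr_zero and not indexed:
            second_pass.append(row["domain"])  # deindexed PR 0 pages: second round

print(len(first_pass), "domains in the first removal pass")
print(len(second_pass), "domains in the second removal pass")
```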
-
You can complete a Reconsideration Request. In the initial case, Google confirmed there was manual action taken. After the 100+ duplicate sites were taken down, Google then confirmed the remaining issue was due to the manipulative links.
With the recent Penguin update, Google may have automated part of this process into their algorithm.
-
Wow! I just have to give an expanded thanks (we don't have much room in the Endorsement area) for this detailed response. It's great to get some solid information about what it took to get a partial lifting of this penalty. It's certainly one I'll be sending other people to as an example of what to do.
-
**So Ryan in your opinion if they saw some major drops in rankings you would think it would be a safe bet that the site was penalized? **
Not necessarily. There are numerous issues which can cause ranking changes. A page could accidentally be blocked via "noindex" or robots.txt.
Diagnosis of a problem normally requires the highest level of skill. When you go to see a doctor with a problem and he or she can't figure out the cause, you are stuck until another doctor comes up with the correct diagnosis. The pharmacy has all the right meds, but a diagnosis is required. The same holds true for SEO. When your business or health is on the line, you don't want to play guessing games.
-
In my opinion, whether Google chooses to index a page or not is not a consideration. You should remove all spammy links. Google could choose to reindex the page at any time and either way, they can still see the page with your link on it.
If anyone else has any solid information on this topic I would love to hear it. Otherwise I vote to play it safe, especially in a penalty situation.
-
Got another question for you. Do we even bother trying to get links from deindexed sites taken off or do you think Google takes those into account with the penalty?
-
So Ryan in your opinion if they saw some major drops in rankings you would think it would be a safe bet that the site was penalized?
They were also using blog networks that got shut down, so those links have obviously been deindexed and therefore have no value, which would drop the rankings anyway. That's the tricky part: is the drop in rankings because the blog networks are gone, or because the site is penalized?
-
Hi Ryan,
Great information.
We have had a tug of war with our SEO company, who built "unnatural links". They claim it is impossible to do the job.
I wonder if you can explain your line "...if you build links on disposable sites which are not monitored, you clearly won't find help having them removed... and how the links were built" so that I can assess how possible it is to get our bad links removed.
-
Thanks for the feedback Robert.
The main site to which I refer had a manual action placed in November 2011. Looking back, I would say it was a prelude to Penguin. Over 99% of that site's links were manipulative, so it is pretty clear any reasonable threshold would have been triggered.
What surprised me was how determined Google was about all the links being removed, and the level of documentation required. It is possible I simply got a hard-nosed Google employee, but I really trust that Google's manual team has a high degree of calibration in these cases. I think back to the leaked Google Panda notes and the tests to become a Google tester; they are extremely calibration-focused. That's my two cents. It's just speculation, but that would be my best guess.
-
Ryan,
This is impressive from the effort point of view alone; what sets it apart is your understanding of the need for documentation if you were to achieve success. So many sites had "SEO" firms do poor linking in the past and there was money to be made by just linking your junk to others. Unfortunately, many of these people went away or are of the type who would never take the time or energy to respond.
It would be interesting to know at what percentage of removal the manual overseer will deem the site to be sufficiently rehabilitated, on two levels:
The first is the obvious one: if a site can rehab to, say, 35%, the likelihood is Google will lift the manual action.
-
The second: even at that example percentage of 35%, is it fair to the sites that did not go down that road that the "rehabilitated" site still keeps 65% of its inorganic links?
A question arises as to what caused the manual action?
Is the action taken as the result of some fixed ratio of organic to inorganic links?
Or, is it at least a varying percentage based on a given industry?
My guess is it is subjective on the part of those attempting to manually validate a huge piece of real estate.
Thanks for the excellent detail, you are truly a champ.
Robert
-
-
Wow great info Ryan. Is there a way to know for sure that a website has been penalized by Google and if this process needs to be started?
-
I have gained a lot of experience cleaning up spammy links over the past 6 months. This task is the most time consuming and unrewarding task in SEO. It is also necessary if your site has been manually penalized for "inorganic" links.
Does anyone have any suggestions on getting these removed?
I worked with Google on this topic for a client. My client's site was manually penalized specifically for "inorganic links". The client is an industry leader in their niche doing about $20 million in sales per year. They had millions of manipulative links pointing to their site from over 1,000 linking root domains.
Step one: Google was absolutely firm in their expectation that the links be removed before the penalty would be lifted. This client had over 100 websites which were various forms of their main site (i.e. duplicate content). All the duplicate content sites were removed except the legitimate, human-translated language variations of the site. We reported these efforts to Google, which resulted in about 97% of the links being removed. Google responded that this was not enough and that they required the links from the other external sites to be removed.
Step two of the process: we created an Excel spreadsheet to contact the sites, giving priority to the sites with the most links. We tracked the following information: date of contact, initials of the employee who made the contact, URL of the domain, method of contact (e-mail / phone / contact us page), a link to a copy of each e-mail we sent (see notes below), the date a response was received (if any), and a confirmation of whether the link was still visible.
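For anyone who prefers to script the log, here is a minimal sketch of the same tracking sheet kept as a CSV; the field names mirror the columns above, while the file name and example row are made up:

```python
# Minimal sketch of keeping that tracking log as a CSV. The field names
# mirror the columns above; the example row, file name and initials are
# made up for illustration.
import csv
from datetime import date

FIELDS = ["contact_date", "employee_initials", "domain", "contact_method",
          "email_copy_url", "response_date", "link_still_visible"]

with open("outreach_log.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    if f.tell() == 0:  # brand-new file: write the header row first
        writer.writeheader()
    writer.writerow({
        "contact_date": date.today().isoformat(),
        "employee_initials": "AB",
        "domain": "example-spammy-directory.info",
        "contact_method": "e-mail",
        "email_copy_url": "https://docs.example.com/removal-emails/example-spammy-directory",
        "response_date": "",
        "link_still_visible": "yes",
    })
```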
Regarding the e-mails which were sent, they were very polite, customized letters for each site. The letter format was as follows: introduction, description of the problem (i.e. our website has been penalized by Google...), the request to remove the links, the location (URL) of all known links within their domain, and a thank-you for their efforts.
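As an illustration only, here is a rough sketch of how letters with that structure could be assembled per domain; the wording and the helper name are hypothetical, not the actual letters that were sent:

```python
# Rough sketch of assembling customized letters per domain, following the
# structure above (introduction, problem, removal request, list of link
# URLs, thanks). Wording and helper name are illustrative only.
def build_removal_email(site_name: str, client_site: str, link_urls: list[str]) -> str:
    link_list = "\n".join(f"  - {url}" for url in link_urls)
    return (
        f"Hello {site_name} team,\n\n"
        f"I'm contacting you on behalf of {client_site}. Our website has been "
        "penalized by Google for inorganic links, and we are working to clean "
        "up our backlink profile.\n\n"
        "Could you please remove the following links to our site from your pages?\n\n"
        f"{link_list}\n\n"
        "Thank you very much for your time and help.\n"
    )

print(build_removal_email(
    "example-directory.info", "client-site.com",
    ["http://example-directory.info/resources/page-1",
     "http://example-directory.info/links/page-2"],
))
```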
In the end, we contacted hundreds of domains. The response rate was 14%. In this case, the company had these links built by another "SEO company", mostly between 2007 and 2009.
We reported our results to Google, shared the documentation and their response was:
"Thank you for your request and all of the follow up analysis. We've reviewed your case again, and unfortunately there are still many inorganic links pointing to the site. For example:..."
That led to step three: we went back to the original list of linking sites. For each and every site we covered four methods of contact: e-mail (if an address could be located), phone call (if a phone number could be located), the contact us page (if the site offered one), and the WHOIS record, which we used as a contact method if the information differed from what was previously available.
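If you want to script the WHOIS lookup, here is a sketch assuming the third-party python-whois package (pip install python-whois); registrar records vary widely, so treat any address it returns as a best-effort lead:

```python
# Sketch of scripting the WHOIS fallback, assuming the third-party
# "python-whois" package. Registrar data varies wildly, so treat any
# address found as a best-effort lead only.
import whois  # provided by the python-whois package

def whois_contact_emails(domain: str) -> list[str]:
    try:
        record = whois.whois(domain)
    except Exception:
        return []  # lookup failed or domain no longer registered
    emails = getattr(record, "emails", None) or []
    if isinstance(emails, str):  # the library may return a single string
        emails = [emails]
    return sorted({e.lower() for e in emails})

print(whois_contact_emails("example-spammy-directory.info"))
```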
Additionally, we went ahead and contacted EVERY site that showed a link in Google WMT, even the hundreds of sites with only a single link. We knew before starting that most of these efforts would fail (14% success rate), so our focus was on providing solid documentation. If Google named a link, we could present a copy of the e-mail request sent to remove it, along with the date/time it was sent and who sent it. That was the goal.
After submitting this final information to Google, they "partially" removed the manual penalty. The site seemed to rank normally but not as well as before. Google's response:
"Hello Ryan,
Thank you for your follow up email and all of the information provided. The documentation you provided was very helpful in processing and understanding this case.
After re-evaluating your site’s backlinks we are able to partially revoke a manual action. There are still inorganic links pointing to your site that we have taken action on. Once you’ve been able to make further progress in getting these links removed, feel free to reply to this email with the details of your clean-up effort"
Another client was also penalized, but they had a single SEO company build most of their inorganic links. In this instance, the SEO company was able to remove almost all the links directly: they had control over many of the linking sites, and they had retained their usernames / passwords for forums, etc. The success rate of link removal clearly depends on how long ago the links were built, how spammy the sites are (i.e. if you build links on disposable sites which are not monitored, you clearly won't find help having them removed), and how the links were built.
Good Luck,
-Ryan