Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Does anyone have any suggestions on removing spammy links?
-
My plan is to put all the root domains into http://netpeak.net/software/netpeak-checker/ to check main-page PR, status code, index status, PA and DA. Then put them into Buzzstream, which should go out and find the contact info for you. Then grab all the links from each spammy domain and include them in the email to the webmaster, to make them easier to remove. Hopefully this will make it a little more efficient.
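For the "grab all the links from each spammy domain" step, a rough sketch like this (plain Python, standard library only) could group exported backlink URLs by root domain so each webmaster email can list every known link on their site. The filename and the simple root-domain logic are just assumptions for illustration, not part of any of the tools mentioned:

```python
# Hypothetical sketch: group exported backlink URLs by their root domain
# so each webmaster email can list every known link on that domain.
from collections import defaultdict
from urllib.parse import urlparse

def root_domain(url):
    """Very rough root-domain extraction (ignores multi-part TLDs like .co.uk)."""
    host = urlparse(url).netloc.lower().split(":")[0]
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def group_links(backlink_urls):
    groups = defaultdict(list)
    for url in backlink_urls:
        groups[root_domain(url)].append(url)
    return groups

if __name__ == "__main__":
    # "backlinks.txt" is a placeholder: one backlink URL per line,
    # e.g. exported from Webmaster Tools or Open Site Explorer.
    with open("backlinks.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    for domain, links in sorted(group_links(urls).items(), key=lambda kv: -len(kv[1])):
        print(f"{domain}: {len(links)} links")
```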
-
I'm just using the free bit myself.
It's pretty new, but seems to work well enough. It may well pull some wrong info (or maybe it just pulls the info it gets to first).
- for the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.
I doubt it's that clever; it's just aggregating data.
S
-
Thanks for sharing this tool, Stephen. I watched the video, but the site does not share any info about the mechanics of the tool. Some questions:
-
How is the contact info pulled? I am wondering if it sometimes misses info or pulls the wrong info.
-
for the PR, does it always show the home page PR? Or does it calculate the PR for other pages by subtracting 1 for every click from the home page? I mainly ask so I can respond to client questions if they ever see the tool.
-
Any idea what the Agency pricing is?
I am just asking in case you happen to know some of this info. Otherwise I will reach out to the author.
Thanks again Stephen!
-
-
I've been using http://www.outreachr.com/bulk-domain-checker/ to pull data out of batches of URLs for this. It goes and grabs link data from SEOmoz and then has a go at getting contact details, including Twitter etc.
(Hope I don't kill his server while he's on holiday by posting this here)
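If anyone wants a feel for what the "has a go at getting contact details" part might be doing, here's a hedged sketch (not the tool's actual code) that scans a contact page for an email address and Twitter handle. It assumes the third-party requests library is installed and uses a placeholder URL:

```python
# Rough sketch of scraping a contact page for an email address and Twitter
# handle -- a guess at the kind of lookup a bulk contact-finder tool performs.
import re
import requests

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
TWITTER_RE = re.compile(r"twitter\.com/([A-Za-z0-9_]{1,15})")

def find_contacts(url):
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        return {"email": None, "twitter": None}
    emails = EMAIL_RE.findall(html)
    handles = TWITTER_RE.findall(html)
    return {
        "email": emails[0] if emails else None,
        "twitter": handles[0] if handles else None,
    }

# Example (hypothetical URL):
# print(find_contacts("http://example.com/contact"))
```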
-
Yes.
In the first case I shared, the client actually performed all the website contacts. I offered guidance on what was required and the client ran with it.
If my team was going to perform the work, I would request a mailbox be set up on the client's domain which we could use for this process.
-
Ryan, are you using the client's email address? It seems it may get a better response rate.
-
I wouldn't bother doing anything based on PR; I would chase all backlinks that may appear inorganic.
-
We are left working with educated guesses. I would recommend a cleanup of spammy links for any client. If the client is currently not penalized, my judgment would be to focus only on sites listed in WMT which have over 100 links pointing to the site.
Once the links have been cleaned up, I would check all client sites again after 30 days. Any client who exceeds 90% spam links clearly requires further effort. No one knows where the threshold lies, but it's a pretty good guess that if 90% of your links are spammy, you are not in a good place.
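If you wanted to automate that judgment call, a rough sketch might look like this. The CSV filename and the "domain", "links" and "spammy" column names are hypothetical; adapt them to however you export and label the data from WMT:

```python
# Illustrative sketch of the triage rule above: flag linking domains with more
# than 100 links that look spammy, and estimate the overall spam-link share.
import csv

def triage(csv_path, link_threshold=100):
    total_links = spam_links = 0
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            links = int(row["links"])
            is_spam = row["spammy"].strip().lower() == "yes"
            total_links += links
            if is_spam:
                spam_links += links
                if links > link_threshold:
                    flagged.append(row["domain"])
    spam_ratio = spam_links / total_links if total_links else 0.0
    return flagged, spam_ratio

if __name__ == "__main__":
    domains, ratio = triage("wmt_link_domains.csv")  # placeholder filename
    print(f"{len(domains)} domains to contact first; {ratio:.0%} of links look spammy")
```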
-
Thanks Ryan, you've given me a lot to work with. Hell, if I get good at this I might just create a whole new service for my agency, lol.
Oh one more question and then I'll leave you alone. What about sites that haven't been hit yet, but have used similar tactics? Would you start this process for them? Or cross fingers?
-
Your process seems sound. A bit of additional feedback:
-
I would complete a Reconsideration Request, but then proceed without delay to removing the links. You know the site has spammy links, and they should be removed.
-
I have no familiarity with Netpeak Checker but I'll take a look at the tool. Otherwise I cannot comment on it.
-
The "resubmit to Google" is not necessary. If they confirm the site has been manually penalized, they are seeking for you to remove all the spammy links. I have talked with others in this situation and Google is quite firm on their desire for you to address 100% of the problem. I would not bother submitting another Reconsideration Request until you have either removed all the manipulative links, or can show solid documentation of your efforts to do so.
Good Luck.
-
-
Yeah, this all came right around "Penguin", so I'm fairly certain it's related. They do have a lot of exact anchor text too, but for a wide variety of terms. They were also using blog networks, and have spammy links, so it's really hard to pinpoint which of these, or whether all of them, are the problem.
At any rate, should this be my process?
- Resubmit to Google
- See if they answer back, and with what
- If no answer, proceed with removal
- Get links from Webmaster Tools
- Parse out root linking domains
- Run through Netpeak Checker (an awesome tool if you haven't used it); it finds PR, SEOmoz stats, Google index status, status code, etc.
- First, remove all links from PR 0 pages that are still live/indexed
- Resubmit to Google
- Second, remove all links from deindexed PR 0 pages
- Resubmit to Google
- Get other link source data (Majestic SEO, Open Site Explorer)
- Remove PR 0 links
- Resubmit to Google
Hopefully that will do it. What do you think of this process? Oh, and thank you very much for your help. You're awesome.
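Just to illustrate the batching in that process, something like this rough sketch could split a checker export into those two PR 0 groups. The "domain", "pr" and "indexed" column names are assumptions for illustration, not Netpeak Checker's real export format:

```python
# Hypothetical sketch of the batching described above, driven from a checker
# export saved as CSV. Column names are assumed, not the tool's actual output.
import csv

def build_batches(csv_path):
    batch_live_pr0, batch_deindexed_pr0, remainder = [], [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            pr = int(row["pr"] or 0)
            indexed = row["indexed"].strip().lower() == "yes"
            if pr == 0 and indexed:
                batch_live_pr0.append(row["domain"])       # remove these first
            elif pr == 0 and not indexed:
                batch_deindexed_pr0.append(row["domain"])  # second pass
            else:
                remainder.append(row["domain"])            # revisit after resubmitting
    return batch_live_pr0, batch_deindexed_pr0, remainder
```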
-
You can complete a Reconsideration Request. In the initial case, Google confirmed there was manual action taken. After the 100+ duplicate sites were taken down, Google then confirmed the remaining issue was due to the manipulative links.
With the recent Penguin update, Google may have automated part of this process into their algorithm.
-
Wow! I just have to give an expanded thanks (we don't have much room in the Endorsement area) for this detailed response. It's great to get some solid information about what it took to get a partial lifting of this penalty. It's certainly one I'll be sending other people to as an example of what to do.
-
**So Ryan, in your opinion, if they saw some major drops in rankings, would it be a safe bet that the site was penalized?**
Not necessarily. There are numerous issues which can cause ranking changes. A page could accidentally be blocked via "noindex" or robots.txt.
Diagnosis of a problem normally requires the highest level of skill. When you go to see a doctor with a problem and he or she can't figure out the cause, you are stuck... until another doctor comes up with the correct diagnosis. The pharmacy has all the right meds, but a diagnosis is required. The same holds true for SEO. When your business or health is on the line, you don't want to play guessing games.
-
In my opinion, whether Google chooses to index a page or not is not a consideration. You should remove all spammy links. Google could choose to reindex the page at any time and either way, they can still see the page with your link on it.
If anyone else has any solid information on this topic I would love to hear it. Otherwise I vote to play it safe, especially in a penalty situation.
-
Got another question for you. Do we even bother trying to get links from deindexed sites taken off, or do you think Google takes those into account with the penalty?
-
So Ryan, in your opinion, if they saw some major drops in rankings, would it be a safe bet that the site was penalized?
They were also using blog networks that got shut down, so those links have obviously been deindexed and therefore have no value, which would drop the rankings anyway. That's the tricky part: is the drop in rankings because the blog networks are gone, or because the site is penalized?
-
Hi Ryan,
Great information.
We have had a tug of war with our SEO company, which built "unnatural links". They claim it is impossible to do the job.
I wonder if you can explain your line "...if you build links on disposable sites which are not monitored, you clearly won't find help having them removed... and how the links were built" so that I can assess how possible it is to get our bad links removed.
-
Thanks for the feedback Robert.
The main site to which I refer had a manual action placed on it in November 2011. Looking back, I would say it was a prelude to Penguin. Over 99% of this site's links were manipulative, so it is pretty clear any reasonable threshold would have been triggered.
What surprised me was how determined Google was about all the links being removed, and the level of documentation required. It is possible I simply dealt with a hard-nosed Google employee, but I really trust that Google's manual team has a high degree of calibration in these cases. I think back to the leaked Google Panda notes and the tests to become a Google tester; they are extremely calibration-focused. That's my two cents. It's just speculation, but that would be my best guess.
-
Ryan,
This is impressive from the effort point of view alone; what sets it apart is your understanding of the need for documentation if you were to achieve success. So many sites had "SEO" firms do poor linking in the past and there was money to be made by just linking your junk to others. Unfortunately, many of these people went away or are of the type who would never take the time or energy to respond.
It would be interesting to know at what percentage of removal the manual reviewer will deem the site to be sufficiently rehabilitated, on two levels:
The first being the obvious one: if a site can rehab to 35%, for example, the likelihood is Google will lift the manual action.
-
The second being that, even at the example percentage of 35%, is it fair to the sites that did not go down that road that the "rehabilitated" site still has 65% of its inorganic links?
A question arises as to what caused the manual action.
Is the action taken as the result of some fixed ratio of organic to inorganic links?
Or is it at least a varying percentage based on a given industry?
My guess is that it is subjective on the part of those attempting to manually validate a huge piece of real estate.
Thanks for the excellent detail, you are truly a champ.
Robert
-
-
Wow great info Ryan. Is there a way to know for sure that a website has been penalized by Google and if this process needs to be started?
-
I have gained a lot of experience cleaning up spammy links over the past 6 months. This task is the most time-consuming and unrewarding task in SEO. It is also necessary if your site has been manually penalized for "inorganic" links.
Does anyone have any suggestions on getting these removed?
I worked with Google on this topic for a client. My client's site was manually penalized specifically for "inorganic links". The client is an industry leader in their niche doing about $20 million in sales per year. They had millions of manipulative links pointing to their site from over 1000 linking root domains.
Step one: Google was absolutely firm in their expectation that the links be removed prior to the penalty being lifted. This client had over 100 websites which were various forms of their main site (i.e. duplicate content). All the duplicate content sites were removed except the legitimate, human-translated language variations of the site. We reported these efforts to Google, which resulted in about 97% of the links being removed. Google responded that it was not enough and that they required the links from the other external sites to be removed.
Step two of the process: we created an Excel spreadsheet to contact the sites, giving priority to the sites with the most links. We tracked the following information: date of contact, initials of the employee who performed the contact, URL of the domain, method of contact (e-mail / phone / contact-us page), a link to a copy of each e-mail we sent (see notes below), the date a response was received (if any), and confirmation of whether the link was still visible.
Regarding the e-mails which were sent, they were very polite, customized letters for each site. The letter format was as follows: introduction, description of the problem (i.e. our website has been penalized by Google...), the request to remove the links, the location (URL) of all known links within their domain, and a thank-you for their efforts.
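If it helps to visualize it, a rough sketch of that tracking-plus-email workflow might look something like this. The field names, filename and template wording are just illustrative assumptions, not the actual letters we sent, which were customized for each site:

```python
# Illustrative sketch only: build a link-removal request e-mail and append a
# tracking record like the spreadsheet described above.
import csv
import os
from datetime import date

TEMPLATE = """Hello,

Our website has been penalized by Google for inorganic links, and we are
working to clean up our backlink profile. Could you please remove the
following links to our site from your pages?

{link_list}

Thank you very much for your time and help.
"""

def build_email(links):
    return TEMPLATE.format(link_list="\n".join(f"- {url}" for url in links))

def log_contact(writer, domain, method, employee_initials):
    writer.writerow({
        "date_of_contact": date.today().isoformat(),
        "employee": employee_initials,
        "domain": domain,
        "method": method,            # e-mail / phone / contact-us page
        "response_received": "",     # filled in later
        "link_still_visible": "",    # re-checked later
    })

if __name__ == "__main__":
    fieldnames = ["date_of_contact", "employee", "domain", "method",
                  "response_received", "link_still_visible"]
    log_path = "contact_log.csv"  # placeholder filename
    new_file = not os.path.exists(log_path) or os.path.getsize(log_path) == 0
    with open(log_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        if new_file:
            writer.writeheader()
        print(build_email(["http://example.com/page-with-our-link"]))
        log_contact(writer, "example.com", "e-mail", "RK")
```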
The result was that we contacted hundreds of domains. The response rate was 14%. In this case, the company had these links built by another "SEO company", mostly between 2007 and 2009.
We reported our results to Google, shared the documentation and their response was:
"Thank you for your request and all of the follow up analysis. We've reviewed your case again, and unfortunately there are still many inorganic links pointing to the site. For example:..."
That led to step three: we went back to the original list of linking sites. For each and every site we covered four methods of contact: e-mail (if an address could be located), phone call (if a phone number could be located), the contact-us page (if the site offered one), and the WHOIS record, which we used as an additional contact method if the information was different from what was previously available.
Additionally, we went ahead and contacted EVERY site that showed a link in Google WMT, even the hundreds of sites with only a single link. We knew most of these efforts would fail (a 14% success rate) before starting, so our focus was providing solid documentation. If Google named a link, we could present a copy of the e-mail request sent to remove it, the date/time it was sent, and who sent it. That was the goal.
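For the WHOIS lookups, something along these lines is one way to automate pulling a registrant e-mail. This is a sketch only; it assumes the standard whois command-line tool is installed on the machine running it:

```python
# Sketch of the WHOIS fallback: shell out to the standard `whois` command
# (assumed to be installed) and grab the first e-mail address in the record.
import re
import subprocess

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def whois_contact_email(domain):
    try:
        result = subprocess.run(["whois", domain], capture_output=True,
                                text=True, timeout=30)
    except (OSError, subprocess.TimeoutExpired):
        return None
    match = EMAIL_RE.search(result.stdout)
    return match.group(0) if match else None

# Example (hypothetical domain):
# print(whois_contact_email("example.com"))
```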
After submitting this final information to Google, they "partially" removed the manual penalty. The site seemed to rank normally but not as well as before. Google's response:
"Hello Ryan,
Thank you for your follow up email and all of the information provided. The documentation you provided was very helpful in processing and understanding this case.
After re-evaluating your site’s backlinks we are able to partially revoke a manual action. There are still inorganic links pointing to your site that we have taken action on. Once you’ve been able to make further progress in getting these links removed, feel free to reply to this email with the details of your clean-up effort"
Another client was also penalized, but they had a single SEO company build most of their inorganic links. In this instance, the SEO company was able to remove almost all the links directly: they had control over many of the linking sites, and they had retained their usernames/passwords for forums, etc. The success rate of link removal clearly depends on how long ago the links were built, how spammy the sites are (i.e. if you build links on disposable sites which are not monitored, you clearly won't find help having them removed), and how the links were built.
Good Luck,
-Ryan
Related Questions
-
Do links from subdomains pass the authority and link juice of the main domain?
Hi, there is a subdomain whose root domain has a DA of 90. I can earn a backlink from that subdomain. This subdomain is fresh with no traffic yet. Do I get the ranking boost and authority from the subdomain? Example: I can earn a do-follow link from **https://what-is-crm.netlify.app/** but not from https://netlify.app
White Hat / Black Hat SEO | teamtc
Should You Link Back from Client's Website?
We had a discussion in the office today about whether it can help or hurt you to link back to your site from one that you optimize, host, or manage. A few ideas that were mentioned:
HURT:
1. The website is not directly related to your niche, therefore Google will treat it as a link exchange or spammy link.
2. Links back to you are often not surrounded by related text about your services, and look out of place to users and search engines.
HELP:
1. On good (higher PR, reputable domain) domains, a link back can add authority, even if the site is not directly related to your services.
2. Allows high-ranking sites to show users who the provider is, potentially creating a new client and a followed incoming link on anchor text you can choose.
So, what do you think? Test results would be appreciated, as we are trying to get real data. Benefits and cons if you have an opinion.
White Hat / Black Hat SEO | David-Kley
Should I Do a Social Bookmarking Campaign and a Tier 2 Linking?
I don't see anything bad in manually creating links on different (about 50) social bookmarking services. Is this method labeled as White Hat? I was wondering if it would be fine to create Tier 2 linking (probably blog comments) for indexing of the social bookmarking links? Please share your thoughts on the topic.
White Hat / Black Hat SEO | zorsto
Disavow - Broken links
I have a client who dealt with an SEO that created not-great links for their site: http://www.golfamigos.co.uk/ When I drilled down in Open Site Explorer there are quite a few links where the sites do not exist anymore, so I thought I could test Disavow out on them... maybe just about 6. Then we are building good quality links to try and tackle this problem with a more positive approach. I just wondered what the consensus was?
White Hat / Black Hat SEO | lauratagdigital
Off-page SEO and link building
Hi everyone! I work for a marketing company; for one of our clients' sites, we are working with an independent SEO consultant for on-page help (it's a large site) as well as off-page SEO. Following a meeting with the consultant, I had a few red flags with his off-page practices; however, I'm not sure if I'm just inexperienced and this is just "how it works" or if we should shy away from these methods. He plans to:
- guest blog
- do press release marketing
- comment on blogs
He does not plan to consult with us in advance regarding the content that is produced, or where it is posted. In addition, he doesn't plan on producing a report of what was posted where. When I asked about these things, he told me they haven't encountered any problems before. I'm not saying it was spammy, but I'm not sure if these methods are leaning in the direction of "growing out of date," or the direction of "black-hat, run away, dude." Any thoughts on this would be crazy appreciated! Thanks, Casey
White Hat / Black Hat SEO | CaseyDaline
Finding and Removing bad backlinks
Ok here goes. Over the past 2 years our traffic and rankings have slowly declined, most importantly for keywords that we ranked #1 and #2 at for years. With the new Penguin updates this year, we never saw a huge drop, but a constant slow loss. My boss has tasked me with cleaning up our bad links and reshaping our link profile so that it is cleaner and more natural. I currently have access to Google Analytics and Webmaster Tools, SEOmoz, and Link Builder.
1) What is the best program or process for identifying bad backlinks? What exactly am I looking for? Too many links from one domain? Links from low PR or low "Trust URL" sites? I have gotten conflicting information reading about all this on the net, with some saying that too many good links (high PR) can be unnatural without some lower-PR links, so I just want to make sure that I am not asking for links to be removed that we need to create or maintain our link profile.
2) What is the best program or process for viewing our link profile, and what exactly am I looking for? What constitutes a healthy link profile after the new Google algorithm updates? What is the best way to change it?
3) Where do I start with this task? Remove spammy links first, or figure out our profile first and then go after bad links?
4) We have some backlinks pointing to our old .aspx pages from before we moved to our new platform 2 years ago; there are quite a few (1000+). Some of these pages were redirected, and some of the redirects were broken at some point. Is there any residual juice in these backlinks still? Should we fix the broken redirects, or does it do nothing? My boss says the redirects won't do anything now that Google no longer indexes the old pages, but other people have said differently. What's the deal? Should we still fix the redirects even though the pages are no longer indexed?
I really appreciate any advice, as basically if we can't get our site and sales turned around, my job is at stake. Our site is www.k9electronics.com if you want to take a look. We just moved hosts, so there are some redirect issues and other things going on that we know about.
White Hat / Black Hat SEO | k9byron
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links are removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation:
1. Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site?
2. If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high quality sites. Thoughts?
White Hat / Black Hat SEO | highlyrelevant
Partners and Customers logo listing and links
We have just created a program where we list the customers that use our software and a link to their websites on a new "Customers" page. We expect to have upwards of 100 logos with links back to their sites. I want to be sure this isn't bordering on gray or black hat link building. I think it is okay since they are actual users of our software. But there is still that slight doubt. Along these same lines, would you recommend adding a nofollow or noindex tag? Thanks for your help.
White Hat / Black Hat SEO | PerriCline