Deny visitors by referrer in .htaccess to clean up spammy links?
-
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why.
Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links be removed.
After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link.
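For reference, referrer-based blocking in .htaccess typically looks something like this (a sketch assuming Apache with mod_rewrite enabled; `spammydomain.com` is a placeholder for the offending domain):

```apache
# Block visitors whose Referer header matches the spammy domain
# (spammydomain.com is a hypothetical example; requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?spammydomain\.com [NC]
RewriteRule .* - [F]
```

The `[F]` flag returns a 403 Forbidden. Note this only affects requests that actually send a matching Referer header, which is part of the open question below about whether Googlebot would ever be caught by it.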
I think there are two parts to the conversation...
-
Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site?
-
If for whatever reason this would not work, would a tweak to the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? It would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time, and allow us to focus on what Google wants: creating high-quality sites.
Thoughts?
-
-
I've thought about this idea to get rid of bad links from specific referrers.
I agree with the comments Ian has made, but when the affected page is the homepage it's not always possible to follow those steps. Has anyone else had any experience or information?
-
Hi Ian,
Thanks for the response. I agree with you that this feels wrong and I don't recommend anyone try this (unless you have a site that you don't mind using as a guinea pig).
The problem is that, if this is your home page or a strong category page with many legitimate links, rankings, etc., it will be tough to abandon the page altogether.
To follow up on the original idea, I have done some additional research and found a few mentions, though none of them exceptionally credible, suggesting that Googlebot does not pass referrer data to the server. Can anyone confirm?
-
Ooooh, that feels all wrong.
Seems to me .htaccess would block visitors that you may want. And while Google may obey the directive, they may also misinterpret a referrer-specific directive as cloaking.
If you want to get the same effect, here's what I'd do:
- Change the page to which the offending links point. Have it say "This page is gone, but you can get the information you want here" and make that a link to a new page.
- Set up your server to return a 410 code when folks visit that page.
- Set up a new page with the old page's content, so folks can click from the old to the new, but visiting bots and browsers get a 410 code and dump the page.
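The 410 setup described above might look something like this in .htaccess (a sketch; `/old-page` and `/gone.html` are hypothetical paths, and `ErrorDocument` lets you serve the "this page is gone" message with the link to the new page):

```apache
# Serve a custom "this page is gone" page as the body of the 410 response
ErrorDocument 410 /gone.html

# Return 410 Gone for the page the offending links point to
Redirect gone /old-page
```

Humans landing on `/old-page` see your custom message and can click through to the new page, while bots see the 410 status and drop the URL.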
That will get Google to de-index the page in a hurry.
Another option: Simply add noindex, nofollow to the targeted page.
But I like the 410 option, because that should break the authority flow and has the best chance of giving Google what it wants, short of removing the link.