Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Deny visitors by referrer in .htaccess to clean up spammy links?
-
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why.
Since the Penguin update a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum has been "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as the webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links be removed.
After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the links.
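For reference, referrer-based blocking in .htaccess is typically done with mod_rewrite rules along these lines (a hypothetical sketch only; `spammy-links-example.com` is a placeholder for an offending domain, and again, I'm not recommending anyone actually deploy this):

```apache
# Hypothetical sketch: deny visitors whose Referer header matches
# the offending domain. The domain name below is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?spammy-links-example\.com [NC]
RewriteRule .* - [F,L]
```

The `[F]` flag returns a 403 Forbidden. Worth noting: the Referer header is supplied by the client, so it is often absent or spoofed, which already limits how reliable this approach could be.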
I think there are two parts to the conversation...
-
Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site?
-
If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? It would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time, and allow us to focus on what Google wants: creating high-quality sites.
Thoughts?
-
-
I've thought about this idea to get rid of bad links from specific referrers.
I agree with the comments Ian has made, but when the affected page is the homepage it's not always possible to follow those steps. Has anyone else had any experience or information to share?
-
Hi Ian,
Thanks for the response. I agree with you that this feels wrong and I don't recommend anyone try this (unless you have a site that you don't mind using as a guinea pig).
The problem is that, if this is your home page or a strong category page with many legitimate links, rankings, etc., it will be tough to abandon the page altogether.
To follow up about the original idea, I have done some additional research and found a few mentions, though none of them exceptionally credible, of the fact that the Googlebot does not pass referrer data to the server. Can anyone confirm?
-
Ooooh, that feels all wrong.
Seems to me .htaccess would block visitors you may actually want. And while Google may obey the directive, they may also misinterpret a referrer-specific directive as cloaking.
If you want to get the same effect, here's what I'd do:
- Change the page to which the offending links point. Have it say "This page is gone, but you can get the information you want here" and make that a link to a new page.
- Set up your server to return a 410 code when folks visit that page.
- Set up a new page with the old page's content, so folks can click through from the old page to the new one, while bots and browsers visiting the old URL get a 410 code and drop the page.
That will get Google to de-index the page in a hurry.
Another option: Simply add noindex, nofollow to the targeted page.
But I like the 410 option, because that should break the authority flow and has the best chance of giving Google what it wants, short of removing the link.
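In .htaccess terms, the 410 approach above might look something like this (a sketch assuming Apache, with placeholder filenames: `/old-page.html` is the page the spammy links point to, and `/gone.html` is the short notice linking visitors to the replacement page):

```apache
# Hypothetical sketch; filenames are placeholders.
# Serve 410 Gone for the page the spammy links target...
Redirect gone /old-page.html
# ...and show a custom notice with a link to the replacement page.
ErrorDocument 410 /gone.html
```

The noindex alternative would instead be a `<meta name="robots" content="noindex, nofollow">` tag in the old page's head, which keeps the page live for visitors but asks Google to drop it from the index.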