Blocking spammy links
-
I have a client that Google has said, in response to a reconsideration request, has spammy links.
My question is: can I simply block those requests at the server, making the server drop the response, rather than go through the near-impossible process of getting the links removed?
Does anyone have any experience with this?
Thanks
-
Hey Alan -
Great question here. I honestly do not know if that htaccess trick will work. I've heard it postulated that it will, but have not seen anything conclusive on it! Unfortunately reinclusion requests take some time. If you think that you have been targeted unfairly, I'd recommend making as much noise publicly as possible. Get in front of Google and plead your case.
As far as I see it, here are your options for the spammy links -
-
If it's a KEYWORD-SPECIFIC penalty to that page, 404 the page to disavow the links and build a new page on your domain targeting that term. Build good links to this.
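As a sketch of that 404 option (the path is hypothetical; an Apache server with mod_alias is assumed), a single line in the site's .htaccess can make the penalized page return a 404 or 410:

```apache
# Hypothetical path -- substitute the actual penalized URL.
# Return 404 Not Found for the page the spammy links point at:
Redirect 404 /penalized-keyword-page.html

# Or, to signal permanent removal, return 410 Gone instead:
# Redirect gone /penalized-keyword-page.html
```

A 410 tells crawlers the page is intentionally gone, which some consider a stronger removal signal than a plain 404.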
-
If it's a sitewide drop in traffic, then you need to remove as many of the links as possible. There are tools out there to help with this, but generally you need to find low-PA/DA links with exact-match anchors. These are going to be the ones that are hurting you most. Also look for sitewide links with exact-match anchors that could be seen as manipulative.
-
Google wants to see "good faith" in getting links removed. So document all sites that you contact, the sites that get back to you, the links you get removed, and the links you know about that you have not been able to get removed, including the reasons why (i.e. 4 emails sent with no reply, cannot find an email on WhoIs, etc.). You have to be completely transparent.
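One minimal way to keep that documentation consistent (a hypothetical helper, not anything Google requires; a plain spreadsheet works just as well) is a small script that appends each outreach attempt to a CSV you can later share:

```python
import csv
import os
from datetime import date

# Hypothetical "good faith" log: one CSV row per site contacted,
# recording how many emails were sent, whether the link was removed,
# and why not if it wasn't.
FIELDS = ["date", "site", "emails_sent", "link_removed", "notes"]

def log_outreach(path, site, emails_sent, link_removed, notes):
    """Append one outreach attempt; write the header row on first use."""
    write_header = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "site": site,
            "emails_sent": emails_sent,
            "link_removed": link_removed,
            "notes": notes,
        })
```

The resulting CSV imports straight into the Google Doc or Sheet you attach to the reconsideration request.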
One tip for you is to share all of these links in a Google Doc, which you link to from the reinclusion request. Then, shorten the link within the request using a bit.ly link so that you can see if/when they look at the links.
Good luck man. I know how hard it is to wait on hearing back, having gone through it a few times myself.
John
-
-
will do
-
Alan - I haven't seen any postings by anyone who has tried this, but several people have suggested this tactic in various forums. If you try it, please share whether it works. Given how rarely link-removal requests succeed, a more direct route like this would be much more effective and preferable, if Google is willing to honor it.
-
I am doing so as we speak, but I am an impatient SEO; I want to know if I am wasting my time. A reconsideration takes weeks,
but I will let everyone know the results.
-
OK, I can see how something like that might work... it all depends on how Google's crawlers look at links when it comes to penalties like this. I haven't heard of anyone who has tried this, so you might just have to test it to see if it works. It would make a very interesting case study.
-
It is not just traffic that would be blocked; link juice would also be blocked. Search engines would get a "64 - Host not available" error.
In the article, Matt states that removing the page will work; this is similar.
I am not saying it will work, just that this would not only stop traffic, it would stop any request.
I think that when Google finds a link, it tests to see that the link target exists, so in that way I think it would work. But a thought just hit me: what if they say "OK, this page does not exist" and remove it from the index?
But then the next time they found the link, the page would be added again.
It would be a better approach than contacting every website; as Ryan Kent stated, he had only a 14% success rate.
-
I don't think it will work; Google isn't looking at whether traffic from the spam sites is blocked or not. The mere fact that the spammy links exist is enough for Google to penalize you for link manipulation.
Matt Cutts did say yesterday that they are considering adding a "disavow link" feature in Webmaster Tools, but you won't see it for another few months, if at all:
http://searchengineland.com/live-blog-you-a-with-matt-cutts-at-smx-advanced-123513
The best thing to do in the meantime is to just attempt to remove the links and file a request.
-
Thanks,
I plan to put it in my recon request, but waiting weeks for an answer is a bit of a problem.
I would have thought the idea would have more mentions on the net, but I can't find anyone with experience.
-
If nothing else you can use it as proof in a reconsideration request.
I don't have clients and have never been involved with black-hat tricks, so I have personally never been in the position of having to deal with this, but if nothing else I am curious.
-
That's what I have done, but I don't want to sit and wait; I would like to know if it works.
I can't seem to find much on the internet about it.
My first thought was that it would work, but maybe Google worries you will just remove the rules after the penalty is lifted.
-
HTTP_REFERER matching may be a possibility, but I don't know whether Google will see that you're dropping them or not:

# In .htaccess: return 403 Forbidden to any request whose Referer
# header matches the spammy domain (case-insensitive)
RewriteEngine on
RewriteCond %{HTTP_REFERER} stupidsite\.com [NC]
RewriteRule .* - [F]

I would be curious to see a case study on this.