Blocking spammy links
-
I have a client that Google has said, in response to a reconsideration request, has spammy links.
My question is: can I simply block those requests at the server, making the server drop the response, rather than go through the impossible process of getting the links removed?
Does anyone have any experience with this?
Thanks
-
Hey Alan -
Great question here. I honestly do not know if that htaccess trick will work. I've heard it postulated that it will, but have not seen anything conclusive on it! Unfortunately reinclusion requests take some time. If you think that you have been targeted unfairly, I'd recommend making as much noise publicly as possible. Get in front of Google and plead your case.
As far as I see it, here are your options for the spammy links -
-
If it's a KEYWORD-SPECIFIC penalty to that page, 404 the page to disavow the links and build a new page on your domain targeting that term. Build good links to this.
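If you go the 404 route, the page can be retired directly in .htaccess. A minimal sketch, assuming Apache with mod_rewrite enabled and a hypothetical page name; a 410 Gone makes the removal look deliberate to crawlers, though a plain 404 also matches the advice above:

```apache
RewriteEngine on
# Hypothetical penalized URL - [G] returns 410 Gone for it
RewriteRule ^penalized-page\.html$ - [G]
```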
-
If it's a sitewide drop in traffic, then you need to remove as many of the links as possible. There are tools out there to help with this, but generally you need to find low-PA/DA, exact anchors. These are going to be the ones that are hurting you most. Also look for sitewides with exact anchors that can be seen as manipulative.
-
Google wants to see "good faith" in getting links removed. So document all sites that you contact, the sites that get back to you, the links you get removed, and the links you know about that you have not been able to get removed, including reasons for why (ie 4 emails sent, no reply, cannot find an email on WhoIs, etc). You have to be completely transparent.
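That documentation step can be partly automated: a small script can re-check each spammy page and record whether it still links to your domain. A minimal sketch, where the `still_links_to` helper and the sample HTML are hypothetical and, in practice, you would first fetch each backlink URL over HTTP:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkFinder(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def still_links_to(html, your_domain):
    """Return True if the page HTML still contains a link to your_domain."""
    parser = LinkFinder()
    parser.feed(html)
    return any(urlparse(h).netloc.endswith(your_domain) for h in parser.hrefs)

# Example: a spammy page that still links to the penalized site
page = '<p><a href="http://example-client.com/page">cheap widgets</a></p>'
print(still_links_to(page, "example-client.com"))  # True
```

Running this against your backlink list before each follow-up email gives you the "removed vs. still live" record Google wants to see.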
One tip for you is to share all of these links in a Google Doc, which you link to from the reinclusion request. Then, shorten the link within the reinclusion using a bit.ly link so that you can see if/when they look at the links.
Good luck man. I know how hard it is to wait on hearing back, having gone through it a few times myself.
John
-
-
will do
-
Alan - I haven't seen any postings from anyone who has tried this, but several people have suggested the tactic in various forums. If you try it, please share whether it works. Given how rarely link-removal requests succeed, a more direct route like this would be much more effective, and preferable, if Google chooses to honor it.
-
I am doing so as we speak, but I am an impatient SEO; I want to know if I am wasting my time, since a reconsideration takes weeks.
But I will let everyone know the results.
-
Ok, I can see how something like that might work... it all depends how Google's crawlers look at links when it comes to penalties like this. I haven't heard of anyone who has tried this, so you might just have to test it to see if it works. It would make a very interesting case study.
-
It is not just traffic that would be blocked; link juice would also be blocked. Search engines would get error 64, host not available.
In the article, Matt states that removing the page will work; this is similar.
I am not saying it will work, just that this would not only stop traffic, it would stop any request.
I think that when Google finds a link, it will test that the link exists, so in that way I think it would work. But a thought just hit me: what if they say "OK, this page does not exist" and remove it from the index?
But then the next time they found it, it would be added again.
It would be the best approach, rather than contacting every website; as Ryan Kent stated, he had a 14% success rate doing that.
-
I don't think it will work, Google isn't looking at whether traffic from the spam sites is blocked or not. Just the fact that the spammy links exist is enough for Google to penalize you for link manipulation.
Matt Cutts did say yesterday that they are considering adding a "disavow link" feature in Webmaster Tools, but you won't see it for another few months, if at all:
http://searchengineland.com/live-blog-you-a-with-matt-cutts-at-smx-advanced-123513
The best thing to do in the meantime is to just attempt to remove the links and file a request.
-
Thanks,
I plan to put it in my reconsideration request, but waiting weeks for an answer is a bit of a problem.
I would have thought the idea would have more mentions on the net, but I can't find anyone with experience.
-
If nothing else you can use it as proof in a reconsideration request.
I don't have clients and have never been involved with black-hat tricks, so I have personally never been in the position of having to deal with this, but if nothing else I am curious.
-
That's what I have done, but I don't want to sit and wait; I would like to know if it works.
I can't seem to find much on the internet about it.
My first thought was that it would work, but maybe Google worries you will just remove the rules after the penalty is lifted.
-
Matching on HTTP_REFERER may be a possibility, but I don't know if Google will see that you're dropping them or not:
RewriteEngine on
RewriteCond %{HTTP_REFERER} stupidsite\.com [NC]
RewriteRule .* - [F]
I would be curious to see a case study on this.