Blocking spammy links
-
I have a client that Google has said, in response to a reconsideration request, has spammy links.
My question is: can I simply block those requests at the server, and make the server drop the response, rather than go through the impossible process of getting the links removed?
Does anyone have any experience with this?
Thanks
-
Hey Alan -
Great question here. I honestly do not know if that htaccess trick will work. I've heard it postulated that it will, but have not seen anything conclusive on it. Unfortunately, reinclusion requests take some time. If you think that you have been targeted unfairly, I'd recommend making as much noise publicly as possible. Get in front of Google and plead your case.
As far as I see it, here are your options for the spammy links -
-
If it's a KEYWORD-SPECIFIC penalty to that page, 404 the page to disavow the links and build a new page on your domain targeting that term. Build good links to this.
-
If it's a sitewide drop in traffic, then you need to remove as many of the links as possible. There are tools out there to help with this, but generally you need to find low-PA/DA links with exact-match anchors. These are going to be the ones that are hurting you most. Also look for sitewide links with exact-match anchors that could be seen as manipulative.
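As a rough illustration of that triage (not John's exact process), assuming you've exported your backlink profile to a CSV, a few lines of Python could flag the exact-match, low-authority links first. The column names, anchor terms, and threshold below are all made up for the example:

```python
import csv

# Hypothetical CSV export with columns: url, anchor_text, domain_authority
TARGET_ANCHORS = {"cheap blue widgets", "buy blue widgets"}  # your money terms
DA_THRESHOLD = 20  # low-authority cutoff; tune to your own profile

def risky_links(path):
    """Return URLs of links with exact-match anchors from low-DA pages."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            anchor = row["anchor_text"].strip().lower()
            if anchor in TARGET_ANCHORS and int(row["domain_authority"]) < DA_THRESHOLD:
                flagged.append(row["url"])
    return flagged
```

That gives you a first-pass removal list to start outreach with, rather than eyeballing thousands of rows.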
-
Google wants to see "good faith" in getting links removed. So document all sites that you contact, the sites that get back to you, the links you get removed, and the links you know about that you have not been able to get removed, including reasons why (i.e., 4 emails sent with no reply, cannot find an email address on WhoIs, etc.). You have to be completely transparent.
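One way to keep that documentation honest is to log every contact attempt as you go. Here's a minimal sketch of a CSV log you could later share with Google; the field names are invented for illustration:

```python
import csv
from datetime import date

FIELDS = ["linking_url", "contact_date", "emails_sent", "outcome", "notes"]

def log_outreach(path, linking_url, emails_sent, outcome, notes=""):
    """Append one outreach attempt to a CSV log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:          # brand-new file: write the header row first
            writer.writeheader()
        writer.writerow({
            "linking_url": linking_url,
            "contact_date": date.today().isoformat(),
            "emails_sent": emails_sent,
            "outcome": outcome,    # e.g. "removed", "no reply", "no contact info on WhoIs"
            "notes": notes,
        })
```

Even something this crude gives you a dated, per-site paper trail to paste into the shared doc.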
One tip for you is to share all of these links in a Google Doc, which you link to from the reinclusion request. Then, shorten the link within the reinclusion request using a bit.ly link so that you can see if/when they look at the links.
Good luck man. I know how hard it is to wait on hearing back, having gone through it a few times myself.
John
-
-
will do
-
Alan - I haven't seen any postings by anyone who has tried this, but several people have suggested this tactic in various forums. If you try it, please share whether it works. Given how rarely link-removal requests succeed, a more direct route like this would be much more effective, and you'd think Google would want to favor it.
-
I am doing so as we speak, but I am an impatient SEO; I want to know if I am wasting my time, as a reconsideration takes weeks.
But I will let everyone know the results.
-
Ok, I can see how something like that might work... it all depends how Google's crawlers look at links when it comes to penalties like this. I haven't heard of anyone who has tried this, so you might just have to test it to see if it works. It would make a very interesting case study.
-
It is not just traffic that would be blocked; link juice would also be blocked. Search engines would get a "64 - Host not available" error.
In the article, Matt states that removing the page will work; this is similar.
I am not saying it will work, just that this would not only stop traffic, it would stop any request.
I think that when Google finds a link, it will test to see that the link exists, so in that way I think it would work. But a thought just hit me: what if they say "OK, this page does not exist" and remove it from the index?
But then the next time they found it, it would be added again.
It would be the best approach, rather than contacting every website, and as Ryan Kent stated, he had a 14% success rate.
-
I don't think it will work, Google isn't looking at whether traffic from the spam sites is blocked or not. Just the fact that the spammy links exist is enough for Google to penalize you for link manipulation.
Matt Cutts did say yesterday that they are considering adding a "disavow link" feature in Webmaster Tools, but you won't see it for another few months, if at all:
http://searchengineland.com/live-blog-you-a-with-matt-cutts-at-smx-advanced-123513
The best thing to do in the meantime is to just attempt to remove the links and file a request.
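For what it's worth, if Google does ship a disavow feature, one could imagine it taking something as simple as a plain-text list. This is pure speculation on my part; no format has been published:

```text
# hypothetical disavow list - one entry per line
# disavow everything from an entire domain:
domain:spamsite-one.example
# or disavow a single linking page:
http://spamsite-two.example/links.html
```

Either way, the removal documentation you gather now would map straight onto a list like that.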
-
Thanks,
I plan to put it in my recon request, but waiting weeks for an answer is a bit of a problem.
I would have thought the idea would have more mentions on the net, but I can't find anyone with experience.
-
If nothing else you can use it as proof in a reconsideration request.
I don't have clients, and I have never been involved with black-hat tricks, so I have personally never been in the position of having to deal with this, but I am, if nothing else, curious.
-
That's what I have done, but I don't want to sit and wait; I would like to know if it works.
I can't seem to find much on the internet about it.
My first thought was that it would work, but maybe Google worries you will just remove the rules after the penalty is lifted.
-
HTTP_REFERER matching may be a possibility, but I don't know if Google will see that you're dropping them or not:
RewriteEngine on
# Block any request whose referer contains stupidsite.com (case-insensitive)
RewriteCond %{HTTP_REFERER} stupidsite\.com [NC]
# Return 403 Forbidden instead of serving the page
RewriteRule .* - [F]
I would be curious to see a case study on this.
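If anyone does test this, extending the same idea to several referring domains is straightforward with mod_rewrite's [OR] flag; the domain names below are placeholders, not real sites:

```apache
RewriteEngine on
# Match any of the spammy referring domains (case-insensitive)
RewriteCond %{HTTP_REFERER} spamsite-one\.example [NC,OR]
RewriteCond %{HTTP_REFERER} spamsite-two\.example [NC]
# Return 403 Forbidden for every matching request
RewriteRule .* - [F]
```

Note that without the [OR] flag, consecutive RewriteCond lines are ANDed together, so the rule would never fire.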