Removing Poison Links w/o Disavow
-
Okay, so I've been working to clean up former black-hat SEO tactics on this domain for many months. Despite that, our main keyword keeps falling down the rankings no matter how many relevant, quality links I bring to the domain. So I'm ready to take action today.
There is one inner page whose title exactly matches the keyword we're targeting. Let's call it "inner-page.html".
This page has nothing but poison links with exact-match anchor phrases pointing at it. The good links I've built all point at the domain itself.
So what I want to do is change the URL of this page and let all of the current poison links 404. I don't trust the disavow tool and feel this will be a better option. I'm going to change the page's URL to "inner_page.html", in other words, simply swap the hyphen for an underscore.
How effective do you think this will be at 404ing the bad links, and does anybody out there have experience with this method? And of course, as always, I'll keep you all posted on what happens. It should be an interesting experiment at least.
One thing I'm worried about is traffic sources. We seem to have a ton of direct traffic coming to that page, and I don't really understand where it's coming from or why. Anybody have insight into direct traffic to inner pages? There's no reason for current clients to visit, and prospects shouldn't be returning so often. I don't know what the deal is, but "direct" is our number 2 or 3 traffic source. Am I shooting myself in the foot here?
Here we go!
-
Those are great suggestions Lynn, thank you.
I read the article about direct traffic and it made me feel a little better. My theory at this point is that even if people out there are bookmarking this URL (I'm still not sure why they would, and the percentage of new visitors on the direct source is quite high), they will find their way to the new one.
I do have a custom 404 page that is super helpful and should easily get people to their destination if they happen upon our old URL. It is a broad, site-wide 404, of course, and not one specialized for this page. I didn't realize that was an option and it's an interesting thought; I will consider it, though it makes me nervous. I want to get rid of every trace of this page as quickly as possible.
We are supplementing with a slight bump in PPC in the meantime. Luckily I have room in the budget to do so. And the thing is, we are currently outranked by all of our competitors, so it can't get much worse.
The real kicker here is that all of our competitors are using black-hat tactics. It's extremely frustrating. Their links come from Bangladesh travel forums talking about hair products and linking to completely irrelevant pages with exact-match anchor phrasing, and there are thousands of them. It's been this way for many months, and I keep thinking they'll get penalized, but so far it's us falling in the rankings. Hopefully this makes a difference. We'll see.
One thing I do notice about the other black-hat sites is that they don't have any links pointing at internal pages, only at the subdomain. Our former black-hat SEO pointed links at the internal page in question (and at the subdomain as well), and while I've removed as many as possible, it's still affecting us. The thing is, for other keywords I target that are just as competitive, I'm kicking butt: top 3 spots for several of them, and they don't have any links pointing to the specific page targeting the keyword. So I hope that pattern carries over to this primary keyword as well.
I'm babbling now. That's what I get for thinking about work on the weekend!
Thanks again, and I'll keep the Moz community posted.
-
Hi Jesse,
If you change the URL even a bit and let the old one 404, you will accomplish what you want in terms of cutting off the bad incoming links, so if all of them are poison as you say, this is probably a logical option. You could also consider a 410 (Gone) response, which may remove the page from the index faster and is considered more permanent.
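As a sketch of the 410 option, assuming the site runs on Apache with mod_alias enabled (an assumption, since the actual stack wasn't mentioned), a single line in `.htaccess` will do it:

```apache
# Hypothetical .htaccess rule, assuming Apache with mod_alias.
# "Redirect gone" returns a 410 Gone status for the old hyphenated URL,
# signaling that the page was removed deliberately rather than lost.
Redirect gone /inner-page.html
```

The new underscore URL needs no rule at all; it simply serves as a normal page, and nothing ties the two URLs together in Google's eyes.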
In terms of the incoming traffic, I would keep two things in mind.
1. First, it would be nice to identify where that direct traffic is coming from, as much as possible. Check out this article for a couple of ideas on what traffic might be hiding behind those numbers, and apply them to your situation as relevant. If you have a couple of days or weeks to be patient, you can manually tag some of the likely sources to see what data that gives you.
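To illustrate the manual tagging idea: any link you control that might be feeding that "direct" bucket (an email signature or a newsletter, both hypothetical sources here) can carry Google Analytics campaign parameters so visits from it are attributed instead of lumped into direct. The domain and parameter values below are placeholders:

```text
https://www.example.com/inner-page.html?utm_source=newsletter&utm_medium=email&utm_campaign=direct-audit
```

After a few days or weeks, whatever still reports as direct is more likely to be genuine bookmarks, typed-in URLs, or untrackable sources.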
2. If possible, consider making a custom 404/410 page for this instance that gives real users a link to the new page. I'm not 100% sure on the technicalities of how Google will assess a link to the new page from an old page that is returning a 4xx status. You could meta-tag the 404 page noindex, follow, or even noindex, nofollow, to further enforce the disconnect between the old and the new while still keeping the link available for human visitors.
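A minimal sketch of such a page; the filename, new URL, and wording are all placeholders:

```html
<!-- Hypothetical custom 404/410 page. The robots meta tag keeps the dead
     URL out of the index while the link still helps human visitors on. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="robots" content="noindex, follow">
  <title>Page not found</title>
</head>
<body>
  <h1>That page is gone</h1>
  <p>Looking for our main product page? You can find it at
    <a href="/inner_page.html">its new address</a>.</p>
</body>
</html>
```

Whether the served status is 404 or 410, the body of the error page can be any HTML you like, so this costs nothing to add.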
Hope that helps.