Disavow Experts: Here's one for ya ....
-
Not sure how to handle this one, simply because there are SO MANY .... I want to be careful not to do something stupid ...
Just a quick 3-minute video explanation: https://youtu.be/bVHUWTGH21E
I'm interested in several opinions, so even if someone has already replied, please still chime in.
Thanks.
-
No problem at all, happy to help. Unfortunately, the best tools we have for evaluating these are ones like Open Site Explorer, which try to emulate how Google looks at links, but they're imperfect for the very same reason that I can't give you a definitive answer: Google doesn't want us to know!
In the end, the only way to know the outcome is to implement the change and see whether the rankings get better or worse - welcome to the struggles of SEO!
If you really can't afford to take a hit right now but it would be more acceptable in a month or two (e.g. right now is your busiest period), I'd be inclined to wait. Otherwise, it's a tough call, but I'd still lean toward having them removed. Don't forget that Google has been promising a Penguin (backlinks) update "very soon" all year! If that damn update finally rolls out tomorrow, you may find yourself getting slammed by it... or it could roll out next year... or maybe it'll roll out and you'll be fine. Sigh.
We had success doing it steadily with one of our larger clients who was in a similar situation, and the results were as good as we could have hoped for, but YMMV. We essentially did the removal in stages: we divided the bad domains into batches, contacted the sites in the first batch to request removal, then disavowed whatever remained.
While all this was happening, we also got to work building quality links to the site, so the two roughly cancelled each other out. Then we did the same thing with the remaining batches of bad links until we'd been through the lot.
For us, the end result was a series of fairly marginal peaks and troughs that directly correlated with link removal and link acquisition so the net position at any given time was approximately the same. I must stress though that YMMV here - since I have a total data sample of 2 domains (this client has 2 companies/sites), it's impossible for me to say with absolute certainty that what I saw is the direct result of our process.
-
Thank you so much. So that leaves the most important question: how do I know if these are benefiting me? I really can't afford to lose rankings right now, as we are in this situation due to already-ruined rankings for an unknown reason. There are about 300 of them in total, out of roughly 2,000 unique domains linking to us. So it's a decent chunk.
Ironically, their domain authority is "44" and mine is "45" .... the site has been online 16 years (with nary a design update, apparently) ... their Moz domain authority is 37, whereas mine is 38. So I'm not sure whether Google views these guys terribly or not...
There must be some way to ascertain what Google thinks of this site and its links... ?
-
The horrible thing about link removal is that it's often hard to give an accurate answer to this question. On one hand, directories, link farms, etc. are often ignored by search engines, so having them may be doing you no harm. On the other hand, it's impossible to know whether the specific domains you're looking at are actually being ignored or not.
In these scenarios I tend to lean towards having them removed anyway, just in case they are being counted. As you pointed out, there is a chance that removing them will strip some strength from your site and see you drop in rankings, but since it's impossible to tell the outcome until it's too late, I'd rather take the risk that comes with removing bad links than the risk of keeping them.
There are a few things you can do to make your life marginally easier here:
- Contact the site and ask them nicely to remove the links. They do have a phone number on their contact page; you'd be surprised how powerful a phone conversation can be versus yet another generic email.
- Export the list of referring domains (rather than individual links) and bulk-categorise them in Excel as much as possible. Filter for words like fasthealth, seo, link, directory/directories, etc. and highlight them all for removal.
- Disavow by domain rather than by individual links. All you have to change in the disavow file is to add domain: to the beginning of each entry. For example: domain:website.com.
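To make that concrete, a disavow file is just a plain-text (UTF-8) file with one entry per line and # for comments. A minimal sketch, with placeholder domains rather than anything from this thread, might look like:

```text
# First batch of spammy directories - removal requested, no response
domain:spammy-directory.example.com
domain:link-farm.example.net
# A one-off bad URL can still be listed in full
http://another-site.example.org/paid-links.html
```

You then upload the file through Google's disavow links tool; the domain: entries cover every link from that domain in one line.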
If you do decide to give them a call, or even email them, the best angle I've found is: "I'm cleaning up my links in accordance with Google's guidelines and have to be very picky about the ones I keep; this is no reflection on the quality of your site, but I'd really appreciate it if you could remove them." That's far more likely to get results than the attitude some people take of "hey scumbag, your horrible site is ruining my rankings, get rid of these spam links".
Also, the reason I say to export, evaluate, and disavow at the domain level is simply a matter of volume. Rather than 20,000 spam links, you may end up having to sift through only 200 referring domains; far easier to manage. In my experience it's pretty rare that you'd want to disavow just one link from a site like these, so doing it at the domain level disavows them all and protects you if they decide to change their URL structure in the future. A new URL structure would give you a link from a "new page" in the eyes of the search engine.
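If Excel gets unwieldy, the keyword-filtering and domain-level disavow steps above can be sketched in a few lines of script. This is only an illustration: the keyword list, domain names, and function names are my own assumptions, not anything from the thread, and real triage still needs a human eye before anything goes into the disavow file.

```python
# Hypothetical sketch: flag likely-spam referring domains by keyword,
# then format the flagged ones as domain-level disavow entries.
# The hints and example domains below are placeholders, not real data.

SPAM_HINTS = ("fasthealth", "seo", "link", "directory", "directories")

def flag_spammy(domains):
    """Return the subset of domains whose name contains a spam hint."""
    return [d for d in domains if any(h in d.lower() for h in SPAM_HINTS)]

def build_disavow(domains):
    """Format flagged domains as lines for a Google disavow file."""
    lines = ["# Domain-level disavow built from referring-domain export"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines)

referring = [
    "goodblog.example.com",
    "cheap-seo-links.example.net",
    "fasthealth-pages.example.org",
]
print(build_disavow(flag_spammy(referring)))
```

The keyword match is deliberately crude (it would flag any domain containing "seo", good or bad), so treat its output as a shortlist to review, not a finished disavow file.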