I am cleaning up a client's link profile and am coming across a lot of directories (no surprise).
My question is: if an obvious free-for-all generic directory doesn't look to have been hit by any updates, is it a wise move recommending it for removal on the basis that it is a free-for-all directory and could be hit in the future?
-
I agree with Mark Scully on this one, but would like to add some thoughts:
If you are looking to clean out your backlink profile, you should go about it in a very methodical fashion. I would recommend exporting the links to an Excel file and then, in a new sheet, skimming and categorizing them: needs more research; relevant; potentially harmful; show stopper. It will be time-consuming, but once you have a basic categorization set you can start reaching out.
There is a real possibility that many of the directory links are from neglected and orphaned directories and that the contact e-mail may not be in operation anymore. When you find this to be the case, note it on your categorized Excel sheet. Note the date you sent the link removal request and note the response; if there is no response, note that as well. Be realistic concerning the expected reply time (this is a big deal to you; it is probably not a big deal to those hosting the directories) and send out second and third requests.
If it were me, I would concentrate on the two most harmful categories and give them a thorough going-over. After a few weeks (I know, it's a long-ish project) you should have a nice, detailed actions-taken report and should feel comfortable utilizing the disavow links tool if needed.
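To make that bookkeeping less painful, the categorized sheet can be kept as a CSV and summarized with a short script. A minimal sketch, assuming a hypothetical column layout of url, category, request_sent, response (the column names and categories here are illustrative, not a standard):

```python
import csv
from collections import Counter

def summarize(path):
    """Tally links per category and flag removal requests with no reply.

    Assumes a CSV with columns: url, category, request_sent, response.
    """
    counts = Counter()
    no_reply = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["category"]] += 1
            # A request was sent but nothing ever came back: queue a follow-up.
            if row["request_sent"] and not row["response"]:
                no_reply.append(row["url"])
    return counts, no_reply
```

The `counts` tally tells you how big each bucket is, and `no_reply` becomes your second- and third-request list.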
Note: This tool, from what I understand, is not a click-and-fix; you will need to upload a file of the links you would like disavowed to Google for review. Barry Schwartz, over at seroundtable.com, has a nice post on this, including an example of what a disavow report might look like.
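For reference, the disavow file itself is just a plain text file: one domain or URL per line, with `#` starting a comment line and the `domain:` prefix covering an entire domain. A minimal sketch (the directory names here are hypothetical placeholders, not real sites):

```text
# Sent removal requests 2013-01-15 and 2013-02-01; no response received.
domain:spammy-directory-example.com

# Disavowing a single page rather than the whole domain.
http://another-directory-example.com/links/page1.html
```

This is where the dated notes in your Excel sheet pay off: the comments document your removal attempts before you resorted to the tool.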
Watch the video by Matt Cutts explaining the tool and use it with caution and only as a last resort; don't spam them with reports.
One final note: Some of these links may not be harming you as of now. Use your best judgement and ask yourself this question: "If I knew another Penguin update was coming tomorrow, would having this link cause me to worry?" It isn't always a straightforward answer, but if you find yourself stretching and searching for a rationale to view the link as relevant or user-centric, then it probably isn't.
I am sure there is plenty more to say on the topic, and I hope some others chime in with their thoughts. It's time to earn that paycheck.
Keep us posted, and happy digging.
-
Good point Mark, that seems a much safer approach.
-
Hi Mark,
Just to clarify, the total number of backlinks to their site is 13,500? I would be quite cautious about deleting 90% of them. I'm sure some of them stand out as more toxic than others; it would be worth focusing on those first.
I know a lot of people have mixed opinions about link cleanup (whether it should be done or not) but if you managed to delete even half of the poor quality links to the site, it should be a clear enough message to Google that you're taking the warning seriously.
If a re-inclusion request fails, you could then dig deeper.
-
Hi Mark
That's kind of what I am thinking. I am going through 13,500 links at the moment and it is killing me. Seeing directory after directory is very painful.
Up to now I'm looking at killing around 90% of the links for this particular client, as they are made up of these types of directories.
Although some of them still show very high DA and PA as well as high TBPR, in my heart I can't see how they could possibly add value to a user's experience, as I can't see why anybody would use them to find anything. Everybody knows that these types of directories exist for the sole purpose of obtaining links, so surely it would make sense to kill the link even if it is helping at the minute?
-
Hi Mark,
I've had to do a lot of backlink analysis and removal before so this is my view.
If the directory lists links in an unnatural looking manner (i.e. just a long list with little text about the link), I would remove it. Some directories have managed to avoid any algorithm updates for now but I'm sure they will eventually get hit.
The volume of link removal you do will really depend on how large your backlink profile is. I had to work through about 20,000 links which needed to be removed, as they were from low-quality article sites and directories. We received the unnatural link warning in GWMT and filed a re-inclusion request. This got turned down, and so we had to dig even deeper into the links pointing to our site.
Just be conscious of how many 'good' links you do have. If you go straight into removing a lot of directory links and leave yourself with very few 'good' links to your site, it could be an issue for you. It's really your call.
Personally, I'd remove them if the directory looks poor, has no social media presence and looks spammy.
-
Personally, if a site has been hit with a warning, then I would go through and remove everything that isn't a decent link back, and I would be targeting directories as well - but this wouldn't be a blanket removal; I would need to look at each first. That said, if I see www.greatbigdirectory4u.com, then that sort gets immediate removal.
I'm not saying that every directory is a waste, because some can offer value - have a look at www.seomoz.org/directories as an example of decent ones.
Andy
-
Site has been hit with a link warning.
Removing manually first off anyway. Disavow last resort from our end.
Nothing in the pipeline, but I have noticed a lot of directories have been hit recently, so I am guessing it will happen at some stage.
I am also expecting a few different views on this, but it would be nice to hear them. What's your stance, Andy - would you kill or leave?
-
You are likely to get different feelings on this Mark.
However, are you thinking about using the disavow tool? If so, only do so if the client has been hit with a link warning. If not, and you just want to get rid of directories, then I would try and remove listings through direct contact.
As for FFA directories getting a hit in the future, I haven't seen Google state this could happen (unless I have missed something).
Andy