I am cleaning up a client's link profile and am coming across a lot of directories (no surprise).
My question is, if an obvious free-for-all generic directory doesn't look to have been hit by any updates, is it a wise move recommending it for removal on the basis that it is a free-for-all directory and could be hit in the future?
-
I agree with Mark Scully on this one, but would like to add some thoughts:
If you are looking to clean out your backlink profile you should go about it in a very methodical fashion. I would recommend exporting the links to an Excel file and then, in a new sheet, start skimming and categorizing them: needs more research; relevant; potentially harmful; show-stopper. It will be time-consuming, but once you have a basic categorization set you can start reaching out.
There is a real possibility that many of the directory links are from neglected and orphaned directories and that the contact e-mail may not be in operation anymore. When you find this to be the case, note it on your categorized Excel sheet. Note the date you sent the link removal request and note the response; if there is no response, note that as well. Be realistic concerning the expected reply time (this is a big deal to you; it is probably not a big deal to those hosting the directories) and send out second and third requests.
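None of this needs fancy tooling, and a plain spreadsheet works fine, but with thousands of links a small script can set up the tracking columns for you. A rough sketch in Python (the column names and file layout here are my own assumption, not any standard export format):

```python
import csv

# Columns for manual categorization and outreach tracking.
# Category values mirror the buckets above: needs-more-research,
# relevant, potentially-harmful, show-stopper.
TRACKING_FIELDS = ["url", "category", "contact_email", "date_sent",
                   "follow_ups", "response"]

def build_tracking_rows(exported_links):
    """Turn a plain list of exported link URLs into tracking rows
    with empty fields, ready to be filled in by hand."""
    rows = []
    for url in exported_links:
        row = dict.fromkeys(TRACKING_FIELDS, "")
        row["url"] = url
        row["follow_ups"] = "0"  # bump this on each second/third request
        rows.append(row)
    return rows

def write_tracking_sheet(rows, path):
    """Write the tracking rows to a CSV you can open in Excel."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=TRACKING_FIELDS)
        writer.writeheader()
        writer.writerows(rows)
```

The point is just to have one place where the category, the date of each removal request, and the (non-)response all live, so the actions-taken report at the end writes itself.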
If it were me, I would concentrate on the two most harmful categories and give them a really thorough going-over. After a few weeks (I know, it's a long-ish project) you should have a nice detailed actions-taken report and should feel comfortable utilizing the disavow links tool if needed.
Note: This tool, from what I understand, is not a click-and-fix and you will need to have a file of the links you would like disavowed to upload to Google for review. Barry Schwartz, over at seroundtable.com, has a nice post concerning this and he supplies an example of what a disavow report might look like:
Watch the video by Matt Cutts explaining the tool and use it with caution and only as a last resort; don't spam them with reports.
One final note: Some of these links may not be harming you as of now. Use your best judgement and ask yourself this question: "If I knew another Penguin update was coming tomorrow, would having this link cause me to worry?" It isn't always a straightforward answer, but if you find yourself stretching and searching for a rationale to view the link as relevant or user-centric, then it probably isn't.
I am sure there is plenty more to say on the topic, and I hope some others chime in with their thoughts. It's time to earn that paycheck.
Keep us posted, and happy digging.
-
Good point, Mark. That seems a much safer approach.
-
Hi Mark,
Just to clarify, the total number of backlinks to their site is 13,500? I would be quite cautious about deleting 90% of them. I'm sure some of them stand out as more toxic than others; it would be worth focusing on those first.
I know a lot of people have mixed opinions about link cleanup (whether it should be done or not) but if you managed to delete even half of the poor quality links to the site, it should be a clear enough message to Google that you're taking the warning seriously.
If a re-inclusion request fails, you could then go deeper.
-
Hi Mark
That's kind of what I am thinking. I am going through 13,500 links at the moment and it is killing me. Seeing directory after directory is very painful.
Up to now I'm looking at killing around 90% of the links for this particular client, as they are made up of these types of directories.
Although some of them still show very high DA and PA, as well as high TBPR, in my heart I can't see how they could possibly add value to a user's experience, as I can't see why anybody would use them to find anything. Everybody knows that these types of directories exist for the sole purpose of obtaining links, so surely it would make sense to kill the link even if it is helping at the minute?
-
Hi Mark,
I've had to do a lot of backlink analysis and removal before so this is my view.
If the directory lists links in an unnatural looking manner (i.e. just a long list with little text about the link), I would remove it. Some directories have managed to avoid any algorithm updates for now but I'm sure they will eventually get hit.
The volume of link removal you do will really depend on how large your backlink profile is. I had to work through about 20,000 links which needed to be removed, as they were from low-quality article sites and directories. We received the unnatural link warning in GWMT and filed a re-inclusion request. This got turned down, and so we had to dig even deeper into the links pointing to our site.
Just be conscious of how many 'good' links you do have. If you go straight into removing a lot of directory links and leave yourself with very few 'good' links to your site, it could be an issue for you. It's really your call.
Personally, I'd remove them if the directory looks poor, has no social media presence and looks spammy.
-
Personally, if a site has been hit with a warning, then I would go through and remove everything that isn't a decent link back, and I would be targeting directories as well. But this wouldn't be a complete removal; I would need to look at each first. That said, if I see www.greatbigdirectory4u.com, then that sort gets immediate removal.
I'm not saying that every directory is a waste, because some can offer value - have a look at www.seomoz.org/directories as an example of decent ones.
Andy
-
Site has been hit with a link warning.
Removing manually first off anyway. Disavow last resort from our end.
Nothing in the pipeline, but I have noticed a lot of directories have been hit recently, so I am guessing it will happen at some stage.
I am also expecting a few different views on this, but it would be nice to hear them. What's your stance, Andy? Would you kill or leave?
-
You are likely to get different feelings on this Mark.
However, are you thinking about using the disavow tool? If so, only do so if the client has been hit with a link warning. If not, and you just want to get rid of directories, then I would try and remove listings through direct contact.
As for FFA directories getting hit in the future, I haven't seen Google state this could happen (unless I have missed something).
Andy