I am cleaning up a client's link profile and am coming across a lot of directories (no surprise). My question is: if an obvious free-for-all generic directory doesn't look to have been hit by any updates, is it a wise move recommending it for removal?
-
I am cleaning up a client's link profile and am coming across a lot of directories (no surprise).
My question is: if an obvious free-for-all generic directory doesn't look to have been hit by any updates, is it a wise move recommending it for removal on the basis that it is a free-for-all directory and could be hit in the future?
-
I agree with Mark Scully on this one, but would like to add some thoughts:
If you are looking to clean out your backlink profile, you should go about it in a very methodical fashion. I would recommend exporting the links to an Excel file and then, in a new sheet, start skimming and categorizing them: needs more research; relevant; potentially harmful; show-stopper. It will be time-consuming, but once you have a basic categorization set you can start reaching out.
There is a real possibility that many of the directory links are from neglected and orphaned directories and that the contact e-mail may not be in operation anymore. When you find this to be the case, note it on your categorized Excel sheet. Note the date you sent the link removal request and note the response; if there is no response, note that as well. Be realistic concerning the expected reply time (this is a big deal to you; it is probably not a big deal to those hosting the directories) and send out second and third requests.
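If the spreadsheet gets unwieldy, a small script can do the rough first pass for you. Here is a minimal sketch using pandas (assuming a CSV export with a "URL" column; the column name, keyword hints, and filenames are placeholders I've made up), with the manual review still doing the real work:

```python
# Rough first-pass triage of an exported backlink list.
# Assumes a CSV with a "URL" column, e.g. an export from your backlink tool.
import pandas as pd

links = pd.read_csv("backlinks_export.csv")

# Crude hints that a link may be a free-for-all directory or article farm.
SPAM_HINTS = ("directory", "articles", "links", "free-for-all")

def first_pass(url: str) -> str:
    """Very rough bucket; every link still gets a manual review."""
    url = str(url).lower()
    if any(hint in url for hint in SPAM_HINTS):
        return "potentially harmful"
    return "needs more research"

links["category"] = links["URL"].apply(first_pass)

# Columns for tracking outreach, filled in as you work through the list.
links["removal_request_sent"] = ""   # date of the first removal request
links["follow_ups"] = 0              # second and third requests sent
links["response"] = ""               # reply, bounced e-mail, or no response

links.to_csv("backlinks_triage.csv", index=False)
print(links["category"].value_counts())
```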
If it were me, I would concentrate on the two most harmful categories and give them a really thorough going-over. After a few weeks (I know, it's a long-ish project) you should have a nice detailed actions-taken report and should feel comfortable utilizing the disavow links tool if needed.
Note: This tool, from what I understand, is not a click-and-fix and you will need to have a file of the links you would like disavowed to upload to Google for review. Barry Schwartz, over at seroundtable.com, has a nice post concerning this and he supplies an example of what a disavow report might look like:
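For reference, the disavow file itself is just plain text: one URL or domain per line, lines starting with # are treated as comments, and the domain: prefix disavows an entire domain. A minimal made-up example (the domains below are placeholders, not real sites):

```
# Removal requested 2012-11-01 and 2012-11-15, no response
domain:spammy-directory-example.com

# Individual URLs we could not get taken down
http://another-directory-example.net/links/page1.html
http://free-for-all-example.org/listing?id=12345
```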
Watch the video by Matt Cutts explaining the tool and use it with caution and only as a last resort; don't spam them with reports.
One final note: Some of these links may not be harming you as of now. Use your best judgement and ask yourself this question: "If I knew another Penguin update was coming tomorrow, would having this link cause me to worry?" It isn't always a straightforward answer, but if you find yourself stretching and searching for a rationale to view the link as relevant or user-centric, then it probably isn't.
I am sure there is plenty more to say on the topic, and I hope some others chime in with their thoughts. It's time to earn that paycheck.
Keep us posted, and happy digging.
-
Good point, Mark; that seems a much safer approach.
-
Hi Mark,
Just to clarify, the total number of backlinks to their site is 13,500? I would be quite cautious about deleting 90% of them. I'm sure some of them stand out as more toxic than others; it would be worth focusing on those first.
I know a lot of people have mixed opinions about link cleanup (whether it should be done or not), but if you managed to delete even half of the poor-quality links to the site, it should be a clear enough message to Google that you're taking the warning seriously.
If a re-inclusion request fails, you could go deeper then.
-
Hi Mark
That's kind of what I am thinking. I am going through 13,500 links at the moment and it is killing me. Seeing directory after directory is very painful.
Up to now I'm looking at killing around 90% of the links for this particular client, as they are made up of these types of directories.
Although some of them still show very high DA and PA, as well as high TBPR, in my heart I can't see how they could possibly add value to a user's experience, as I can't see why anybody would use them to find anything. Everybody knows that these types of directories exist for the sole purpose of obtaining links, so surely it would make sense to kill the link even if it is helping at the minute?
-
Hi Mark,
I've had to do a lot of backlink analysis and removal before so this is my view.
If the directory lists links in an unnatural-looking manner (i.e. just a long list with little text about the link), I would remove it. Some directories have managed to avoid any algorithm updates for now, but I'm sure they will eventually get hit.
The volume of link removal you do will really depend on how large your backlink profile is. I had to work through about 20,000 links which needed to be removed, as they were from low-quality article sites and directories. We received the unnatural link warning in GWMT and filed a re-inclusion request. This got turned down, and so we had to dig even deeper into the links pointing to our site.
Just be conscious of how many 'good' links you do have. If you go straight into removing a lot of directory links and leave yourself with very few 'good' links to your site, it could be an issue for you. It's really your call.
Personally, I'd remove them if the directory looks poor, has no social media presence and looks spammy.
-
Personally, if a site has been hit with a warning, then I would go through and remove everything that isn't a decent link back, and I would be targeting directories as well. This wouldn't be a complete removal, though; I would need to look at each one first. Saying that, if I see www.greatbigdirectory4u.com, then that sort gets immediate removal.
I'm not saying that every directory is a waste, because some can offer value - have a look at www.seomoz.org/directories as an example of decent ones.
Andy
-
The site has been hit with a link warning.
We're removing manually first anyway; disavow is a last resort from our end.
Nothing in the pipeline, but I have noticed a lot of directories have been hit recently, so I am guessing it will happen at some stage.
I am also expecting a few different views on this, but it would be nice to hear them. What's your stance, Andy: would you kill or leave?
-
You are likely to get different feelings on this, Mark.
However, are you thinking about using the disavow tool? If so, only do so if the client has been hit with a link warning. If not, and you just want to get rid of directories, then I would try and remove listings through direct contact.
As for FFA directories getting a hit in the future, I haven't seen Google state this could happen (unless I have missed something).
Andy