Competitors and Directory Links
-
Hi guys, wanted to get some input and thoughts here. I'm analyzing a lot of competitor links for a specific client (and for other clients as well) and have come across some pretty heavy directory backlink profiles.
Has anyone here had success with directory listings? It seems many of the competitors' backlinks are coming from directories.
What say you?
-
Hi Ryan.
Glad to hear someone talking sense! In researching sites' linking patterns, I have also often come across linking sites that should have no logical relationship to the site in question, yet the links still count. Sometimes these are blogs that SEO companies have set up themselves, with little content, and not terribly interesting content at that, yet this still seems to work in 2011, which is baffling.
I agree that one day this will all be devalued, but you wonder when.
-
@Doctone, you are absolutely correct. The results we see show that directory links are a factor and do work. I also see a lot of followed links from high-DA/PA Asian blogs and other sites pointing to sites and topics they have no logical relationship to.
These types of links have existed for a long time, and they do offer benefits to the sites on the receiving end.
With the above noted, it seems clear that if these links are obvious to us, they are also obvious to Google. It's only a matter of time until they are penalized in another Panda-type update. I would suggest not crossing the line: focus instead on adding value for your users and creating better content they will naturally want to link to, and you will be left standing tall when your competitors get hit by the update that kills these links.
-
Hi there
I recently approached six different SEO companies for proposals to improve our rankings. Some were "cheaper", some were very expensive and well known here in the UK. Guess what? They all still suggested adding directory links as part of the linking strategy; in fact, one of them would have added 100 per month!
Looking at our competition, I sadly have to conclude that they are correct. We all hold Google in high esteem and think it's extremely clever (and we're probably all a bit afraid of Google, as none of us wants to be penalised), but from my own reasonably extensive research using Open Site Explorer, I have to conclude that directory links still really work, provided said directories are in good health, if you know what I mean. You can easily tell by using a PR tool along with the Site Explorer numbers.
-
With respect to blogs, you can use blog comments along with forum posts for link value. However, check the result the first time you use this method: many blogs are set up to automatically apply the nofollow attribute to comment links.
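As a rough illustration (my addition, not part of the original reply), here is a minimal Python sketch of one way to spot-check whether a page's links carry rel="nofollow". It assumes the requests and beautifulsoup4 packages are installed, and the URL is just a placeholder for the blog post you commented on:

```python
import requests
from bs4 import BeautifulSoup

def audit_links(page_url):
    """Print each link on the page and whether it carries rel="nofollow"."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        rel = a.get("rel") or []  # BeautifulSoup returns rel as a list, e.g. ["nofollow"]
        status = "nofollow" if "nofollow" in rel else "followed"
        print(f"{status:9} {a['href']}")

# Placeholder URL -- point this at the post where you left your comment.
audit_links("https://example.com/blog/some-post/")
```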
For link directories, I would recommend using the free directories. ODP is great but takes a very long time to get into. Many other directories want to exchange links, which isn't the best idea for many sites. The links shouldn't hurt, but I would suggest not paying for a directory listing unless you have a high-value site and the directory is of high value as well, such as Yahoo's.
-
I think you're already on the right track. I use the Keyword Difficulty Tool to get a list of top-ranking sites, then put those sites into Open Site Explorer and analyze their link profiles to find prospective link opportunities.
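To make that workflow concrete, here is a hedged sketch (my addition, not part of the answer above): export a linking-domains CSV for each top-ranking competitor from Open Site Explorer, then surface domains that link to several competitors but not to your own site. The file names and the "Root Domain" column header are assumptions, so adjust them to match your actual exports.

```python
import csv
from collections import Counter

def load_domains(csv_path, column="Root Domain"):
    """Read one linking-domains CSV export into a set of lowercased domains."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return {row[column].strip().lower()
                for row in csv.DictReader(f) if row.get(column)}

# Placeholder file names -- one export per competitor, plus one for your own site.
competitor_exports = ["competitor_a_links.csv", "competitor_b_links.csv", "competitor_c_links.csv"]
my_domains = load_domains("my_site_links.csv")

counts = Counter()
for path in competitor_exports:
    counts.update(load_domains(path))

# Domains linking to two or more competitors but not to my site are the
# most promising prospects to chase.
for domain, hits in counts.most_common():
    if hits >= 2 and domain not in my_domains:
        print(f"{hits} competitors linked from {domain}")
```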
-
Good responses, guys. Thanks.
@Dave: What research do you use to vet the directories? It seems that with most of these directories, submitting to one may get you submitted to many others. I've always stayed away from directories to keep the link portfolio as organic as possible, but I'm coming across an incredible number of backlink portfolios with directory links.
@Becampaz: Yes, I see blog comments do serve well in some of these backlink portfolios.
-
Directory links most certainly still work. If you use this strategy, make it a part of your overall link-building campaign but not the be-all and end-all. Also check the value and relevance of the directories, as they may be subject to scrutiny by Google in the future; good research here will serve you well. Make sure the page your listing will potentially go on is strong and not loaded with other links, as the power gets diluted by the number of links per page (a quick way to eyeball this is sketched at the end of this post).
I like to call it building a diversified link portfolio, using many tactics to guard against any major hiccups when the search engines change their algorithms. Check out the webinars under the Learn SEO tab and the SEOmoz blog under the Community tab for great, detailed info on tactics.
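Not from the original reply, just a rough Python sketch of one way to gauge how crowded a prospective directory page is by counting its outbound links. The URL is a placeholder, and the requests and beautifulsoup4 packages are assumed to be installed:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def count_outbound_links(page_url):
    """Count links on the page that point to a different domain."""
    host = urlparse(page_url).netloc
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    outbound = 0
    for a in soup.find_all("a", href=True):
        target = urlparse(urljoin(page_url, a["href"])).netloc
        if target and target != host:
            outbound += 1
    return outbound

# Placeholder URL -- the category page your listing would sit on.
print(count_outbound_links("https://example-directory.com/category/widgets/"))
```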
-
Well, lately all the competitor sites I have been looking at seem to get the majority of their links from two main sources: directories and blog comments.
They rank well (positions 1 to 5) for moderately (sometimes highly) competitive keywords, so I would say this works and is easy to execute. However, it is probably not the best way to build high authority for your site.