Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
Directory and Classified Submissions
-
Are directory submissions and classified submissions still a good way to create backlinks?
Or are they obsolete methods that should be discontinued?
-
Thanks for the awesome comments, Cyrus.
So what you suggest is going slow and developing solid, long-term, genuine links by commenting on blog posts in the same niche, making good-quality PR submissions and good-quality article submissions, and posting on relevant forums?
Would you like to add something to the above list?
-
Yes, Google is smart enough.
Two years ago Google stripped toolbar PageRank from most of the profiles on SEOmoz. The links here still pass some value, but build too many of these links and you're much more likely to incur a penalty today.
This entire school of link building discussed on this page is dangerous - and filled with snake oil salesmen who don't care at all whether they harm your rankings. I've never heard of Linklecious, but it looks like they've been de-indexed by Google, so it's safe to say they were penalized.
Instead of risking burning your site to the ground with low quality links, invest your time in long-term payouts by producing good quality content and earning the links others can't earn.
-
Hi KS,
The short answer: directory submissions have been abused over the years, and we've seen a marked decrease in their effectiveness, even as recently as the past 12 months.
The old rule for directory links was to build them in a 2:1 ratio - for every two directory links, build at least one regular, high-quality link. Today the ratio is more like 1:1, or even reversed to 1:2.
If a directory is easy to get into, it's probably not worth your time. Too many of these links can lead to a drop in rankings. Done judiciously, they can give a small boost to your rankings, help round out your link profile, and help target specific anchor text phrases (again, when done in moderation).
Here's an article we published a few months back you might find helpful: http://www.seomoz.org/blog/seo-link-directory-best-practices
As for classified submissions, I'd be wary as I've never seen any evidence that they help SEO, and like low value directory links, too many "easy" links can harm your rankings.
Hope this helps. Best of luck with your SEO!
-
Thanks Herald.
Please reply to my query regarding Profile Links and Web2.0 Links further down this page.
-
Yes, you should definitely continue with directory & classified submissions. They will help you a lot. If you have any query, feel free to contact me.
Thanks
-
What about Profile Links and Web2.0 Links? They say it's good to create 100+ profile links every month with appropriate keywords (while SEOing a website). Some say profile links are better than forum links or blog comments.
If I use services to create them, most of the links turn out to be on not-so-good websites. But they say at the end of the day it's about backlinks.
My question is: isn't Google smart enough to detect such practices?
In other words: do profile links really help?
-
Hmmm, useful tips. Thanks.
How about submitting those pages (URLs) to Linklecious or Pingomatic so they are crawled by Google?
Would that help?
-
If the links are dofollow, then you could also check:
-
whether the directory actually has some ranking already and has been around for a while
-
how many links they usually put on one page before paging to the next one
-
check whether the listing pages for recently added entries (usually the last ones) are already in Google's index - I normally do this by checking their PageRank: if it's at least white (I use the Quirk SearchStatus plugin for Firefox), Google is already aware of them, which indicates it crawls this directory fairly frequently for new content/pages
-
perform a search on Google for a specific keyword - the category you want to submit your link to - and see what the directory's position is for it
Obviously you would only do all this if you had a lot of time to go through each directory separately, but it might be worthwhile if you're planning to get links in a different way than the generic links from visitors.
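The second check above - how many links a directory puts on one listing page - can be roughly automated. Here's a minimal sketch using only Python's standard library; the regex count is a crude heuristic rather than a proper HTML parse, and the sample markup is invented for illustration:

```python
import re

def rough_link_count(page_html: str) -> int:
    """Crude count of anchor tags on a directory listing page.
    The more links crammed onto one page, the smaller the share
    of that page's value each listed site receives."""
    return len(re.findall(r"<a\s", page_html, flags=re.IGNORECASE))

# Invented listing-page snippet:
listing = """
<ol>
  <li><a href="http://example.com/1">One</a></li>
  <li><a href="http://example.com/2">Two</a></li>
  <li><a href="http://example.com/3">Three</a></li>
</ol>
"""
print(rough_link_count(listing))  # prints 3
```

In practice you'd feed in the fetched HTML of a few listing pages and compare counts across the directories you're considering.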
-
-
Yup, we always make sure they are dofollow classified or directory sites.
-
Most of these submissions will give you a link with the rel attribute set to nofollow - and these don't really give you any SEO benefit. So it is quite important to first check the other listings and see whether that attribute and value are assigned to the anchors. If so, then the only benefit you will get is the visit when someone actually clicks the link in the listing and lands on your site.
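A quick way to check this across a whole listing page is to parse it and inspect each anchor's rel attribute. A minimal sketch using only Python's standard library (the sample markup is made up for illustration):

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects (href, is_nofollow) for every anchor on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        rel = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" in rel))

# Hypothetical directory-listing markup:
sample = """
<ul>
  <li><a href="http://example.com/site-a" rel="nofollow">Site A</a></li>
  <li><a href="http://example.com/site-b">Site B</a></li>
</ul>
"""
checker = NofollowChecker()
checker.feed(sample)
for href, nofollow in checker.links:
    print(href, "-> nofollow" if nofollow else "-> followed")
```

Run it against a saved copy of a directory's listing page: if most existing listings come back nofollow, your submission almost certainly will too.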
-
Thanks for the help. So basically we should continue with them right?
-
Hi KS_,
As you know, directory submission is widely used by businesses to promote their websites or products.
Directory & classified submissions are mainly used to get listed in the major, useful directories.
Both types of submission are targeted by area or audience. Classified submissions are similar to yellow pages. The most important thing to remember when submitting a listing is that it should be in the same niche as the site being listed. This leads to indexed pages, higher rankings in search engines, and greater visibility via search.
Both types of submission help increase product sales and website presence, especially in search engines, and help you gain non-reciprocal backlinks from high-PageRank websites/directories.
I hope your query has been resolved.