Directory and Classified Submissions
-
Are directory submissions and classified submissions still a good way to create backlinks?
Or are they obsolete methods that should be discontinued?
-
Thanks for the awesome comments, Cyrus.
So what you suggest is going slow and developing solid, long-term, genuine links by commenting on blog posts in the same niche, making good quality PR submissions, good quality article submissions, and relevant forum posts?
Would you like to add something to the above list?
-
Yes, Google is smart enough.
Two years ago Google stripped toolbar PageRank from most of the profiles on SEOmoz. The links here still pass some value, but build too many of these links and you're much more likely to incur a penalty today.
This entire school of link building discussed on this page is dangerous, and filled with snake oil salesmen who don't care about harming your rankings. I've never heard of Linklecious, but it looks like they've been de-indexed by Google, so it's safe to say they were penalized.
Instead of risking burning your site to the ground with low quality links, invest your time in long-term payouts by producing good quality content and earning the links others can't earn.
-
Hi KS,
The short answer is that directory submissions have been abused over the years, and we've seen a marked decrease in their effectiveness even as recently as the past 12 months.
The old rule for directory links was to build them in a 2:1 ratio. That meant for every two directory links, be sure to build at least 1 regular, high quality link. Today, the ratio is more like 1:1, or even reversed to 1:2.
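To make the ratio concrete, here's a toy sketch (the counts and the helper function are illustrative, not a real tool):

```python
# Toy illustration of the directory-vs-quality link ratio discussed above.
# The counts are hypothetical; this is just the arithmetic of the guideline.

def directory_link_budget(quality_links: int, directories_per_quality: float) -> int:
    """How many directory links a profile can 'afford' at a given ratio.
    2.0 reflects the old 2:1 rule; 0.5 reflects the cautious 1:2 rule."""
    return int(quality_links * directories_per_quality)

print(directory_link_budget(40, 2.0))  # old rule: up to 80 directory links
print(directory_link_budget(40, 0.5))  # today's cautious rule: up to 20
```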
If a directory is easy to get into, it's probably not worth your time. Too many of these links can lead to a drop in rankings. Done judiciously, they can give a small boost to your rankings, help round out your link profile, and help target specific anchor text phrases (again, when done in moderation).
Here's an article we published a few months back you might find helpful: http://www.seomoz.org/blog/seo-link-directory-best-practices
As for classified submissions, I'd be wary, as I've never seen any evidence that they help SEO, and, like low value directory links, too many "easy" links can harm your rankings.
Hope this helps. Best of luck with your SEO!
-
Thanks, Herald.
Please reply to my query regarding Profile Links and Web 2.0 Links further down on this page.
-
Yes, you should definitely continue with directory and classified submissions. They will help you a lot. If you have any queries, feel free to contact me.
Thanks
-
What about Profile Links and Web 2.0 Links? They say it's good to create 100+ profile links every month with appropriate keywords (while doing SEO for a website). Some say profile links are better than forum links or blog comments.
If I use services to create them, most of the links turn out to be on not-so-good websites. But they say at the end of the day it's all about backlinks.
My question is: isn't Google smart enough to detect such practices?
In other words: do profile links really help?
-
Hmmm, useful tips. Thanks.
How about submitting those pages (URLs) to Linklecious or Pingomatic so they get crawled by Google?
Would that help?
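(For reference only, since as noted elsewhere in this thread such services are unlikely to help: ping services are typically hit over XML-RPC. A minimal sketch, assuming the common weblogUpdates.ping interface; the endpoint, site name, and URL below are placeholders.)

```python
# Minimal ping sketch, assuming the classic weblogUpdates.ping XML-RPC
# interface; the endpoint and site details below are assumptions.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
response = server.weblogUpdates.ping("My Site", "http://www.example.com/")
print(response)  # usually a struct like {'flerror': False, 'message': '...'}
```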
-
If the links are followed, you could also check:
-
whether the directory actually has some ranking already and has been around for a while
-
how many links they usually put on one page before paging to the next one (see the sketch after this list)
-
check whether the listing pages of the recently added entries (usually the last ones) are already in Google's index. I normally do this by checking their PageRank: if it's at least white (I use the Quirk SearchStatus plugin for Firefox), Google is already aware of them, which indicates that it crawls this directory fairly frequently for new content/pages
-
try performing a search on Google for a specific keyword (the category you want to submit your link to) and see what the directory's position is for it
Obviously you would only do all this if you had a lot of time to go through each directory separately, but it might be worthwhile if you're planning to get links in a different way than the generic links from visitors.
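For the links-per-page check above, a rough sketch (assuming requests and BeautifulSoup are available; the directory URL is made up):

```python
# Rough sketch of the "links per page" check: count outbound anchors on a
# single directory listing page. requests/BeautifulSoup and the URL are
# assumptions for illustration.
import requests
from bs4 import BeautifulSoup

def outbound_links_on_page(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return sum(1 for a in soup.find_all("a", href=True)
               if a["href"].startswith("http"))

print(outbound_links_on_page("http://www.example-directory.com/category/page-1"))
```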
-
Yup, we always make sure they are dofollow classified or directory sites.
-
Most of these submissions will give you a link with the rel attribute set to nofollow, and those don't really give you any SEO benefit. So it's quite important to first check the other listings and see whether they have this attribute and value assigned to the anchor. If so, the only benefit you'll get is a visit when someone actually clicks the link in the listing and lands on your site.
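A rough sketch of that check (assuming requests and BeautifulSoup; the listing URL is made up):

```python
# Sketch of the nofollow check described above: list which outbound anchors
# on a listing page carry rel="nofollow". Tooling and URL are assumptions.
import requests
from bs4 import BeautifulSoup

def nofollow_report(listing_url: str) -> None:
    soup = BeautifulSoup(requests.get(listing_url, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        rel = a.get("rel") or []   # bs4 returns rel as a list of tokens
        status = "nofollow" if "nofollow" in rel else "followed"
        print(status, a["href"])

nofollow_report("http://www.example-directory.com/recent-listings")
```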
-
Thanks for the help. So basically we should continue with them, right?
-
Hi KS_,
As you know, directory submission is widely used by businesses to promote their websites and products.
Directory and classified submissions are mainly used to get listed in all the major, useful directories.
Both kinds of submission are used according to the target area or population. Classified submissions are similar to yellow pages. The most important thing to remember while doing a listing is that it should be in the same niche as the site being listed. This leads to indexed pages and higher rankings in search engines, and helps you gain visibility via search.
Both kinds of submission help to increase product sales and website presence, especially in search engines, and help you gain non-reciprocal backlinks from high PageRank websites/directories.
I hope this answers your query.
Related Questions
-
How to rank if you are an aggregator or a directory of resources?
Most SEO suggestions (great quality content, long-form content, engagement rate/time on the page, authoritative inbound links) apply to content-oriented sites. But what should you do if you are an aggregator or a resource directory? Your aim is to send the user to the other site they are looking for as quickly as possible, or to provide rankings of the resources. In fact, at a very basic level you are competing with search engines for traffic, because they are doing the same thing. You may have a hand-crafted, human-curated resource that is better than what the algorithms are showing, and your site is likely to have far more outgoing links than content. You know you are better (or getting better) since repeat visitors keep coming back. So in these days of search engines, what can a resource directory or aggregator site do to rank? Because even directories need first-time visitors until those visitors start coming back again.
Intermediate & Advanced SEO | Maayboli
-
Directory with duplicate content? What to do?
Moz keeps finding loads of pages with duplicate content on my website. The problem is that they are directory pages for different locations. E.g., if we were a clothes shop, we would be listing our locations: www.sitename.com/locations/london www.sitename.com/locations/rome www.sitename.com/locations/germany The content on these pages is all the same, except for an embedded Google map that shows the location of the place. The problem is that Google thinks all these pages are duplicate content. Should I set a canonical link on every single page saying that www.sitename.com/locations/london is the main page? I don't know if I can use canonical links, because the page content isn't identical due to the embedded map. Help would be appreciated. Thanks.
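As a first step, it can help to see what canonical, if any, those pages currently declare. A rough sketch (requests and BeautifulSoup are assumed tooling; the URLs are the hypothetical ones from the question):

```python
# Sketch that audits what canonical each location page currently declares,
# before deciding whether to point them all at one primary page.
import requests
from bs4 import BeautifulSoup

for city in ("london", "rome", "germany"):
    url = f"http://www.sitename.com/locations/{city}"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    print(url, "->", tag["href"] if tag else "no canonical set")
```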
Intermediate & Advanced SEO | nchlondon
-
How to outrank a directory listing with high DA but low PA?
My site is in 4th place; 3 places above it is a Gumtree (similar to Yell or Yelp) listing. How can you figure out how difficult it would be to outrank those pages? I mean, obviously the pages have low PA and they rank on the back of the site's high DA. This also seems to go back to keyword research and difficulty: when I'm doing keyword research and I see a Wikipedia page in the top 5, or yell.com, or perhaps an article on forbes.com outranking your site, the problem typically seems to be Google giving those pages a lot of ranking credit based on the high DA rather than the PA of the pages. How would you gauge the difficulty of that keyword when the competition is pages with a very high DA, which is impossible to compete with, but a low PA? Thanks
Intermediate & Advanced SEO | magusara
-
Robots.txt: how to exclude sub-directories correctly?
Hello here, I am trying to figure out the correct way to tell SEs to crawl this: http://www.mysite.com/directory/ But not this: http://www.mysite.com/directory/sub-directory/ or this: http://www.mysite.com/directory/sub-directory2/sub-directory/... The problem is that I have thousands of sub-directories with almost infinite combinations, so I can't write the definitions in a manageable way: disallow: /directory/sub-directory/ disallow: /directory/sub-directory2/ disallow: /directory/sub-directory/sub-directory/ disallow: /directory/sub-directory2/subdirectory/ etc... I would end up with thousands of definitions to disallow all the possible sub-directory combinations. So, is the following a correct, better and shorter way to define what I want above: allow: /directory/$ disallow: /directory/* Would the above work? Any thoughts are very welcome! Thank you in advance. Best, Fab.
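Per Google's robots.txt documentation, both * and $ are supported and the most specific (longest) matching rule wins, with Allow winning ties, so that allow/disallow pair should behave as intended. Below is a small sketch of that matching logic, written from the documented behavior rather than any official parser (note that Python's standard urllib.robotparser historically does not handle these wildcards, so the rules are modeled by hand):

```python
# Sketch of Google-style robots.txt matching: * and $ wildcards, with the
# longest matching rule winning and Allow winning ties. Illustrative only.
import re

def rule_matches(pattern: str, path: str) -> bool:
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"   # trailing $ anchors the end of the path
    return re.match(regex, path) is not None

def allowed(path: str, allows, disallows) -> bool:
    best_len, verdict = -1, True   # no matching rule means allowed
    rules = [(p, True) for p in allows] + [(p, False) for p in disallows]
    for pat, is_allow in rules:
        if rule_matches(pat, path) and len(pat) >= best_len:
            if len(pat) > best_len or is_allow:
                best_len, verdict = len(pat), is_allow
    return verdict

print(allowed("/directory/", ["/directory/$"], ["/directory/*"]))                # True: crawlable
print(allowed("/directory/sub-directory/", ["/directory/$"], ["/directory/*"]))  # False: blocked
```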
Intermediate & Advanced SEO | fablau
-
Robots.txt: does it need the preceding directory structure?
Do you need the entire preceding path in robots.txt for it to match? E.g., I know that if I add Disallow: /fish to robots.txt it will block:
/fish
/fish.html
/fish/salmon.html
/fishheads
/fishheads/yummy.html
/fish.php?id=anything
But would it also block:
en/fish
en/fish.html
en/fish/salmon.html
en/fishheads
en/fishheads/yummy.html
en/fish.php?id=anything
(examples taken from the Robots.txt Specifications)
I'm hoping it actually won't match; that way, writing this particular robots.txt will be much easier! Basically, I want to block many URLs that have BTS- in them, such as:
http://www.example.com/BTS-something
http://www.example.com/BTS-somethingelse
http://www.example.com/BTS-thingybob
But I have other pages that I do not want blocked, in subfolders that also have BTS- in them, such as:
http://www.example.com/somesubfolder/BTS-thingy
http://www.example.com/anothersubfolder/BTS-otherthingy
Thanks for listening
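It shouldn't match: robots.txt rules are compared against the URL path from its beginning, so Disallow: /fish does not block /en/fish, and Disallow: /BTS- will not touch the subfolder URLs. A quick check with Python's standard urllib.robotparser (a sketch; the URLs are the ones from the question):

```python
# Quick check of prefix matching with Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /BTS-"])

print(rp.can_fetch("*", "http://www.example.com/BTS-something"))             # False: blocked
print(rp.can_fetch("*", "http://www.example.com/somesubfolder/BTS-thingy"))  # True: not blocked
```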
Intermediate & Advanced SEO | Milian
-
How is Google crawling and indexing this directory listing?
We have three directory listing pages that are being indexed by Google: http://www.ccisolutions.com/StoreFront/jsp/ http://www.ccisolutions.com/StoreFront/jsp/html/ http://www.ccisolutions.com/StoreFront/jsp/pdf/ How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those directory listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site? Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the directory listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that directory listing page and, provided that we have this URL in our sitemap (http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff), solve the duplicate content issue as a result? For example: Disallow: /StoreFront/jsp/ Disallow: /StoreFront/jsp/html/ Disallow: /StoreFront/jsp/pdf/ Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
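A quick way to sanity-check the proposed rules before deploying them, using Python's standard urllib.robotparser (plain prefix rules, so no wildcard support is needed):

```python
# Sanity-check: Disallow: /StoreFront/jsp/ blocks the directory listings
# (including /jsp/html/ and /jsp/pdf/ by prefix) but leaves the real
# category page crawlable.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /StoreFront/jsp/"])

listing = "http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML"
category = "http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff"
print(rp.can_fetch("*", listing))   # False: blocked
print(rp.can_fetch("*", category))  # True: still crawlable
```

One caveat: disallowing crawling does not by itself remove URLs that are already indexed; a noindex directive or a URL removal request is usually needed for that.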
Intermediate & Advanced SEO | danatanseo
-
How to do geo targeting for a domain and sub-directories in Webmaster Tools?
Hello All, How can I do geo targeting for multiple countries on my root domain and sub-directories in Webmaster Tools? My domain is "abc.com" and I want to target three countries: UAE, Kuwait and Saudi Arabia. Can I assign geo targeting in Webmaster Tools with the root domain set to the UAE and two sub-directories for Kuwait and Saudi Arabia? abc.com - UAE (geo targeting) abc.com/kw - Kuwait (geo targeting) abc.com/sa - Saudi Arabia (geo targeting) Or should the root domain not be assigned to any country, with three sub-directories targeting their geo locations instead? abc.com - unlisted (geo targeting) abc.com/uae/ - UAE (geo targeting) abc.com/kw/ - Kuwait (geo targeting) abc.com/sa/ - Saudi Arabia (geo targeting)
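Whichever Webmaster Tools setup you choose, hreflang annotations are the usual complementary signal for country-targeted sections. A sketch generating them for the structure in the question (the ar-* language-region codes are assumptions; adjust to the site's actual languages):

```python
# Sketch: generate hreflang link elements for country-targeted directories.
# The ar-* language-region codes are assumptions, not from the question.
targets = {
    "ar-AE": "http://abc.com/uae/",
    "ar-KW": "http://abc.com/kw/",
    "ar-SA": "http://abc.com/sa/",
}
for code, url in targets.items():
    print(f'<link rel="alternate" hreflang="{code}" href="{url}" />')
```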
Intermediate & Advanced SEO | rahul11
-
Optimize a Classifieds Site
Hi, I have a classifieds website and would like to optimize it. The issues/questions I have: A classifieds site has, say, 500 cities. Is it better to create a separate subdomain for each city (http://city_name.site.com) or a subdirectory (http://site.com/city_name)? Now in each city there will be, say, 50 categories, and these 50 categories are common across all the cities. Hence the layout and content will be the same, the only differences being the latest ads from each city, the name of the city, and the URLs pointing to each category in the relevant city. The site architecture of a classifieds site is therefore highly prone to having a lot of near-identical content that is not really duplicate content. What is the best way to deal with this situation? I was hit by Panda in April 2011, with traffic going down 50%. However, traffic has stayed around the same level since then. What is the best way to handle a duplicate content penalty on a site like a classifieds one? Cheers!
Intermediate & Advanced SEO | ketan9