Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Directory and Classified Submissions
-
Are directory submissions and classified submissions still a good way to create backlinks?
Or are they obsolete methods that should be discontinued?
-
Thanks for the awesome comments Cyrus.
So what you suggest is going slow and developing solid, long-term, genuine links by commenting on blog posts in the same niche, making good-quality PR submissions and article submissions, and making relevant forum posts?
Would you like to add something to the above list?
-
Yes, Google is smart enough.
Two years ago Google stripped toolbar PageRank from most of the profiles on SEOmoz. The links here still pass some value, but build too many of these links and you're much more likely to incur a penalty today.
This entire school of link building discussed on this page is dangerous - and filled with snake oil salesmen who care nothing about harming your rankings. I've never heard of Linklecious, but it looks like they've been de-indexed by Google, so it's safe to say they were penalized.
Instead of risking burning your site to the ground with low quality links, invest your time in long-term payouts by producing good quality content and earning the links others can't earn.
-
Hi KS,
To give a short answer: directory submissions have been abused over the years, and we've seen a marked decrease in their effectiveness, even as recently as the past 12 months.
The old rule for directory links was to build them in a 2:1 ratio. That meant for every two directory links, be sure to build at least 1 regular, high quality link. Today, the ratio is more like 1:1, or even reversed to 1:2.
If a directory is easy to get into, it's probably not worth your time. Too many of these links can lead to a drop in rankings. Done judiciously, they can give a small boost to your rankings, help round out your link profile, and help target specific anchor text phrases (again, when done in moderation).
Here's an article we published a few months back you might find helpful: http://www.seomoz.org/blog/seo-link-directory-best-practices
As for classified submissions, I'd be wary as I've never seen any evidence that they help SEO, and like low value directory links, too many "easy" links can harm your rankings.
Hope this helps. Best of luck with your SEO!
-
Thanks Herald.
Please reply to my query regarding profile links and Web 2.0 links further down on this page.
-
Yes, you should definitely continue with directory and classified submissions. They will help you a lot. If you have any queries, feel free to get in touch with me.
Thanks
-
What about profile links and Web 2.0 links? They say it's good to create 100+ profile links every month with appropriate keywords (while SEOing a website). Some say profile links are better than forum links or blog comments.
If I use services to create them, most of the links turn out to be on not-so-good websites. But they say at the end of the day it's about backlinks.
My question is: isn't Google smart enough to detect such practices?
In other words: do profile links really help?
-
Hmmm useful tips. Thanks.
How about submitting those pages (URLs) to Linklecious or Pingomatic so they are crawled by Google?
Would that help?
-
If they are do-follow, then you could also check:
-
whether the directory actually has some ranking already and has been around for a while
-
how many links they usually put on one page before paging to the next one
-
check whether the listing pages for recently added entries (usually the last ones) are already in Google's index. I normally do this by checking their PageRank: if it's at least white (I use the Quirk SearchStatus plugin for Firefox), that means Google is already aware of them, which indicates that it crawls this directory fairly frequently for new content/pages
-
try to perform a search on Google for a specific keyword - the category you want to submit your link to - and see what their position is for it
Obviously you would only do all this if you had a lot of time to go through each directory separately, but it might be worthwhile if you're planning to get links in a different way than the generic links from visitors.
-
-
Yup, we always make sure they are do-follow classified or directory sites.
-
Most submissions will give you a link with the rel attribute set to nofollow, and these don't really give you any SEO benefit. So it is quite important to first check the other listings and see whether they have this attribute and value assigned to the anchor. If so, the only benefit you will get is the visit if someone actually clicks on the link in the listing and gets to your site.
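If you want to spot-check a listing page yourself, here is a minimal sketch using Python's standard-library HTML parser to flag which anchors carry rel="nofollow". The sample HTML and URLs below are made up for illustration:

```python
from html.parser import HTMLParser

class LinkRelChecker(HTMLParser):
    """Collect every <a href=...> on a page along with whether it is nofollow."""
    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" in rel_tokens))

# A fragment resembling a directory listing page (hypothetical URLs):
sample = '''
<ul>
  <li><a href="http://example.com/site-a" rel="nofollow">Site A</a></li>
  <li><a href="http://example.com/site-b">Site B</a></li>
</ul>
'''

checker = LinkRelChecker()
checker.feed(sample)
for href, is_nofollow in checker.links:
    print(href, "nofollow" if is_nofollow else "followed")
```

In practice you would fetch the live listing page first; the point is simply that the check is mechanical, so there's no excuse for submitting blind.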
-
Thanks for the help. So basically we should continue with them right?
-
Hi KS_,
As you know, directory submission is widely used by businesses to promote their websites and products.
Directory and classified submissions are mainly used to get listed in all the major, useful directories.
Both kinds of submission are targeted to a particular area or audience. Classified submissions are similar to yellow pages. The most important thing to remember when submitting a listing is that it should be in the same niche as the directory where it is listed. This leads to indexed pages and higher rankings in search engines, and helps gain visibility via search.
Both kinds of submission help increase product sales and website presence, especially in search engines, and can gain non-reciprocal backlinks from high-PageRank websites/directories.
I hope this answers your query.
Related Questions
-
Should I index resource submission forms, thank you pages, etc.?
Should I index resource submission forms, thank you, event pages, etc.? Doesn't Google consider this content too thin?
Intermediate & Advanced SEO | amarieyoussef
-
Should I use the noindex tag on classified listing pages that have expired?
We have gone back and forth on this and wanted to get some outside input. I work for an online listing website that has classified ads on it. These ads are generated by companies on our site advertising weekend events around the country. We have about 10,000 companies that use our service to generate their online ads. This means that we have thousands of pages being created each week. The ads have lots of content: pictures, sale descriptions, and company information.
After the ads have expired, and the sale is no longer happening, we are currently placing the noindex tag in the heads of each page. The content is not relevant anymore since the ad has ended. The only value the content offers a searcher is the images (there are millions on expired ads) and the descriptions of the items for sale.
We are currently the leader in our industry and control most of the top spots on Google for our keywords. We have been worried about cluttering up the search results with pages of ads that are expired. In our Moz account right now we have over 28k crawler warnings alerting us to the noindex tag being in the page heads of the expired ads. Seeing those warnings has made us nervous and second-guessing what we are doing.
Does anybody have any thoughts on this? Should we continue placing the noindex tag in the heads of the expired ads, or should we be allowing search engines to index the old pages? I have seen websites with discontinued products keeping the products around so that individuals can look up past information. This is the closest thing I have seen to our situation. Any help or insight would be greatly appreciated! -Matt
Intermediate & Advanced SEO | mellison
-
How to outrank a directory listing with high DA but low PA?
My site is in 4th place; 3 places above it is a Gumtree (similar to Yell, Yelp) listing. How can you figure out how difficult it would be to outrank those pages? I mean, obviously the pages have low PA and they rank at the top based on the high DA of the site.
This also goes back to keyword research and difficulty: when doing keyword research, you might see a Wikipedia page in the top 5 ranks, or a yell.com listing, or perhaps an article on forbes.com outranking your site. Typically the problem seems to be Google giving a lot of credit to these pages' rankings based on the high DA rather than the PA of the pages.
How would you gauge the difficulty of that keyword, then, if the competition is pages with very high DA (which is impossible to compete with) but low PA? Thanks
Intermediate & Advanced SEO | magusara
-
Citation/Business Directory Question...
A company I work for has two numbers... one for the std call centre and one for tracking SEO. Now, if local citation/business directory listings have the same address but different numbers, will this affect local/other SEO results? Any help is greatly appreciated! 🙂
Intermediate & Advanced SEO | geniusenergyltd
-
Robots.txt: how to exclude sub-directories correctly?
Hello here, I am trying to figure out the correct way to tell SEs to crawl this:
http://www.mysite.com/directory/
But not this:
http://www.mysite.com/directory/sub-directory/
or this:
http://www.mysite.com/directory/sub-directory2/sub-directory/...
But given that I have thousands of sub-directories with almost infinite combinations, I can't put the following definitions in a manageable way:
disallow: /directory/sub-directory/
disallow: /directory/sub-directory2/
disallow: /directory/sub-directory/sub-directory/
disallow: /directory/sub-directory2/subdirectory/
etc...
I would end up having thousands of definitions to disallow all the possible sub-directory combinations. So, is the following a correct, better, and shorter way to define what I want above:
allow: /directory/$
disallow: /directory/*
Would the above work? Any thoughts are very welcome! Thank you in advance. Best, Fab.
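For what it's worth, Google documents that when allow and disallow rules conflict, the longest matching rule wins, with ties going to allow. Here is a toy matcher (my own sketch, not a real robots.txt parser) showing how that resolution plays out for the two wildcard rules above. Note that Python's built-in urllib.robotparser follows the original 1994 spec and may not honor `*`/`$` wildcards the same way:

```python
import re

def rule_matches(pattern, path):
    """Translate a robots.txt path pattern ('*' wildcard, '$' end anchor) to a regex
    anchored at the start of the path, and test it."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

def is_allowed(path, allow_rules, disallow_rules):
    """Google-style resolution: the longest matching rule wins; ties go to allow."""
    best_allow = max((len(p) for p in allow_rules if rule_matches(p, path)), default=-1)
    best_disallow = max((len(p) for p in disallow_rules if rule_matches(p, path)), default=-1)
    return best_allow >= best_disallow

allow = ["/directory/$"]
disallow = ["/directory/*"]
print(is_allowed("/directory/", allow, disallow))                # allowed: '$' rule matches exactly, tie goes to allow
print(is_allowed("/directory/sub-directory/", allow, disallow))  # disallowed: only the '*' rule matches
```

So under Google's documented matching rules, the two-line `allow: /directory/$` / `disallow: /directory/*` approach should behave as intended; testing it in Search Console's robots.txt tester is still worthwhile, since other crawlers resolve conflicts differently.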
Intermediate & Advanced SEO | fablau
-
Archiving a festival website - subdomain or directory?
Hi guys, I look after a festival website whose program changes year in and year out. There are a handful of mainstay events in the festival which remain each year, but there are a bunch of other events which change each year around the mainstay programming. This often results in us redoing the website each year (a frustrating experience indeed!)
We don't archive our past festivals online, but I'd like to start doing so for a number of reasons:
1. These past festivals have historical value - they happened, and they contribute to telling the story of the festival over the years. They can also be used as useful windows into the upcoming festival.
2. The old events (while no longer running) often get many social shares, high-quality links and in some instances still drive traffic. We try our best to 301 redirect these high-value pages to the new festival website, but it's not always possible to find a similar alternative (so these redirects often go to the homepage).
Anyway, I've noticed some festivals archive their content into a subdirectory - i.e. www.event.com/2012. However, I'm thinking it would actually be easier for my team to archive via a subdomain like 2012.event.com - and always use the www.event.com URL for the current year's event. I'm thinking universally redirecting the content would be easier, as would cloning the site/database etc.
My question is - is one approach (i.e. directory vs. subdomain) better than the other? Do I need to be mindful of using a subdomain for archival purposes? Hope this all makes sense. Many thanks!
Intermediate & Advanced SEO | cos2030
-
URL Structure for Directory Site
We have a directory that we're building and we're not sure if we should try to make each page an extension of the root domain or utilize sub-directories as users narrow down their selection. What is the best practice here for maximizing your SERP authority?
Choice #1 - Hyphenated architecture (no sub-folders):
1) State Page /state/
2) City Page /city-state/
3) Business Page /business-city-state/
4) Location Page /locationname-city-state/
or....
Choice #2 - Using sub-folders on drill down:
1) State Page /state/
2) City Page /state/city
3) Business Page /state/city/business/
4) Location Page /locationname-city-state/
Again, just to clarify, I need help in determining what the best methodology is for achieving the greatest SEO benefits. Just by looking, it would seem that choice #1 would work better because the URLs are very clear and SEF. But, at the same time, it may be less intuitive for search. I'm not sure. What do you think?
Intermediate & Advanced SEO | knowyourbank
-
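The two URL choices in the directory-site question above can be made concrete with a small sketch. The slug-building rule and the example business are my own assumptions for illustration, not the poster's data:

```python
def slug(*parts):
    """Join name parts into a lowercase, hyphenated URL slug."""
    return "-".join(p.lower().replace(" ", "-") for p in parts)

def flat_url(business, city, state):
    # Choice #1: everything hyphenated off the root domain
    return f"/{slug(business, city, state)}/"

def nested_url(business, city, state):
    # Choice #2: sub-folders that mirror the user's drill-down
    return f"/{slug(state)}/{slug(city)}/{slug(business)}/"

print(flat_url("Acme Plumbing", "Austin", "TX"))    # /acme-plumbing-austin-tx/
print(nested_url("Acme Plumbing", "Austin", "TX"))  # /tx/austin/acme-plumbing/
```

Seeing them side by side highlights the trade-off in the question: the flat scheme front-loads keywords into one segment, while the nested scheme encodes the site hierarchy in the path.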
Optimize a Classifieds Site
Hi, I have a classifieds website and would like to optimize it. The issues/questions I have:
1. A classifieds site has, say, 500 cities. Is it better to create separate subdomains for each city (http://city_name.site.com) or subdirectories (http://site.com/city_name)?
2. In each city, there will be, say, 50 categories, and these 50 categories are common across all the cities. Hence, the layout and content will be the same, differing only in the latest ads from each city, the name of the city, and the URLs pointing to each category in the relevant city. The site architecture of a classifieds site is highly prone to having a lot of content which is not really duplicate content. What is the best way to deal with this situation?
3. I was hit by Panda in April 2011, with traffic going down 50%. However, the traffic has stayed around the same level since then. What is the best way to handle a duplicate-content penalty in the case of a site like a classifieds site?
Cheers!
Intermediate & Advanced SEO | ketan9