Is it still good practice to pay for Best of the Web Directory (BOTW) and other similar paid directories?
-
I know that paid-for links are penalized by Google, but in the past these directories were okay. What about now?
Thank you.
-
"If you pay for a directory link only to gain position in the SERPs you will most probably not get any ROI"
Meaning if you are on the first page there won't be ROI? I don't think so!
-
There are exceptions to the "paid-for" link rule, and I believe this site is one of them, as they qualify their links! However, whether you'll get ROI from it is another question!
-
Hi,
Well, in general I agree with Arjan; I am strongly of the opinion that some directories are still VERY worthwhile, and BOTW is one of them. When you are launching a brand new site, you just aren't going to have a lot of attention and inbound link opportunities at first. Submitting your site to DMOZ (which is free), Yahoo.dir (paid), Business.com (paid), JoeAnt.com (paid) and, yes, BOTW (paid) is most definitely worthwhile. IMHO they send a signal to the search engines that you are a legitimate business. That's a very important message to send when you are a startup.
In addition to those, I think there are some others that are worth it as well depending on your particular business, particularly localized directories and directories targeting certain niche markets.
Hope that's helpful!
Dana
-
My advice is to pay only for directory links that drive enough direct traffic to your website to earn a proper ROI.
If you pay for a directory link only to gain position in the SERPs you will most probably not get any ROI.
If you get the link from a directory Google does not like because it is known to be selling links, it could even hurt your rankings.
In short: paid directory links are not good for SEO.
Related Questions
-
Google has deindexed a page it thinks is set to 'noindex', but is in fact still set to 'index'
A page on our WordPress powered website has had an error message thrown up in GSC to say it is included in the sitemap but set to 'noindex'. The page has also been removed from Google's search results. Page is https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/ Looking at the page code, plus using Screaming Frog and Ahrefs crawlers, the page is very clearly still set to 'index'. The SEO plugin we use has not been changed to 'noindex' the page. I have asked for it to be reindexed via GSC but I'm concerned why Google thinks this page was asked to be noindexed. Can anyone help with this one? Has anyone seen this before, been hit with this recently, got any advice...?
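An aside for anyone debugging the same GSC error: the noindex Google reports does not have to come from the meta tag; it can also be sent as an X-Robots-Tag HTTP header, which a view-source check alone will miss. A rough sketch of both checks, run in the browser console on the affected page (an illustration, not from the original thread):

    // Rendered meta robots tag (what Google sees after JavaScript runs)
    console.log('meta robots:', document.querySelector('meta[name="robots"]')?.content);

    // Header version of the directive; 'noindex' can hide here too
    fetch(location.href, { method: 'HEAD' }).then(function (response) {
      console.log('X-Robots-Tag:', response.headers.get('x-robots-tag'));
    });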
Technical SEO | d.bird
-
Topic Cluster: URL Best Practices
I'm trying to be mature and employ the Topic Cluster strategy in my content. In doing so I realized there are a few URL options, some more difficult to execute than others.
- Is it important to call out the Pillar Topic in your subtopic URL?
- Does the Pillar Topic need to have its own landing page? (As opposed to just being part of the blog.)
Here's an example:
My Pillar is: Inbound vs. Outbound
My subtopic is: Marketing Platforms
Here are the URL options I can think of...
Option 1: https://pipelineinbound.com/blog/inbound-vs-outbound-marketing-platforms/
Option 2: https://pipelineinbound.com/blog/which-marketing-platforms/
Option 3: https://pipelineinbound.com/blog/marketing-platforms-inbound-vs-outbound/
Option 4 (Hardest): https://pipelineinbound.com/inbound-vs-outbound/marketing-platforms/
Are there some fundamental best practices for URL structure and Link Building as it pertains to Topic Clusters? Thanks!
Technical SEO | dkellyagile
-
What is SEO best practice to implement a site logo as an SVG?
Since it is possible to implement a description for SVGs, it seems that it would be possible to use that for the site name:
<desc>sitename</desc> {{ STUFF }}
There is also a title tag for SVGs:
<title>sitename</title> {{ STUFF }}
I've read in a thread from 2015 that it sometimes gets confused with the title tag in the header (at least by the Moz crawler), which might cause trouble. What is the state of the art here? Any experiences and/or case studies with using either method? However, to me it seems that, either way, best practice in terms of search engines being able to crawl is to load the SVG as an image and implement a proper alt attribute. What is your opinion about this? Thanks in advance.
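A minimal sketch of that third approach; the file path, dimensions, and site name below are placeholders, not from the original post:

    <!-- SVG loaded as an image; the alt text carries the site name -->
    <img src="/assets/logo.svg" alt="Sitename" width="200" height="60">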
Technical SEO | twisme
-
New theme adds ?v=1d20b5ff1ee9 to all URLs as part of its cache. How does this affect SEO?
The new theme I am working on adds ?v=1d20b5ff1ee9 to every URL. The theme developer says it's a server setting issue. GoDaddy support says it's part of caching and becoming prevalent in new themes. How does this impact SEO?
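A common safeguard while the theme issue gets fixed, sketched below with a hypothetical URL (an illustration, not advice from the original thread): a rel=canonical pointing at the clean URL asks Google to consolidate the ?v= variants instead of treating them as duplicates.

    <!-- In the <head> of each page; the href is a placeholder clean URL -->
    <link rel="canonical" href="https://www.example.com/some-page/">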
Technical SEO | DML-Tampa
-
Best practice for URL - Language/country
Hi, We are planning on having our website localized into more languages. We already have an English and a German version. The German version is currently a subdomain:
www.example.com --> English version
de.example.com --> German version
Is this recommended? Or is it always better to have URLs with language prefixes, such as:
www.example.com/de
www.example.com/es
Which is the better practice in terms of SEO?
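Whichever structure is chosen, hreflang annotations tie the language versions together so Google serves the right one per locale. A minimal sketch using the subdomain setup from the question (the x-default choice is an assumption about which version should be the fallback):

    <!-- In the <head> of both versions -->
    <link rel="alternate" hreflang="en" href="https://www.example.com/">
    <link rel="alternate" hreflang="de" href="https://de.example.com/">
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/">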
Technical SEO | Kilgray
-
ECommerce: Best Practice for expired product pages
I'm optimizing a pet supplies site (http://www.qualipet.ch/) and have a question about the best practice for expired product pages. We have thousands of products, and hundreds of our offers exist for just a few months. Currently, when a product is no longer available, the site just returns a 404. Now I'm wondering what a better solution could be:
1. When a product disappears, a 301 redirect is established to the category page it was in (i.e. a leash would redirect to dog accessories).
2. After a product disappears, a customized 404 page appears, listing similar products (but the server returns a 404).
I prefer solution 1, but am afraid that having hundreds of new redirects each month might look strange. But then again, returning lots of 404s to search engines is also not the best option. Do you know the best practice for large ecommerce sites that have hundreds or even thousands of products appearing and disappearing on a frequent basis? What should be done with those obsolete URLs?
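A sketch of what option 1 could look like on an Apache server; the paths are invented for illustration and are not from the original thread:

    # .htaccess: point an expired product at the category it lived in
    Redirect 301 /products/dog-leash-basic-123 /categories/dog-accessories/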
Technical SEO | zeepartner
-
Javascript to manipulate Google's bounce rate and time on site?
I was referred to this "awesome" solution to high bounce rates. It is supposed to "fix" bounce rates and lower them through this simple script. When the bounce rate goes way down, rankings dramatically increase (an interesting claim, but not my question). I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me. Can someone with experience in JS help me by explaining what this script does? I think it manipulates the reporting it does to GA, but I'm not sure. It was supposed to be placed in the footer of the page, and then you sit back and watch the dollars fly in. 🙂
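The script itself isn't shown, but snippets sold for this purpose typically follow the "adjusted bounce rate" pattern sketched below (Universal Analytics ga() syntax is an assumption; the actual script may differ). After a fixed delay it fires a GA event, and because events count as interactions by default, the session stops being reported as a bounce:

    // Sketch of a typical "adjusted bounce rate" snippet. Assumes the
    // Universal Analytics ga() tracker is already loaded on the page.
    setTimeout(function () {
      // An interaction event: once it fires, GA no longer counts this
      // session as a bounce, so the reported bounce rate drops.
      ga('send', 'event', 'engagement', 'time-on-page-30s');
    }, 30000);

Note that this only changes what your own Analytics property reports; it changes nothing that Google Search sees, so any promised ranking boost rests on the unproven assumption that Google uses GA bounce rate as a ranking signal.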
Technical SEO | BenRWoodard
-
Blocking URL's with specific parameters from Googlebot
Hi, I've discovered that Googlebots are voting on products listed on our website and as a result are creating negative ratings by placing votes from 1 to 5 for every product. The voting function is handled using JavaScript, as shown below, and the script prevents multiple votes, so most products end up with a vote of 1, which translates to "poor". How do I go about using robots.txt to block a URL with specific parameters only? I'm worried that I might end up blocking the whole product listing, which would result in de-listing from Google and the loss of many highly ranked pages.
DON'T want to block: http://www.mysite.com/product.php?productid=1234
WANT to block: http://www.mysite.com/product.php?mode=vote&productid=1234&vote=2
JavaScript button code: onclick="javascript: document.voteform.submit();"
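Google's robots.txt implementation supports the * wildcard, so a rule keyed to the mode=vote parameter can block the voting URLs while leaving plain product URLs crawlable. A sketch, assuming vote URLs always carry mode=vote and normal product URLs never do:

    # Sketch only; test in Google Search Console's robots.txt tester first.
    User-agent: Googlebot
    # Matches /product.php?mode=vote&... in any parameter order;
    # /product.php?productid=1234 never matches and stays crawlable.
    Disallow: /product.php?*mode=vote

Blocking crawling would also stop these bot-cast votes, but a more robust fix is to require a POST request (or a logged-in user) for the vote endpoint, since well-behaved crawlers generally do not submit POST forms.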
Thanks in advance for any advice given. Regards,
Asim
Technical SEO | aethereal