Macrae's Blue Book Directory Listing
-
Does anyone know more information about this directory? Is it a good quality directory that I should pay to get listed on?
-
I believe so. I noticed a small upward trend for long-tail product-category terms on our site that we had listed on MacRae's. I can't say for sure, but it definitely did not hurt.
-
Thanks for the information. Did you see any benefits for SEO?
-
We have been using MacRae's for a number of years and it returns a positive ROI. It's a good-quality directory and we use it for industrial B2B advertising (we also use ThomasNet). Others include MFG.com, GlobalSpec.com, and Alibaba, though we don't use those anymore.
-
A good directory, especially one with an actual offline as well as online presence, is still a valid marketing route.
As long as your business has a North American trading address, you can get a free listing from them here:
http://www.macraesbluebook.com/getlisted/get-listed.cfm
Good luck.
Related Questions
-
Click-throughs for ranking
Back in April of 2014, Rand performed an experiment to determine whether click-throughs from Google made a difference to rankings. He tweeted and asked people to search for a specific term and then click on a specific listing, to see whether the immediate clicks made a difference. Within 2.5 hours, his listing went from position #10 to position #1. My question is this: if this experiment still works today, could you right-click, copy the link address of the SERP listing from Google's page, put it in a Facebook or Twitter post, and get the same results? Or would this be gaming the system? Here is an example of the link: https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=10&cad=rja&uact=8&ved=0ahUKEwiaqZD9-cXLAhUKyWMKHfFID70QFghYMAk&url=http%3A%2F%2Fbuzzy4shots.com%2Ffocus-pain-relief%2F&usg=AFQjCNElHaso_vXP4rWQdsaX1JdP8IItMQ&sig2=Sg9r6zSbW0pZQtb4ZbzJqg&bvm=bv.117218890,d.cGc
White Hat / Black Hat SEO | | tdawson090 -
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), the bots cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic and 3) controls overall bot traffic instead of controlling traffic for one bot. In my opinion, user traffic should always be prioritized over bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So no solution addresses all three of my problems. Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a portion of the bot traffic; which portion, and for which bots, is calculated at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests are answered with a 503 while others get content and a 200 (see the sketch after this question for the kind of logic I have in mind). The remaining question is: will dynamically serving 503s have a negative impact on SEO? Yes, it will delay indexing speed/latency, but slow server response times also have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | | internetwerkNU1 -
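For what it's worth, here is a minimal sketch of the kind of load-aware 503 throttling described in the question above, written as a Python WSGI middleware. It is only an illustration under assumptions, not the poster's actual implementation: the bot tokens, the thresholds, and the use of the one-minute load average as the load signal are all hypothetical choices you would tune for your own server.

```python
import os
import random

# User-agent substrings to throttle; purely illustrative.
BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")

def current_load_ratio():
    """Return the 1-minute load average divided by CPU count (~1.0 = fully busy)."""
    load_1min, _, _ = os.getloadavg()          # Unix-only load signal
    return load_1min / (os.cpu_count() or 1)

class BotThrottle:
    """WSGI middleware that answers a share of bot requests with 503 under load."""

    def __init__(self, app, soft_limit=0.7, hard_limit=1.2):
        self.app = app
        self.soft_limit = soft_limit   # below this load ratio, never throttle
        self.hard_limit = hard_limit   # at or above this, throttle every bot request

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        if any(token in ua for token in BOT_TOKENS):
            load = current_load_ratio()
            if load > self.soft_limit:
                # Rejection probability rises linearly from 0 at soft_limit
                # to 1 at hard_limit, so throttling scales with server load.
                reject_p = min(1.0, (load - self.soft_limit) /
                                    (self.hard_limit - self.soft_limit))
                if random.random() < reject_p:
                    start_response("503 Service Unavailable",
                                   [("Content-Type", "text/plain"),
                                    ("Retry-After", "300")])
                    return [b"Temporarily overloaded, please retry later.\n"]
        # Normal request (or a bot request that passed the throttle): serve as usual.
        return self.app(environ, start_response)
```

Sending a Retry-After header with the 503 gives well-behaved crawlers an explicit hint about when to come back, which is generally considered safer than letting them hit slow responses; sustained 503s over long periods, however, can lead crawlers to slow down or temporarily drop URLs, so throttling like this should only kick in under genuine load.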
Website starts ranking on Google then always drops - Targeted for Australia but most traffic from U.S - Bounce Rate at 94.49% - HELP!
Hi everyone, thank you for your time. During the past 8 months I have been working on this website, which is a .com.au. I have fully optimised the website, which targets Brisbane in Australia, and I have set up everything (sitemaps, geo-location in WMT, Fetch as Google, etc.). However, the website just does not want to rank at all. I know that the previous SEO company was not too good, but since then I have disavowed all unnatural links, we have moved the hosting to a new company, and the website content has been updated. Only recently has the website started ranking for its brand name (and not even at the top of Google), and whenever a keyword starts ranking above the top 50 in Google it suddenly drops again. The other issue is that even though I have set up the website to target Australia, the majority of traffic comes from the U.S. Last month, out of 127 sessions: 85 came from the United States, 29 from Australia, 3 from Brazil, 2 from India, 2 from Italy, 1 from Canada, etc. Because of this the website has a bounce rate of around 95%. If you have any advice, tips, or recommendations that could help fix this, it would be much appreciated. I suppose we can consider this some kind of penalisation, potentially due to the past work and issues that occurred before the business became our client, but I am not sure what more I can do to stop the wrong traffic and improve the rankings. Thanks for your help. Lyam
White Hat / Black Hat SEO | | AlphaDigital20 -
Penguin 2.1 Penalty - Can't understand why I was hit by it?
Hi, I have lost all my rankings after the Penguin 2.1 update. I haven't done anything wrong. I want to know the root cause of the penalty so that I can overcome it. Any help would be appreciated. Website: http://tiny.cc/hfom4w
White Hat / Black Hat SEO | | chandman0 -
LOCAL SEO / Ranking for the difficult 'service areas' outside of the primary location?
It's generally not too hard to rank in Google Places and organically for your primary location. However, if you are a service-area business looking to rank for neighboring cities or service areas, Google makes this much tougher. Andrew Shotland mentions the obvious and not-so-obvious options: service-area pages ranking organically, getting a real or virtual address, boosting geo signals, and using zip codes instead of a service-area circle. But I am wondering if anyone has had success with other methods. Maybe you have used geo-tagging in a creative way? This is a hurdle that many local businesses are struggling with, and any experience or thoughts will be much appreciated.
White Hat / Black Hat SEO | | vmialik1 -
It's not link buying, but...
Which of these strategies, if any, cross the line from relationship building to link buying? Assume all links are do-follow.
1. You're a local business. You give the local Boys & Girls Club a few hundred bucks a year. In return, you get a very nice link on their sponsorship page for 12 months.
2. You send a sample of your product to influential bloggers for the purpose of a review and, hopefully, a link back to your website.
3. One of your clients is a college bar. You invite 50 college kids over for a slow evening and stuff them full of chicken wings. Then you ask them to please review and link to the bar on their college wiki.
4. You give a client a free service in exchange for that client linking to your business on its blogroll.
5. You take a blogger out to lunch and pick up the tab. Later that day, the blogger writes up an amusing little story for the blog and links back to your desired website.
6. In your email newsletter, you put out a request to your customer base: "Please link to my website, and I'll provide you a special 20% off coupon."
White Hat / Black Hat SEO | | ExploreConsulting1 -
I am experiencing referrer spam from http://r-e-f-e-r-e-r.com/ (don't click) - What should I do?
It amazes me that every day in search marketing brings something new that I don't know about or have never heard of. Most of you are probably familiar with referrer spam, but I hadn't ever heard of it before. I am currently experiencing referrer spam on my personal blog. What's the best way to get rid of this pest? Should I ignore them? Block them in my robots.txt file? Use Google's disavow tool? Or should I just plain holler, "Curse you, referrer spam people!!!"? Thanks all! (One possible blocking approach is sketched after this question.)
White Hat / Black Hat SEO | | danatanseo0 -
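One common approach, assuming the spam requests actually reach your server, is to reject anything carrying the offending Referer header before it hits the application. Below is a minimal sketch as a Python WSGI middleware; the blocklist contains only the domain mentioned in the question and is purely illustrative, and on a typical Apache or nginx setup the same idea is usually implemented with a rewrite/deny rule rather than application code.

```python
# Hypothetical blocklist: the domain mentioned in the question above.
SPAM_REFERRER_DOMAINS = ("r-e-f-e-r-e-r.com",)

class ReferrerSpamFilter:
    """WSGI middleware that rejects requests whose Referer matches a blocklist."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        referrer = environ.get("HTTP_REFERER", "").lower()
        if any(domain in referrer for domain in SPAM_REFERRER_DOMAINS):
            # Drop the request before it reaches the application (and before any
            # on-page analytics script could count it as a visit).
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden\n"]
        return self.app(environ, start_response)
```

Note that robots.txt won't help here, since spam bots simply ignore it, and the disavow tool is for backlinks rather than referrers. Also, "ghost" referrals that are injected straight into Google Analytics without ever requesting a page never reach the server at all, so those can only be filtered on the analytics side.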
Big loss in Google traffic recently, but can't work out what the problem is
Since about May 17 my site - http://lowcostmarketingstrategies.com - has suffered a big drop in traffic from Google, presumably from the dreaded Penguin update. I am at a loss as to why I have been hit when I don't engage in any black-hat SEO tactics or do any link building. The site is high quality, provides a good experience for the user, and I make sure that all of the content is unique and not published elsewhere. The common checklist of potential problems from Penguin (such as keyword stuffing, web spam, and over-optimisation in general) doesn't seem relevant to my site. I'm wondering if someone could take a quick look at my site to spot any obvious things that need to be removed to get back in Google's good books. I was receiving around 200-250 hits per day, but that has now dropped down to 50-100, and I feel that I have been penalised incorrectly. Any input would be fantastic. Thanks 🙂
White Hat / Black Hat SEO | | ScottDudley0