Help - we're blocking SEOmoz crawlers
-
We have a fairly stringent blacklist and by the looks of our crawl reports we've begun unintentionally blocking the SEOmoz crawler.
Can you guys let me know the user-agent string and anything else I need to make sure your crawlers are whitelisted?
Cheers!
-
Hi Keri,
Still testing, though I see no reason why this shouldn't work, so I will close the QA ticket.
Cheers!
-
Hi! Did this work for you, or would you like our help team to lend a hand?
-
We maintain a crawler (and others) blacklist to control server loads, so I'm just looking for the user-agent string I can add to the whitelist. This one should do the trick:
Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)
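For anyone whitelisting at the application layer rather than in robots.txt, a minimal sketch of the check against that user-agent string (the list name and helper function are hypothetical, not part of any Moz or server API):

```python
import re

# Hypothetical whitelist: match the bot's identifying token
# case-insensitively, since logs show "rogerBot" with a capital B.
WHITELISTED_BOTS = [re.compile(r"rogerbot", re.IGNORECASE)]

def is_whitelisted(user_agent: str) -> bool:
    """Return True if the User-Agent matches any whitelisted bot token."""
    return any(pattern.search(user_agent) for pattern in WHITELISTED_BOTS)

ua = ("Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; "
      "http://www.seomoz.org/dp/rogerbot)")
print(is_whitelisted(ua))  # True
```

Matching on the token rather than the full string is the safer choice here, since crawler version numbers in the user-agent change over time.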
-
Still way too early for me ;-). I block specific robots rather than excluding all but a few.
I have not tried the following (but think/hope it will work) - this should block all robots, but allow SEOmoz and Google:
<code>User-agent: *
Disallow: /

User-agent: rogerbot
Disallow:

User-agent: Google
Disallow:</code>
You would already have something like this in your robots.txt (unless your block occurs on a network/firewall level).
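A robots.txt layout like that can be sanity-checked with Python's standard-library parser before deploying it (the example URL is illustrative):

```python
from urllib.robotparser import RobotFileParser

# Proposed rules: block everyone, but allow rogerbot and Google.
# An empty "Disallow:" value means that group may fetch everything.
robots_txt = """\
User-agent: *
Disallow: /

User-agent: rogerbot
Disallow:

User-agent: Google
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("rogerbot", "http://www.example.com/page"))      # True
print(rp.can_fetch("SomeOtherBot", "http://www.example.com/page"))  # False
```

Note this only verifies the robots.txt logic; it does nothing about blocks enforced at the network or firewall level.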
-
Thanks Gerd, though it looks like your robots.txt is a disallow rule, when I'm looking to let it through.
I'll give this one a try: Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)
-
I have it as "rogerbot"
<code>User-agent: rogerbot
Disallow: /</code>
Access-log: Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)
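If you need to confirm which user-agent is actually hitting the server, the access-log entry can be parsed like this (the sample log line, IP, and timestamp are made up for illustration; in the combined log format the user-agent is the last quoted field):

```python
import re

# Hypothetical combined-log-format line; only the user-agent field
# at the end is taken from the thread above.
LOG_LINE = (
    '203.0.113.7 - - [01/May/2012:10:00:00 +0000] "GET / HTTP/1.1" 200 1234 '
    '"-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; '
    'http://www.seomoz.org/dp/rogerbot)"'
)

# The user-agent is the final quoted field on the line.
match = re.search(r'"([^"]*)"\s*$', LOG_LINE)
ua = match.group(1) if match else ""
print("rogerbot" in ua.lower())  # True
```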
Related Questions
-
What to do about removing pages for the 'offseason' (i.e. the same URL will be brought back in 6-7 months)?
I manage a site for an event that runs annually, and now that the event has concluded we would like to remove some of the pages (schedule, event info, TV schedule, etc.) that won't be relevant again until next year's event. That said, if we simply remove those pages from the web, I'm afraid that we'll lose out on valuable backlinks that already exist, and when those pages return they will have the same URLs as before. Is there a best course of action here? Should I redirect the removed pages to the homepage for the time being using a 302? Is there any risk there if the 'temporary' period is ~7 months? Thanks in advance.
Technical SEO | | KTY550 -
301 Redirect Help
How would you 301 redirect an entire folder to a specific file within the same domain? Scenario: www.domain.com/folder to www.domain.com/file.html Thanks for your input...
Technical SEO | | dhidalgo11 -
Ways of Helping Reducing Duplicate Content.
Hi, I am looking to know of any way to help reduce duplicate content on a website without breaking links and affecting Google rankings.
Technical SEO | | Feily0 -
What's the issue?
Hi, We have a client who dropped in the rankings (initially from the bottom of the first page to page 3, and now page 5) for a single keyword (their most important one - targeted on their homepage) back in the middle of March. So far, we've found that the issue isn't any of the following:
Keyword stuffing on the page
External anchor text pointing to the page
Internal anchor text pointing to the page
In addition to the above, the drop didn't coincide with Panda or Penguin. Any other ideas as to what could cause such a drop for a single keyword (other related rankings haven't moved)? We're starting to think that this may just have been another small change in the algorithm, but it seems like too big of a drop in a short space of time for that to be the case. Any thoughts would be much appreciated! Thanks.
Technical SEO | | jasarrow0 -
Why isn't my Site getting Re-Crawled!!!
The last crawl for my website was done long back! What is the use if you don't re-crawl? My website is www.caraccessoriesdelhi.in. I have corrected all the errors. When will you recrawl it? This is absolutely not done! The last crawl was done on 3rd May!
Technical SEO | | VarunBansal0 -
Does removing product listings help raise SERPs on other pages?
Does removing content ever make sense? We have out of stock products that are left on the site (in an out of stock section) specifically for SEO value, but I am not sure how to approach the problem from a bottom line conversion stand point. Do we leave out of stock products and hope that they turn into a conversion rate via cross selling, or do out of stock products lower the value of other pages by "stealing" link juice and pagerank from the rest of the site? (and effectively driving interest away) What is your perspective? Do you believe that any content that is related or semi-related to your main focus is beneficial, or does it only make sense to have strong content that has a higher rate of conversion and overall site engagement?
Technical SEO | | 13375auc30 -
Is SEOMoz only good for "ideas"?
Perhaps I've learned too much about the technical aspects of SEO, but nowhere have I found scientific studies backing up any claims made here, or a useful answer to a discussion I recently started. Maybe it doesn't exist. I do enjoy Whiteboard Fridays. They're fantastic for new ideas. This site is great. But I take it there are no proper studies conducted that examine SEO, rather just the usual spin of "belief from authority". No?
Technical SEO | | stevenheron0 -
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: You've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again - for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
Add the Meta=noindex,follow tag to each URL you want de-indexed
Use GWT to help remove the pages
Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you:
Add this line to robots.txt: DISALLOW */beerbottles/
Or add this line: DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question"
Thanks! Dave
Technical SEO | | goodnewscowboy0
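The syntax question in that last teaser can be partially sanity-checked with Python's standard-library parser. One caveat to flag: the stdlib parser follows the original robots.txt convention of plain prefix matching and does not understand the `*` wildcard form (a search-engine extension), so only the colon-prefixed form is demonstrated here, with the example.com URLs from the question:

```python
from urllib.robotparser import RobotFileParser

# Colon form with an explicit path prefix: blocks the /beerbottles/
# directory and everything under it for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /beers/brandofbeer/beerbottles/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

blocked = "http://www.example.com/beers/brandofbeer/beerbottles/1"
allowed = "http://www.example.com/beers/brandofbeer/"
print(rp.can_fetch("AnyBot", blocked))  # False
print(rp.can_fetch("AnyBot", allowed))  # True
```

This only covers the exact prefix case in the sample URLs; whether a wildcard like `*/beerbottles/` is honored depends on the crawler, so testing against Google's own tools would still be needed for the `*` variant.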