Rogerbot getting cheeky?
-
Hi SeoMoz,
From time to time my server crashes during Rogerbot's crawling escapades, even though I have a robots.txt file with a crawl-delay of 10, which I've now increased to 20.
I looked at the Apache log and noticed Roger hitting me from four different addresses: 216.244.72.3, 216.244.72.11, 216.244.72.12, and 216.176.191.201. Most of the time, while requests from each separate address were 10 seconds apart, all four addresses would hit four different pages simultaneously (example 2). At other times, it wasn't respecting robots.txt at all (see example 1 below).
I wouldn't call this situation 'respecting the crawl-delay entry in robots.txt', as other questions answered here by you have stated. Four simultaneous page requests within 1 second from Rogerbot is not what should be happening, IMHO.
example 1
216.244.72.12 - - [05/Sep/2012:15:54:27 +1000] "GET /store/product-info.php?mypage1.html HTTP/1.1" 200 77813
216.244.72.12 - - [05/Sep/2012:15:54:27 +1000] "GET /store/product-info.php?mypage2.html HTTP/1.1" 200 74058
216.244.72.12 - - [05/Sep/2012:15:54:28 +1000] "GET /store/product-info.php?mypage3.html HTTP/1.1" 200 69772
216.244.72.12 - - [05/Sep/2012:15:54:37 +1000] "GET /store/product-info.php?mypage4.html HTTP/1.1" 200 82441
example 2
216.244.72.12 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage1.html HTTP/1.1" 200 70209
216.244.72.11 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage2.html HTTP/1.1" 200 82384
216.244.72.12 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage3.html HTTP/1.1" 200 83683
216.244.72.3 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage4.html HTTP/1.1" 200 82431
216.244.72.3 - - [05/Sep/2012:15:46:16 +1000] "GET /store/mypage5.html HTTP/1.1" 200 82855
216.176.191.201 - - [05/Sep/2012:15:46:26 +1000] "GET /store/mypage6.html HTTP/1.1" 200 75659
Please advise.
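For anyone else wanting to verify crawler spacing, the per-IP gaps between requests can be measured from an Apache access log with a short script like this (a sketch, assuming the standard combined log format; the sample lines are abbreviated versions of those above):

```python
import re
from collections import defaultdict
from datetime import datetime

# Matches the leading client IP and the bracketed timestamp of an Apache log line.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')

def request_gaps(lines):
    """Return a dict of ip -> list of seconds between that IP's consecutive requests."""
    last_seen = {}
    gaps = defaultdict(list)
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, stamp = m.groups()
        t = datetime.strptime(stamp, '%d/%b/%Y:%H:%M:%S %z')
        if ip in last_seen:
            gaps[ip].append((t - last_seen[ip]).total_seconds())
        last_seen[ip] = t
    return dict(gaps)

log = [
    '216.244.72.12 - - [05/Sep/2012:15:54:27 +1000] "GET /a HTTP/1.1" 200 100',
    '216.244.72.12 - - [05/Sep/2012:15:54:28 +1000] "GET /b HTTP/1.1" 200 100',
    '216.244.72.12 - - [05/Sep/2012:15:54:37 +1000] "GET /c HTTP/1.1" 200 100',
]
print(request_gaps(log))  # gaps of 1.0s and 9.0s -- both under a 10s crawl-delay
```

Any gap shorter than the configured crawl-delay for a given IP is a violation from that address's point of view.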
-
Hi BM7,
I'm going to open up a ticket on this to have our engineers take a closer look at your site. Once we have an overall response, I'll post it here for other community members to view.
Cheers!
-
Thanks, Megan, for your reply.
Will give that a try, and I have blocked two of the addresses, so you are reduced to two crawler sessions. These two measures should reduce the load considerably, as long as Rogerbot respects the 7-second delay.
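For reference, blocking individual crawler addresses in Apache takes only a few lines in the vhost config or .htaccess (a sketch using Apache 2.2-era mod_authz_host syntax; which two of the four addresses to deny is a choice, shown here with the first two for illustration):

```apache
# Refuse requests from two of the crawler's source addresses
Order Allow,Deny
Allow from all
Deny from 216.244.72.3
Deny from 216.244.72.11
```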
IMHO, ignoring the Crawl-delay set by the webmaster of the site you are crawling, which crawlers are supposed to respect, is wrong. I got a nasty warning from Google WMT for being down 5 hours due to Rogerbot; it happened in the middle of the night, so the server only got restarted in the morning.
Also, my site has around 600 discrete pages, of which you crawl about 500, so even at the original 10-second crawl delay you could do my whole site in less than 1.5 hours, and that's only required once a week. So to my mind there is no need to overrule my settings in robots.txt 'so he (Roger) can complete the crawl'.
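The arithmetic above checks out: one request every 10 seconds across roughly 500 crawled pages stays under the 1.5-hour mark:

```python
pages = 500          # pages actually crawled
crawl_delay_s = 10   # original robots.txt Crawl-delay

total_hours = pages * crawl_delay_s / 3600
print(f"{total_hours:.2f} hours")  # 1.39 hours -- under 1.5
```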
Regards,
-
Hi there,
This is Megan from the SEOmoz Help Team. I'm so sorry Rogerbot is causing you grief! This might actually be happening because your crawl delay is too long, so Rogerbot just ends up ignoring it so he can complete the crawl. If you set your crawl delay to a maximum of 7, it should solve your problem. If you're still running into issues, though, please send us a message at help@seomoz.org and we'll check it out asap!
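In robots.txt terms, that suggestion looks like this (a sketch; "rogerbot" is the user-agent token Rogerbot identifies itself with):

```
User-agent: rogerbot
Crawl-delay: 7
```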
Cheers!