Slowing down SEOmoz Crawl Rate
-
Is there a way to slow down the SEOmoz crawl rate? My site is pretty huge and I'm getting 10k pages crawled every week, which is great. However, I sometimes get multiple page requests in one second, which slows down my site a bit.
If this feature exists, I couldn't find it; if it doesn't, it would be a great one to have, similar to how Googlebot does it.
Thanks.
-
Thank you for the reply Megan, just what I was looking for.
-
Hi corwin,
This is Megan from the SEOmoz Help Team. I'm sorry if roger is being a bit too aggressive. We built our crawler to obey the robots.txt Crawl-delay directive. If you ever need to slow him down (though he's usually well-behaved), just add a Crawl-delay directive to your robots.txt file like this:
User-agent: rogerbot
Crawl-delay: 1
Here's a good article that explains more about this technique: http://tools.seobook.com/robots-txt/
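One quick way to sanity-check such a directive before deploying it is Python's standard-library `urllib.robotparser`, which parses robots.txt rules and reports the crawl delay that applies to a given user agent. This is a minimal sketch; rogerbot's own parser may of course behave differently:

```python
from urllib import robotparser

# A hypothetical robots.txt body containing the directive suggested above.
ROBOTS_TXT = """\
User-agent: rogerbot
Crawl-delay: 1
"""

rp = robotparser.RobotFileParser()
rp.modified()  # mark the rules as "read" so the parser will answer queries
rp.parse(ROBOTS_TXT.splitlines())

# crawl_delay() returns the delay in seconds for a matching user agent,
# or None when no rule applies to that agent.
print(rp.crawl_delay("rogerbot"))   # 1
print(rp.crawl_delay("googlebot"))  # None
```

Note that with no `User-agent: *` fallback, other crawlers get no delay rule at all, which matches the intent here of throttling only rogerbot.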
I hope this helps!
Cheers
Related Questions
-
Seomoz crawling filtered pages
Hi, I just checked an SEO campaign we started last week, so I opened SEOmoz to see the crawl diagnostics. Lots of duplicate content & duplicate titles are showing up, but that's because Rogerbot is crawling all of the filtered pages as well. How do I exclude these pages from being crawled?
/product/brand-x/3969?order=brand&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=DESC&page=10
/product/brand-x/3969?order=popular&sortorder=DESC&page=110
Moz Pro | nvs.nim
-
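For parameterized listing URLs like the ones in the question above, one common approach is a wildcard Disallow rule in robots.txt. This is only a sketch: it assumes rogerbot honors `*` wildcards in Disallow paths (major crawlers such as Googlebot do; check Moz's documentation before relying on it), and the `order=` parameter name is taken from the example URLs:

```
User-agent: rogerbot
# Block any URL whose query string begins with the sort/filter parameter
Disallow: /*?order=
```

Alternatively, a rel=canonical tag pointing at the unfiltered listing keeps the pages crawlable while consolidating the duplicate-content signals.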
Is there a basic overview of the SEOmoz Pro tools?
I'm surprised there is no simple overview of all the features anywhere, or a guide on how to use the tools. There's a nice intro video and a good beginner's guide, then BAM! Straight into the Pro tools (it's a bit daunting for someone learning!). Or at least none that I can find... any help or pointers would be great 🙂
Moz Pro | seanuk
-
Crawl slow again
Once again the weekly crawl of my site is very slow. I have around 441 pages in the crawl and this has been running for over 12 hours. This last happened two weeks ago (it ran for over 48 hours). Last week's crawl was much quicker (not sure exactly how long, but guessing an hour or so). Is this a known issue, and is there anything that can be done to unblock it? Weekends are the best time for me to assess and respond to changes I have made to my site, so having this (small) crawl take most of the weekend is really quite problematic. Thanks. Mark
Moz Pro | MarkWill
-
Can I exclude pages from my Crawl Diagnostics?
Right now my crawl diagnostics information is being skewed because it includes the on-site search from my website. Is there a way to remove certain pages, like search, from the errors and warnings of the crawl diagnostics? My search pages are coming up as: Long URL, Title Element Too Long, Missing Meta Description, Blocked by meta-robots (which is how I want it), Rel Canonical. Here is what the crawl diagnostic thinks my page URL looks like: website.com/search/gutter%2525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252Bcleaning/ Thank you, Jonathan
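The runaway `%2525…` string in that URL is a classic case of repeated percent-encoding: on each pass, the `%` produced by the previous pass is re-encoded as `%25`, so the URL grows without bound. A small illustration of the mechanism in Python, using a hypothetical search term:

```python
from urllib.parse import quote

value = "gutter cleaning"  # hypothetical search term behind the URL above
for _ in range(3):
    value = quote(value)  # each pass re-encodes the previous result's '%'
    print(value)
# gutter%20cleaning
# gutter%2520cleaning
# gutter%252520cleaning
```

The usual fix on the site side is to emit search links encoded exactly once, and to canonicalize or block the /search/ URLs (the question notes they are already blocked by meta-robots) so the crawler stops accumulating re-encoded variants.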
Moz Pro | JonathanGoodman
-
A suggestion to help with linkscape crawling and data processing
Since you guys are understandably struggling with crawling and processing the sheer number of URLs and links, I came up with this idea: in a similar way to how SETI@home works (is that still a thing? Google says yes: http://setiathome.ssl.berkeley.edu/), could SEOmoz use distributed computing amongst SEOmoz users to help with the data processing? Would people be happy to offer up their idle processor time and (optionally) internet connections to get more accurate, broader data? Are there enough users of the data to make distributed computing worthwhile? Perhaps those who crunched the most data each month could receive Moz points or a free month of Pro. I have submitted this as a suggestion here:
http://seomoz.zendesk.com/entries/20458998-crowd-source-linkscape-data-processing-and-crawling-in-a-similar-way-to-seti-home
Moz Pro | seanmccauley
-
Campaign Not Crawling
I set up my first 5 campaigns and one is not crawling beyond one page. It's been over 48 hours. This site has nearly 3,500 pages, the others far fewer; however, that shouldn't make any difference. I searched for the problem and couldn't find it, so I hope this question isn't redundant. Comments and advice would be appreciated.
Moz Pro | JavaManOne
-
2nd Crawl taking too long?
Hi, I've added a campaign to my account, with the first crawl taking around a week. The 2nd crawl started 3 days 17 hours ago and is still running. Is this something that others have experienced? The campaign is tracking 5 keywords and the site has 17 pages. Steve
Moz Pro | stevecounsell