Should we use Google's crawl delay setting?
-
We’ve been noticing a huge uptick in Google’s spidering lately, and along with it a notable worsening of render times.
Yesterday, for example, Google spidered our site at a ratio of 30:1 (Googlebot requests vs. organic traffic). In other words, for every organic page request, Googlebot hit the site 30 times.
Our render times have lengthened to an average of 2 seconds (and as high as 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing average render times closer to one second, and often half that.
A year ago, the spider-to-organic ratio was between 6:1 and 10:1.
Is requesting a crawl-delay from Googlebot a viable option?
Our only goal is to reduce Googlebot traffic and, hopefully, improve render times and organic traffic.
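(For reference, a ratio like ours can be estimated straight from the access log. Below is a minimal sketch assuming a combined-format log; the file path and the user-agent check are placeholders to adapt to your own stack.)

```python
import re

# Rough sketch: estimate a spider-to-organic ratio from an access log in
# combined format. The log path and the user-agent test are assumptions.
LOG_PATH = "access.log"  # placeholder path

googlebot_hits = 0
other_hits = 0

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        # In combined log format, the user agent is the last quoted field.
        match = re.search(r'"([^"]*)"\s*$', line)
        user_agent = match.group(1) if match else ""
        if "Googlebot" in user_agent:
            googlebot_hits += 1
        else:
            other_hits += 1  # crude: counts all non-Googlebot requests as organic

if other_hits:
    print(f"Googlebot-to-organic ratio: {googlebot_hits / other_hits:.1f}:1")
```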
Thanks,
Trisha
-
Unfortunately, you can't change Google's crawl rate in a robots.txt file; Googlebot simply ignores the Crawl-delay directive. The best way to rate-limit it is the custom crawl rate setting in Google Webmaster Tools (look under Site configuration > Settings).
You also might want to consider using your load balancer to direct Google (and other search engines) to a quarantined group of servers (app, DB, cache, search), thereby ensuring your users aren't inadvertently hit by performance issues caused by overzealous bot crawling.
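(A minimal sketch of that idea, assuming an nginx load balancer; the upstream names and addresses are made up. In production you'd also want to verify Googlebot via reverse DNS rather than trusting the user agent alone.)

```nginx
# Hypothetical: route known crawlers to their own backend pool so bot
# crawling can't degrade response times for real users.
map $http_user_agent $backend_pool {
    default          app_servers;
    ~*googlebot      bot_servers;
    ~*bingbot        bot_servers;
}

upstream app_servers {
    server 10.0.0.10;
    server 10.0.0.11;
}

upstream bot_servers {
    server 10.0.0.20;
}

server {
    listen 80;
    location / {
        # $backend_pool resolves to one of the upstream groups above
        proxy_pass http://$backend_pool;
    }
}
```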
-
We're a publisher, which means that as an industry our normal render times are always at the top of the chart. Ads are notoriously slow to load, and that's how we earn our keep. These results are bad, though, even for publishing.
We're serving millions of uniques a month on a bank of dedicated servers hosted off-site, load balanced, etc.
-
More info on that here: http://www.robotstxt.org/
-
Wow! Those are really high render times. Have you considered moving to another web server? Nginx is pretty damn fast and could probably get those render times down. Also, are you on a shared host, or is this a dedicated server?
What you're looking for is the robots.txt file, though, and you'll want to add some lines like this:
```
User-agent: *
Disallow:
Crawl-Delay: 10

User-agent: ia_archiver
Disallow: /

User-agent: Ask Jeeves
Crawl-Delay: 120

User-agent: Teoma
Disallow: /html/
Crawl-Delay: 120
```
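(One caveat, echoing the answer above: Googlebot ignores Crawl-Delay, so this mainly helps with other crawlers. If you want to sanity-check how a standards-following parser reads your rules, Python's standard-library urllib.robotparser can do it; the URL below is a placeholder, and crawl_delay() needs Python 3.6+.)

```python
from urllib.robotparser import RobotFileParser

# Sketch: check how a standards-following parser interprets these rules.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the live file

for agent in ("*", "ia_archiver", "Teoma"):
    allowed = parser.can_fetch(agent, "https://www.example.com/html/page")
    print(f"{agent}: can fetch /html/page={allowed}, "
          f"crawl delay={parser.crawl_delay(agent)}")
```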