Should I set a max crawl rate in Webmaster Tools?
-
We have a website with around 5,000 pages, and for the past few months we've had our crawl rate set to maximum (we'd just started paying for a top-of-the-range dedicated server at the time, so performance wasn't an issue).
Google Webmaster Tools alerted me this morning that the crawl rate setting has expired, so I'll have to set the rate manually again. In terms of SEO, is having a max rate a good thing?
I found this post on Moz, but it dates from 2008. Any thoughts on this?
-
At first I assumed that by manually setting the crawl rate to the maximum, Google would crawl my website faster and more frequently. Our website has tens of thousands of pages, so I didn't want Google missing any of them or taking a long time to index new content. We have new products added to the website daily, and others come off or change.
I'll let Google decide
-
Yep, they're a little vague here! But the answer is: Google will crawl your site at whatever rate it wants (it's probably crawling Amazon 24/7) unless you limit it in Google Webmaster Tools. In that case, Google will still crawl your site at whatever rate it wants, unless that rate is higher than the limit you set, in which case it will throttle itself to your limit.
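In other words, the effective crawl rate is simply the smaller of Google's own preferred rate and your configured cap. A minimal sketch of that logic (the function name and rate figures are made up for illustration):

```python
def effective_crawl_rate(googles_preferred_rate, configured_limit=None):
    """Return the rate (requests/sec) Googlebot will actually use.

    Google picks its own rate; a configured limit only matters
    when it is *lower* than what Google wanted to do anyway.
    """
    if configured_limit is None:
        return googles_preferred_rate  # "Let Google optimize" setting
    return min(googles_preferred_rate, configured_limit)

# Google wants 0.5 req/s but you cap at 0.2 -> throttled to 0.2
print(effective_crawl_rate(0.5, 0.2))   # 0.2
# Google only wants 0.05 req/s -> your 0.2 cap changes nothing
print(effective_crawl_rate(0.05, 0.2))  # 0.05
```

So setting the limit to its highest value never makes Google crawl *faster* than it otherwise would; it only stops the cap from getting in the way.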
If you're anxious for Google to crawl your site more because a) you have something that's changed and you want Google to have it in their index, or b) because you're hoping it'll affect your rankings:
a) If there's specific information you want Google to update its index with, submit the URL of the new or changed page in "Fetch as Googlebot" and then, once you fetch it, hit the "Submit to index" button to the right. I work on a site with a DA of 58, and fetching something as Googlebot updates the index within an hour.
b) How much Google crawls your site has to do with how important Google thinks your site is; forcing Google to crawl your site more often will not make it think your site is more important.
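"Fetch as Googlebot" in (a) is a manual UI step. For products that change daily, a programmatic alternative is to re-ping Google whenever your sitemap is regenerated. A sketch, assuming a hypothetical sitemap URL; note that Google's sitemap ping endpoint, long supported, has since been deprecated, so treat this as illustrative:

```python
import urllib.parse
import urllib.request

SITEMAP_PING_ENDPOINT = "https://www.google.com/ping?sitemap="

def build_ping_url(sitemap_url):
    """URL-encode the sitemap location into Google's ping endpoint."""
    return SITEMAP_PING_ENDPOINT + urllib.parse.quote(sitemap_url, safe="")

def ping_google(sitemap_url):
    """Fire the ping; returns the HTTP status code (200 on success)."""
    with urllib.request.urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status

print(build_ping_url("https://www.example.com/sitemap.xml"))
```

You'd call `ping_google(...)` from whatever job rebuilds the sitemap after products are added or removed.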
Hope this helps!
Kristina
-
Is selecting "Limit Google's maximum crawl rate" and manually moving the slider to the highest setting (0.2 requests per second, i.e. 5 seconds between requests) a higher rate than selecting "Let Google optimize for my site (recommended)"? Google doesn't really expand on this! I want them to crawl at the very maximum, but they don't tell us how many requests per second (or seconds between requests) the optimized option uses.
-
You don't need to. Just let Google crawl at will. The only reason to limit the crawl rate is if your server is having performance issues (too much traffic at once). If you're not having any issues, allow Google to crawl as many pages as it can.