Should I set a max crawl rate in Webmaster Tools?
-
We have a website with around 5,000 pages, and for the past few months we've had our crawl rate set to maximum (we'd just started paying for a top-of-the-range dedicated server at the time, so performance wasn't an issue).
Google Webmaster Tools alerted me this morning that my custom crawl rate had expired, so I'd have to set the rate again manually. In terms of SEO, is having a max rate a good thing?
I found this post on Moz, but it's from 2008. Any thoughts on this?
-
At first I assumed that by manually setting the crawl rate to the maximum, I'd get Google to crawl my website faster and more frequently. Our website has tens of thousands of pages, so I didn't want Google missing any of it or taking a long time to index new content. We have new products added to the website daily, and others come off or change.
I'll let Google decide
-
Yep, they're a little vague here! But the answer is: Google will crawl your site at whatever rate it wants (it's probably crawling Amazon 24/7), unless you limit how much it can crawl in Google Webmaster Tools. In that case, Google will still crawl your site at whatever rate it wants, unless that rate is higher than the limit you set, in which case it will throttle itself to your limit.
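If you're curious what rate Google is actually choosing for your site, your server access log already tells you. A minimal sketch, assuming a standard combined-format Apache/Nginx log (the sample lines and the user-agent match are illustrative; adjust them to your setup):

```python
# Count Googlebot requests per hour from combined-format access log lines.
import re
from collections import Counter

def googlebot_hits_per_hour(log_lines):
    """Return a Counter mapping 'dd/Mon/yyyy:hh' -> Googlebot request count."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        # Grab the date up to the hour from the bracketed timestamp.
        m = re.search(r"\[(\d{2}/\w{3}/\d{4}:\d{2})", line)
        if m:
            hits[m.group(1)] += 1
    return hits

sample = [
    '66.249.66.1 - - [10/Jul/2015:06:12:01 +0000] "GET /product/123 HTTP/1.1" 200 5124 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jul/2015:06:12:06 +0000] "GET /product/124 HTTP/1.1" 200 4983 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Jul/2015:06:12:07 +0000] "GET / HTTP/1.1" 200 1270 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_hour(sample))  # Counter({'10/Jul/2015:06': 2})
```

If those hourly counts are nowhere near your cap, the cap isn't doing anything either way.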
If you're anxious for Google to crawl your site more because a) you have something that's changed and you want Google to have it in their index, or b) because you're hoping it'll affect your rankings:
a) If there's specific information that you want Google to update its index with, submit the URL of the page that's new or changed into "Fetch as Googlebot" and then, once you fetch it, hit the "Submit to index" button to the right. I work on a site that's a DA 58 and fetching something as Googlebot updates the index within an hour.
b) How much Google crawls your site has to do with how important your site is; forcing Google to crawl your site more will not make it think your site is more important.
Hope this helps!
Kristina
-
Is selecting "Limit Google's maximum crawl rate" and then manually moving the rate to the highest setting (0.2 requests per second / 5 seconds between requests) a higher rate than selecting "Let Google optimize for my site (recommended)"? Google doesn't really expand on this! I want them to crawl at the very maximum, but they don't tell us how many requests per second (or seconds between requests) the optimized option involves.
-
You don't need to. Just let Google crawl at will. The only reason you would want to limit the crawl rate is if you're having performance issues from the server you're on (too much traffic at once). If you're not having any issues, then allow Google to crawl as many pages as they can.
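For perspective, even the capped "maximum" setting allows far more fetches per day than a 5,000-page site needs, so the cap is unlikely to be what's holding Google back. A quick back-of-the-envelope calculation:

```python
# Is 0.2 requests/second (one request every 5 seconds) restrictive
# for a site of ~5,000 pages?
seconds_per_day = 86400
seconds_between_requests = 5           # i.e. 0.2 requests/second
fetches_per_day = seconds_per_day // seconds_between_requests
pages = 5000

print(fetches_per_day)                 # 17280
print(fetches_per_day / pages)         # 3.456 -> over three full crawls per day
```

In practice Google will rarely crawl that fast anyway; how much of that budget it uses is up to Google, not the setting.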
Related Questions
-
Setting up the right geo-targeting/language-targeting settings without breaking SEO
Hello, great Moz community! Gev here from BetConstruct, a leading gaming and betting software provider. Our company website is performing great on the SERPs. We have 20+ dedicated pages for our 20+ software products, an events section, different landing pages for different purposes, a blog section, a press section, and more. Our website's default language is EN. Four months ago we opened the /ru and /es versions of the website. I set the correct hreflang tags, redirects, etc., and generated correct sitemaps, so the translated versions started to rank normally. Now our marketing team is requesting different changes to the website, and I would love to discuss them with you before implementing. For example: they have created a landing page at betconstruct.com/usa-home and want me to serve that page as the default homepage when a visitor comes from a US-based IP. This can be done in two ways: 1) I can set the /usa-home page as the default in my CMS when the visitor is from the US, so the address stays betconstruct.com (without /usa-home). In this case the same URL (betconstruct.com) serves different content, but only for the homepage. 2) I can check the visitor's IP and, if they're from the US, redirect them to betconstruct.com/usa-home. The user can then click the logo and go back to the original homepage at betconstruct.com. Both cases seem risky: in the first, I'm not sure what Google will think when it sees a different homepage from different IPs; in the second, I'm not sure which redirect to use (301, 302, 303, etc.), because Google might conclude I don't have a homepage and that it redirects to a secondary page like /usa-home. After digging a lot, I realized my team is requesting a strange setup:
they want both language targeting (/es, /ru) and country targeting (which should ideally be /us), but instead of creating /us, they want it to replace /en, only for the USA. Please let me know the best way to implement this. Should we create a separate version of our website for the USA under /us/* URLs? In that case, is it okay to have /en as a language version and /us for country targeting? What hreflangs should we use? I know this is a rare case and it may be difficult to follow, but any help will be much appreciated! Thank you! Best,
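One consistent way to express the setup described above is a full hreflang set with both language and language-country codes plus an x-default fallback. A minimal sketch; the URL paths are assumptions to adapt to your CMS, and note that every page variant must carry the complete set (including a link back to itself) or Search Console will report "no return tags":

```python
# Generate the <link rel="alternate" hreflang="..."> set for one page,
# covering language versions (en/es/ru), a country-targeted variant
# (en-us), and an x-default fallback. URLs are hypothetical.
BASE = "https://betconstruct.com"

variants = {
    "en": f"{BASE}/en/",
    "es": f"{BASE}/es/",
    "ru": f"{BASE}/ru/",
    "en-us": f"{BASE}/usa-home",   # country-targeted variant for US visitors
    "x-default": f"{BASE}/en/",    # fallback for everyone else
}

def hreflang_tags(variants):
    """Render the <link> elements that every variant page should carry."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants.items()
    )

print(hreflang_tags(variants))
```

With annotations like these in place, serving /usa-home to US visitors becomes explicit to Google rather than IP-based cloaking of a single URL.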
Intermediate & Advanced SEO | betconstruct
Gev
Lazy Loading of Blog Posts and Crawl Depths
Hi Moz fans, We are looking at our blog and improving the content as much as we can for SEO purposes, but we've hit a bit of a blank in terms of lazy-loading implications and issues with crawl depth. We introduced lazy loading on the blog home page to increase site speed, and it works well with infinite scroll, but we were wondering whether this would cause any SEO issues. A lot of the resources online seem to conflict and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depth for blogs would be fantastic! I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.
Intermediate & Advanced SEO | Victoria_
Decreased organic traffic but increased Webmaster Tool Queries
We have a client who has had a significant decrease in organic traffic this last month (about 20%) but in Webmaster tools it tells me there was an increase in impressions and clicks. How can these both be accurate?
Intermediate & Advanced SEO | jfeitlinger
Hreflang Tags with Errors in Google Webmaster Tools
Hello, Google Webmaster Tools is giving me errors with hreflang tags that I can't seem to figure out... I've double-checked everything: all the alternate and canonical tags. Everything seems to match, yet Google finds errors. Can anyone help? International Targeting | Language > 'fr' - no return tags
Intermediate & Advanced SEO | GlobeCar
URLs for your site and alternate URLs in 'fr' that do not have return tags.
Status: 7/10/15
24 Hreflang Tags with Errors. Please see attached pictures for more info... Thanks, Karim
Should I set up no index no follow on low quality pages?
I know it's a good idea for duplicate pages, blog tags, etc., but I remember reading somewhere that you can help the overall link juice of a website by adding noindex,nofollow or noindex,follow to low-quality content pages. Is it still a good idea to do this, or was it never a good idea to begin with? Michael
Intermediate & Advanced SEO | Michael_Rock
Best way to set up anchor text on parked pages?
Our company is no longer offering a series of products, much to the disappointment of our SEO team since we've spent a long time building up the pages and getting them ranked organically. The pages all have decent page rank and in some cases rank #1 for the primary keyword. We have a sister company that we acquired a year ago and they still offer these products on their website. They are a completely separate company with their own website which existed long before we acquired them and we have nothing to do with their website. Our team has proposed that rather than take down the URLs on our site for the products we no longer offer, to put a message saying something like "sorry we don't offer this anymore but you may be interested in this.." and then link to our sister company with anchor text so that they can get some benefit from our SEO efforts if we can't. The question/issue is how should we do that since there will be a lot of pages from the same domain, about 20 pages, all linking to a few pages on a different domain. Should the anchor text be varied unbranded or branded? On the one hand I think if we change up the anchor text used to link to another page many times from a single domain that looks strange and transparent to google. On the other hand unbranded text would be the better descriptor for users since we are deep linking to the product not the homepage of the other site.
Intermediate & Advanced SEO | edu-SEO
Showing Duplicate Content in Webmaster Tools.
About 6 weeks ago we completely redid our entire site. The developer put in 302 redirects. We were showing thousands of duplicate meta descriptions and titles. I had the redirects changed to 301. For a few weeks the duplicates slowly went down and now they are right back to where they started. Isn't the point of 301 redirects to show Google that content has permanently been moved? Why is it not picking this up? I knew it would take some time but I am right where I started after a month.
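One way to sanity-check that the switch from 302 to 301 actually took effect everywhere is to scan the responses your old URLs are returning. A minimal sketch over illustrative log lines (the paths and log format are hypothetical; point it at your own access log or a crawler export):

```python
# Flag old URLs that still answer with a temporary (302) redirect
# instead of the intended permanent (301) redirect.
import re

def find_temporary_redirects(log_lines):
    """Return the set of request paths that returned 302 rather than 301."""
    bad = set()
    for line in log_lines:
        # Match the request path and a 301/302 status code.
        m = re.search(r'"GET (\S+) HTTP/[\d.]+" (30[12])', line)
        if m and m.group(2) == "302":
            bad.add(m.group(1))
    return bad

sample = [
    '"GET /old-category/widget HTTP/1.1" 302 0',
    '"GET /old-category/gadget HTTP/1.1" 301 0',
]
print(find_temporary_redirects(sample))  # {'/old-category/widget'}
```

Any path this turns up is one Google still treats as temporarily moved, which would explain duplicate titles and descriptions lingering in Webmaster Tools.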
Intermediate & Advanced SEO | EcommerceSite
SEO and Pictures tool
Hello, I need to share picture albums. I would like to know if any of you have an opinion on the best tools available for sharing pictures on the web. By 'the best tool' I mean from an SEO perspective. So, based on your experience, are there tools that give my pictures a better chance of getting indexed? Thanks!! Note: CNET has created a great article that presents the major players.
Intermediate & Advanced SEO | EnigmaSolution