Why might my website's crawl rate... explode?
-
Hi Mozzers,
I have a website with approx 110,000 pages. According to search console, Google will usually crawl, on average, anywhere between 500 - 1500 pages per day. However, lately the crawl rate seems to have increased rather drastically:
9/5/16 - 923
9/6/16 - 946
9/7/16 - 848
9/8/16 - 11072
9/9/16 - 50923
9/10/16 - 60389
9/11/16 - 17170
9/12/16 - 79809
I was wondering if anyone could offer any insight into why this may be happening and whether I should be concerned?
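For what it's worth, the jump in the numbers above is easy to quantify. Here is a minimal Python sketch that flags days well above the earlier baseline; the three-day baseline window and 3x threshold are arbitrary illustrative choices, not anything Google documents:

```python
# Crawl counts copied from the question (Python 3.7+ dicts keep insertion order).
crawl_stats = {
    "9/5/16": 923,
    "9/6/16": 946,
    "9/7/16": 848,
    "9/8/16": 11072,
    "9/9/16": 50923,
    "9/10/16": 60389,
    "9/11/16": 17170,
    "9/12/16": 79809,
}

def flag_spikes(stats, baseline_days=3, threshold=3.0):
    """Return dates whose crawl count exceeds `threshold` times the
    average of the first `baseline_days` days."""
    counts = list(stats.values())
    baseline = sum(counts[:baseline_days]) / baseline_days
    return [date for date, n in stats.items() if n > threshold * baseline]

print(flag_spikes(crawl_stats))
# ['9/8/16', '9/9/16', '9/10/16', '9/11/16', '9/12/16']
```

Against a baseline of roughly 900 pages/day, every day from 9/8 onward is an order-of-magnitude outlier, which matches the "explosion" described.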
Thanks in advance for any advice.
-
Thank you Thomas.
-
Just to add to this, there is nothing inherently wrong with Google crawling more pages of your site. The only time I would modify the crawl rate is when the extra crawling is actually slowing your server down.
-
Hi there,
The crawl rate control was provided by Google to give site owners control, so that they can limit the server load created by constant crawling of the website.
So, it's up to you to decide whether you want to lower/limit it.
https://support.google.com/webmasters/answer/48620?hl=en
Thanks,
Vijay
-
Thank you Vijay, your response is very helpful. Do you know if there are any guidelines for optimal crawl rates? I tend to look at the average pages crawled per day and multiply by 90. If that number is equal to or greater than the number of pages on the site, then we'd be good, right? Or is there a flaw in that logic?
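The heuristic described above can be sketched in a few lines. One caveat worth noting: crawl counts include re-crawls of the same URLs, so this tends to overestimate unique-page coverage. The function name and the 90-day window are just this thread's convention, not an official guideline:

```python
def crawl_coverage_ok(avg_pages_per_day, total_pages, window_days=90):
    """Rough heuristic from the thread: coverage looks fine if the pages
    crawled over a 90-day window meet or exceed the site's page count.
    Note: crawl counts include repeat visits to the same URLs, so this
    can overestimate how many unique pages were actually crawled."""
    return avg_pages_per_day * window_days >= total_pages

# With the site's usual rate of ~1,000 pages/day and 110,000 pages:
print(crawl_coverage_ok(1000, 110_000))  # 90,000 < 110,000, so False
```

By this measure, the pre-spike crawl rate of 500 to 1,500 pages/day sits right around the borderline for a 110,000-page site.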
-
Hi Thomas,
Thank you for responding. Yes, kind of. There are 40 main categories and each of those has upto 100 links to sub categories, and then the same again for sub-sub categories.
I've spent the last year cleaning it up and removing pages that didn't need to be there (quite a lot of pages!) in order to help Google find and index the important ones.
I will run it through Screaming Frog now, just to be sure!
-
Hi there,
The following can be reasons for your crawl rate increase:
- You have updated the content of the website recently, or you do so regularly.
- You or someone on your end submitted the sitemap.xml to Google again, or is doing so over and over.
- Your robots.txt was changed to give access to previously blocked pages.
- You or someone used ping services to let search engines know about your website. There are many manual ping services, like Pingomatic, and in WordPress you can manually add more ping services to notify many search engine bots. You can find such a list in the WordPress ping list post (http://www.shoutmeloud.com/wordpress-ping-list.html).
- You can also monitor and optimize the Google crawl rate using Google Webmaster Tools. Just go to the crawl stats there and analyze them. You can manually set your Google crawl rate faster or slower, though I would suggest using this with caution, and only when you are actually facing issues with bots not crawling your site effectively. You can read more about changing the Google crawl rate here: https://support.google.com/webmasters/?hl=en&answer=48620#topic=3309469
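One way to double-check the reasons above is to verify the crawl increase independently of Search Console by counting Googlebot requests per day in your server access logs. A minimal sketch, assuming combined/common log format; the sample lines and user-agent check are illustrative only, and a thorough check would also reverse-DNS the IPs, since anyone can spoof the Googlebot user agent:

```python
import re
from collections import Counter

# Matches the date portion of a common-log-format timestamp, e.g. [12/Sep/2016:10:15:32 ...]
LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):')

def googlebot_hits_per_day(lines):
    """Count log lines per day whose user agent claims to be Googlebot."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_DATE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits

# Hypothetical sample log lines for illustration:
sample = [
    '66.249.66.1 - - [12/Sep/2016:10:15:32 +0000] "GET /page1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [12/Sep/2016:10:15:40 +0000] "GET /page2 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [12/Sep/2016:10:16:01 +0000] "GET /page1 HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample))
```

If the per-day counts from your own logs track the Search Console graph, the spike is real crawling rather than a reporting glitch.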
I hope this helps. If you have further queries, feel free to respond.
Regards,
Vijay
-
I wouldn't be concerned at all. Have you got one section that expands into a load of other links? It could be that Google hasn't crawled properly for a while, then finds a section it hasn't seen before and just goes mad.
Alternatively, have you crawled the site with Screaming Frog or a similar tool, in case there's an issue you weren't aware of?