Why might my website's crawl rate... explode?
-
Hi Mozzers,
I have a website with approx 110,000 pages. According to search console, Google will usually crawl, on average, anywhere between 500 - 1500 pages per day. However, lately the crawl rate seems to have increased rather drastically:
9/5/16 - 923
9/6/16 - 946
9/7/16 - 848
9/8/16 - 11072
9/9/16 - 50923
9/10/16 - 60389
9/11/16 - 17170
9/12/16 - 79809

I was wondering if anyone could offer any insight into why this may be happening and if I should be concerned?
Thanks in advance for any advice.
-
Thank you Thomas.
-
Just to add to this, there is nothing inherently wrong with Google crawling more pages of your site. The only time I would modify the crawl rate is when the extra crawling is actually slowing your server down.
-
Hi There,
The crawl rate control was devised by Google to give users control, so that they can limit the server load created by constant crawling of the website.
So, it's up to you to decide whether you want to lower/limit it.
https://support.google.com/webmasters/answer/48620?hl=en
Thanks,
Vijay
-
Thank you Vijay, your response is very helpful. Do you know if there are any guidelines for optimal crawl rates? I tend to look at average pages crawled per day and multiply by 90. If that number is equal to or greater than the number of pages on the site, then we'd be good, right? Or is there a flaw in that logic?
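For what it's worth, that heuristic is easy to sanity-check in a few lines. A minimal sketch (function name and figures are my own, chosen for illustration):

```python
# Rough crawl-coverage check: if Google's average daily crawl, sustained
# over ~90 days, meets or exceeds the site's page count, every page has
# a chance of being visited within the window. Figures are illustrative.

def full_crawl_coverage(avg_pages_per_day: float, total_pages: int,
                        window_days: int = 90) -> bool:
    """Return True if the crawl window could cover the whole site once."""
    return avg_pages_per_day * window_days >= total_pages

# Example: ~1,000 pages/day against a 110,000-page site
print(full_crawl_coverage(1000, 110_000))  # 90,000 < 110,000 -> False
```

The obvious flaw is the one hiding in the assumption: Google doesn't crawl each page exactly once — popular or frequently updated pages get recrawled far more often — so treat the result as an upper bound on coverage, not a guarantee.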
-
Hi Thomas,
Thank you for responding. Yes, kind of. There are 40 main categories, each of which has up to 100 links to subcategories, and then the same again for sub-subcategories.
I've spent the last year cleaning it up and removing pages that didn't need to be there (quite a lot of pages!) in order to help Google find and index the important ones.
I will run it through Screaming Frog now, just to be sure!
-
Hi there,
The following can be reasons for your crawl rate increase:
- You have updated the content of the website recently, or are doing so regularly.
- You or someone on your end resubmitted the sitemap.xml to Google, or is doing so repeatedly.
- Your robots.txt was changed to give access to previously blocked pages.
- You or someone else used ping services to let search engines know about your website. There are many manual ping services like Ping-O-Matic, and in WordPress you can manually add more ping services to notify many search engine bots. You can find such a list in the WordPress ping list post (http://www.shoutmeloud.com/wordpress-ping-list.html).
- You can also monitor and optimize the Google crawl rate using Google Webmaster Tools. Just go to the crawl stats there and analyze. You can manually set your Google crawl rate to be faster or slower, though I would suggest using this with caution, and only when bots are genuinely failing to crawl your site effectively. You can read more about changing the Google crawl rate here: https://support.google.com/webmasters/?hl=en&answer=48620#topic=3309469
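If you want numbers independent of the Search Console graph, your own server access logs tell the same story. A minimal sketch that tallies Googlebot requests per day from a common/combined-format log (the log format and sample lines are assumptions — adjust for your server):

```python
import re
from collections import Counter

# Matches the date portion of a common/combined log timestamp, e.g. [08/Sep/2016:...
LOG_LINE = re.compile(r'\[(?P<day>\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(lines):
    """Tally requests whose user agent mentions Googlebot, keyed by date."""
    hits = Counter()
    for line in lines:
        if "Googlebot" in line:
            m = LOG_LINE.search(line)
            if m:
                hits[m.group("day")] += 1
    return hits

# Illustrative log lines, not real traffic
sample = [
    '66.249.66.1 - - [08/Sep/2016:10:00:00 +0000] "GET /a HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [08/Sep/2016:10:00:05 +0000] "GET /b HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '1.2.3.4 - - [08/Sep/2016:10:01:00 +0000] "GET /c HTTP/1.1" 200 1234 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample))  # Counter({'08/Sep/2016': 2})
```

Comparing this per-day tally against the Search Console crawl stats will also reveal whether the spike is really Googlebot or a bot spoofing the user agent.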
I hope this helps. If you have further queries, feel free to respond.
Regards,
Vijay
-
I wouldn't be concerned at all. Have you got one section that expands into a load of other links? It could be that Google hasn't crawled properly for a while, then finds a section it hasn't seen before and just goes mad.
Alternatively, have you crawled the site with Screaming Frog or a similar tool, in case there's an issue you weren't aware of?
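One more sanity check before worrying: confirm the spike really comes from Googlebot rather than a scraper spoofing the user agent. Google's documented method is a reverse DNS lookup on the requesting IP followed by a forward lookup to confirm. A sketch (the `verify_googlebot` call needs network access; the hostname check is a pure helper):

```python
import socket

# Genuine Googlebot reverse-DNS names end in one of these domains.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_googlebot_host(hostname: str) -> bool:
    """Pure check: does the reverse-DNS name belong to a Google domain?"""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Reverse DNS on the IP, check the domain, then forward-confirm.
    Requires live DNS; any lookup error is treated as 'not verified'."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
        if not looks_like_googlebot_host(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False

print(looks_like_googlebot_host("crawl-66-249-66-1.googlebot.com"))  # True
print(looks_like_googlebot_host("fake-googlebot.example.com"))       # False
```

If the spike passes this check, it's genuine Google crawling and, as others have said, nothing to fix unless it's loading your server.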