Why might my website's crawl rate... explode?
-
Hi Mozzers,
I have a website with approx 110,000 pages. According to search console, Google will usually crawl, on average, anywhere between 500 - 1500 pages per day. However, lately the crawl rate seems to have increased rather drastically:
9/5/16 - 923
9/6/16 - 946
9/7/16 - 848
9/8/16 - 11072
9/9/16 - 50923
9/10/16 - 60389
9/11/16 - 17170
9/12/16 - 79809
I was wondering if anyone could offer any insight into why this may be happening and whether I should be concerned?
Thanks in advance for any advice.
-
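As an aside, a jump like this is easy to flag programmatically once you export the crawl stats. Here's a minimal sketch using the figures above; the three-day window and 3x-over-trailing-average threshold are arbitrary illustrations, not Google-documented cutoffs:

```python
# Flag days where Googlebot crawl volume jumps well above the trailing average.
# Dates/counts are the figures from this question.

crawl_stats = [
    ("9/5/16", 923),
    ("9/6/16", 946),
    ("9/7/16", 848),
    ("9/8/16", 11072),
    ("9/9/16", 50923),
    ("9/10/16", 60389),
    ("9/11/16", 17170),
    ("9/12/16", 79809),
]

def flag_spikes(stats, window=3, factor=3.0):
    """Return dates whose crawl count exceeds `factor` times the
    average of the previous `window` days."""
    spikes = []
    for i in range(window, len(stats)):
        baseline = sum(count for _, count in stats[i - window:i]) / window
        date, count = stats[i]
        if count > factor * baseline:
            spikes.append(date)
    return spikes

print(flag_spikes(crawl_stats))
```

Note that once the elevated crawling persists for a few days it becomes the new baseline, so only the initial jump gets flagged.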
Thank you Thomas.
-
Just to add to this, there is nothing inherently wrong with Google crawling more pages of your site. The only time I would modify the crawl rate is when the extra crawling is actually slowing your server down.
-
Hi There,
The crawl rate control was provided by Google to give users control, so they can limit the server load created by constant crawling of the website.
So it's up to you to decide whether you want to lower or limit it.
https://support.google.com/webmasters/answer/48620?hl=en
Thanks,
Vijay
-
Thank you Vijay, your response is very helpful. Do you know if there are any guidelines for optimal crawl rates? I tend to look at average pages crawled per day and multiply by 90. If that number is equal to or more than the number of pages on-site, then we'd be good, right? Or is there a flaw in that logic?
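For what it's worth, that rule of thumb is just a coverage check: can the crawl budget over a 90-day window re-crawl the whole site? A minimal sketch, using the figures from this thread (the 90-day window is the heuristic from the question, not a Google-documented figure):

```python
# Heuristic from this thread: if (average pages crawled per day) x 90 meets or
# exceeds the total page count, Google can re-crawl the whole site in a quarter.

def crawl_covers_site(avg_pages_per_day, total_pages, window_days=90):
    """True if the crawl budget over `window_days` meets or exceeds site size."""
    return avg_pages_per_day * window_days >= total_pages

# ~110,000 pages at the historical 500-1,500 pages/day range:
print(crawl_covers_site(500, 110_000))    # 45,000 pages in 90 days
print(crawl_covers_site(1500, 110_000))   # 135,000 pages in 90 days
```

One flaw worth noting: the check assumes crawls are spread evenly across unique pages, whereas in practice Googlebot re-crawls important pages far more often than deep ones.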
-
Hi Thomas,
Thank you for responding. Yes, kind of. There are 40 main categories and each of those has upto 100 links to sub categories, and then the same again for sub-sub categories.
I've spent the last year cleaning it up and removing pages that didnt need to be there. Quite a lot of pages! In order to help Google find and index the important ones.
I will run it through Screaming Frog now, just to be sure!
-
Hi there,
The following can be reasons for your crawl rate increase:
- You have updated the content of the website recently, or you update it regularly.
- You, or someone on your end, resubmitted the sitemap.xml to Google, or is doing so over and over.
- Your robots.txt was changed to give access to previously blocked pages.
- You or someone else used ping services to let search engines know about your website. There are many manual ping services, like Pingomatic, and in WordPress you can manually add more ping services to notify many search engine bots. You can find such a list in the WordPress ping list post (http://www.shoutmeloud.com/wordpress-ping-list.html).
- You can also monitor and optimize your Google crawl rate using Google Webmaster Tools. Just go to the crawl stats there and analyze them. You can manually set your Google crawl rate faster or slower, though I would suggest using this with caution, and only when you are actually facing issues with bots not crawling your site effectively. You can read more about changing the Google crawl rate here: https://support.google.com/webmasters/?hl=en&answer=48620#topic=3309469
I hope this helps. If you have further queries, feel free to respond.
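On the robots.txt point: if you keep the old file around, you can check which sample URLs became crawlable after the change using Python's standard-library robots parser. The rules and URL below are hypothetical, purely for illustration:

```python
# Check whether a URL is fetchable for a given user agent under a set of
# robots.txt rules, using the stdlib parser. Rules and URL are made up.

from urllib.robotparser import RobotFileParser

def allowed(robots_txt_lines, url, agent="Googlebot"):
    """True if `agent` may fetch `url` under the given robots.txt lines."""
    rp = RobotFileParser()
    rp.parse(robots_txt_lines)
    return rp.can_fetch(agent, url)

old_rules = ["User-agent: *", "Disallow: /category/"]
new_rules = ["User-agent: *", "Disallow: /private/"]

url = "https://www.example.com/category/widgets"
print(allowed(old_rules, url), allowed(new_rules, url))
```

Any URL that flips from disallowed to allowed is a batch of pages Googlebot can suddenly discover, which alone can explain a crawl spike.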
Regards,
Vijay
-
I wouldn't be concerned at all. Have you got one section that expands into a load of other links? It could be that Google hasn't crawled properly for a while, then finds a section it hasn't seen before and just goes mad.
Alternatively, have you crawled the site with Screaming Frog or a similar tool, in case there's an issue you weren't aware of?