Would you rate-control Googlebot? How much crawling is too much crawling?
-
One of our sites is very large - over 500M pages. Google has indexed 1/8th of the site - and they tend to crawl between 800k and 1M pages per day.
A few times a year, Google will significantly increase their crawl rate - overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors.
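Rough headroom math behind those numbers, in case it helps (a back-of-envelope sketch only; it treats every request as equal cost, which the real API of course doesn't):

```python
# Back-of-envelope capacity math (illustrative; all requests treated as equal cost).
googlebot_per_day = 1_000_000   # pages/day at Google's "normal" crawl rate
googlebot_share = 0.70          # fraction of API capacity that rate consumes

api_ceiling = googlebot_per_day / googlebot_share  # ~1.43M Googlebot-equivalents/day
print(f"Implied API ceiling: ~{api_ceiling:,.0f} Googlebot-sized requests/day")

# During a 2M pages/day spike, Googlebot alone would want ~140% of that
# ceiling, which lines up with a chunk of requests coming back as 500s.
print(f"Spike demand: {2_000_000 / api_ceiling:.0%} of the ceiling")
```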
I've lobbied for an investment in / overhaul of the API configuration to allow for more Googlebot bandwidth without compromising user experience. My tech team counters that it's a wasted investment - Google will simply crawl up to whatever capacity we give it.
Questions to Enterprise SEOs:
* Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. That suggests there is some upper limit - one we perhaps haven't reached, but which would stabilize once we did.
* We've asked Google to rate-limit our crawl in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. (A server-side throttling sketch follows this list.)
* Is 1.5M Googlebot API calls a day desirable, or something any reasonable Enterprise SEO would seek to throttle back?
* What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate.
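For context on the second question: one documented pressure valve is answering Googlebot with 503/429 plus Retry-After when the API is saturated - Google treats sustained 5xx/429 responses as a signal to slow its crawl, though prolonged 503s can eventually get URLs dropped from the index. Below is a minimal sketch, assuming a Flask front end and a hypothetical api_capacity_used() metric, neither of which is our actual stack:

```python
# Minimal sketch: throttle Googlebot with 503 + Retry-After when the backend
# is saturated. Assumes a Flask app and a hypothetical api_capacity_used()
# hook; both are illustrative, not a description of our real infrastructure.
from flask import Flask, request, Response

app = Flask(__name__)

CAPACITY_CEILING = 0.85  # back off before the API actually falls over


def api_capacity_used() -> float:
    """Hypothetical hook: return current API utilisation as 0.0-1.0."""
    return 0.70  # placeholder value for the sketch


@app.before_request
def throttle_googlebot():
    ua = request.headers.get("User-Agent", "")
    if "Googlebot" in ua and api_capacity_used() > CAPACITY_CEILING:
        # A 503 with Retry-After tells Googlebot to come back later and,
        # if it keeps happening, to lower its overall crawl rate.
        return Response(
            "Crawler temporarily throttled",
            status=503,
            headers={"Retry-After": "3600"},
        )
    return None  # let normal request handling continue
```

In production you'd verify Googlebot by reverse DNS rather than trusting the User-Agent string, and treat the throttle as a temporary pressure valve rather than a permanent ceiling.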
Thanks
-
I agree with Matt that the page count can probably be reduced, but that aside, how big an issue this is comes down to which pages aren't being indexed. It's hard to advise without seeing the site - are you able to share the domain? If the site has been around for a long time, that seems a low level of indexation. Is this a site where the age of the content matters, for example Craigslist?
Craig
-
Thanks for your response. I get where you're going with that. (Ecomm store gone bad.) It's not actually an ecommerce site, FWIW. And I do restrict parameters - the list is about a page and a half long. It's a legitimately large site.
You're correct - I don't want Google to crawl the full 500M. But I do want them to crawl 100M. At the crawl rate we currently limit them to, it's going to take Google more than 3 months just to reach each page once. I'd actually like to let them crawl 3M pages a day. Is that an insane amount of Googlebot bandwidth? Does anyone else have a similar situation?
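The back-of-envelope numbers behind that, for anyone checking the math (round figures from this thread, not exact log counts):

```python
# Time for one full crawl pass at different rates (round numbers from the thread).
pages_wanted = 100_000_000   # pages I actually want Google to crawl
current_rate = 1_000_000     # approx. pages/day at today's limit
desired_rate = 3_000_000     # pages/day I'd like to allow

print(pages_wanted / current_rate, "days per full pass at the current limit")  # 100.0
print(pages_wanted / desired_rate, "days per full pass at 3M/day")             # ~33.3
```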
-
Gosh, that's a HUGE site. Are you having Google crawl parameter pages with that? If so, that's a bigger issue.
I can't imagine the crawl issues with 500M pages. A site:amazon.com search only returns 200M. Ebay.com returns 800M, so your site is somewhere in between the two? (I understand both probably have a lot more - they're just not returned as indexed.)
You always WANT a full site crawl - but your techs do have a point. Unless there's an absolutely necessary reason to have 500M indexed pages, I'd also seek to cut that to what you want indexed. That sounds like a nightmare ecommerce store gone bad.
Related Questions
-
Do Website Engagement Rates Impact Organic Rankings?
Hello experts, 1) Do website engagement rates impact organic rankings? What do website engagement rates mean - is it a visitor staying on my page for a long time? Can anyone guide me on what we can include in website engagement rates, specifically for ecommerce sites? Can I consider 1) reviews, 2) videos, 3) good images and informative descriptions, 4) highlighting relevant blog posts on category pages, 5) email subscriptions, etc. as website engagement? Thanks!
Intermediate & Advanced SEO | wright3351
-
How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
I'm curious how Google evaluates pagespeed for SPAs. Initial payloads are inherently large (resulting in 5+ second load times), but subsequent requests are lightning fast, as these requests are handled by JS fetching data from the backend. Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and "slow"-ish load time) for each? Or do they load the initial JS+HTML and then continue to crawl from there? Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time? Any insight (or speculation) would be much appreciated.
Intermediate & Advanced SEO | mothner1
-
Ratings Snippets Gone? (Help!)
Hello, we had good traffic from ratings (stars). I have added offer details to the rich snippets in various currencies - the snippet testing tool likes it, but for some reason the stars on my site have completely disappeared and have been gone for almost a week. I need the offer information in there for Google Shopping automatic updates, and Google told me it's implemented correctly for the shopping part, but I really don't know what to do about this. Any ideas would be really appreciated. http://www.return2health.net/yeast-imbalance/threelac-candida-defence/ Thanks 🙂
Intermediate & Advanced SEO | s_EOgi_Bear0
-
Crawling issue
Hello, I am working on a new Magento website that is 3 weeks old. In GWT, under Index Status > Advanced, I can only see 1 crawl, on the 4th day after launching, and I don't see any numbers for indexed or blocked status:
| Total indexed | Ever crawled | Blocked by robots | Removed |
| 0 | 1 | 0 | 0 |
I can see the traffic in Google Analytics and I can see the website in the SERPs when I search for some of the keywords; I can see the links appear on Google, but I don't see any numbers in GWT. As far as I can tell there is no noindex or robots block issue, but Google doesn't crawl the website for some reason. Any ideas why I cannot see any numbers for indexed or crawled status in GWT? Thanks, Seda
Intermediate & Advanced SEO | sedamiran
-
SEOmoz is only crawling 2 pages out of my website
I have checked in Google Webmaster Tools and they are crawling around 118 pages of my website, store.itpreneurs.com, but SEOmoz is only crawling 2 pages. Can someone help me? Thanks, Diogo
Intermediate & Advanced SEO | jslusser0
-
Correlation Between Domain Authority and Crawl Penetration?
A. Is there a correlation between domain authority and crawl penetration? B. Is there a correlation between domain authority and juice distribution?
Intermediate & Advanced SEO | AWCthreads0
-
High Bounce Rate
I found something interesting in Google Analytics. Two days ago I started getting visitors from a referral source called yndir.com; if I click on it, it says the referral path is /adserver/showinterstitial.php. There have been 300+ visitors in the last 2 days, and before that nothing. The site is pushing my bounce rate up: 3 days ago the bounce rate was around 50%, and because of this site it has gone to over 75% in just 2 days... Is this some sort of attack? I don't understand. When I go to yndir.com it says it's Yahoo Directory? Weird.
Intermediate & Advanced SEO | raybiswa0
-
Why specify robots instead of googlebot for a Panda affected site?
Daniweb is the poster child for sites that have recovered from Panda. I know one strategy she mentioned was de-indexing all of her tagged content, for example: http://www.daniweb.com/tags/database Why do you think more Panda-affected sites aren't specifying 'googlebot' rather than 'robots' to capture traffic from Bing & Yahoo?
Intermediate & Advanced SEO | nicole.healthline0