Googlebot Crawl Rate causing site slowdown
-
I am hearing from my IT department that Googlebot is causing a massive slowdown/crash of our site. We get 3.5 to 4 million pageviews a month and add 70-100 new articles to the website each day. We provide daily stock research and market analysis, so it's all high-quality, relevant content. Here are the crawl stats from WMT:
I have not worked with a lot of high-volume, high-traffic sites before, but these crawl stats do not seem to be out of line. My team is getting pressure from the sysadmins to slow down the crawl rate, or block some or all of the site from Googlebot.
Do these crawl stats seem in line with similar sites? Would slowing down the crawl rate have a big effect on rankings?
Thanks
-
Similar to Michael, my IT team is saying Googlebot is causing performance issues - specifically during peak hours.
It was suggested that we consider using Apache rewrite rules to serve Googlebot a 503 during our peak hours to limit the impact. I found a Stack Overflow thread (link below) in which John Mueller seems to suggest this approach, but has anyone tried it?
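A minimal, hypothetical sketch of that approach as an Apache config / .htaccess fragment, assuming mod_rewrite is enabled. The peak window (09:00-17:59 server time) is a placeholder, and robots.txt is excluded so crawlers can still read it:

```apache
RewriteEngine On
# Match Googlebot by user agent (case-insensitive)
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
# Never serve a 503 for robots.txt itself
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# Lexicographic compare on the two-digit hour: 09 through 17
RewriteCond %{TIME_HOUR} >08
RewriteCond %{TIME_HOUR} <18
RewriteRule .* - [R=503,L]
# Optionally hint when to retry (Apache 2.4+ expr syntax)
Header always set Retry-After "3600" "expr=%{REQUEST_STATUS} == 503"
```

A 503 tells Google the outage is temporary, so pages should not be dropped from the index the way they would be with a block, but it is still a stopgap rather than a fix.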
-
Blocking Googlebot is a quick and easy way to disappear from the index. Not an option if you want Google to rank your site.
For smaller sites, or ones with limited technology options, I sometimes recommend using a Crawl-delay directive in robots.txt:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=48620
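A minimal robots.txt sketch of the directive mentioned above. One caveat: Googlebot itself ignores Crawl-delay (its rate is set in Webmaster Tools instead), but Bing and Yahoo do honor it:

```text
User-agent: *
# Ask compliant crawlers to wait 10 seconds between requests
Crawl-delay: 10
```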
But I agree with both Shane and Zachary, this doesn't seem like the long term answer to your problems. Your crawl stats don't seem out of line for a site of your size, and perhaps a better hardware configuration could help things out.
With 70 new articles each day, I'd want Google crawling my site as much as they pleased.
-
Whatever Google's default is in GWT; it sets it for you.
You can change it, but that is not recommended unless you have a specific reason (such as Michael Lewis's scenario), and even then, I am not completely sold that Googlebot is what is causing the "dealbreaking" overhead.
-
What is the ideal setting for the crawler? I have been wondering about this for some time.
-
Hi,
Your admins saying that is like someone saying "we need to shut the site down, we are getting too much traffic!" It's a common sysadmin response (fix it somewhere else).
4 GB a day downloaded is a lot of bot traffic, but you appear to be a "real time" site that is probably helped by, and maybe even reliant on, your high crawl rate.
I would upgrade hardware, or even look into some kind of off-site cloud redundancy for failover (hybrid).
I highly doubt that 4 GB a day is a "dealbreaker", but of course that is just based off the one image, and your admins probably have resource monitors. Maybe Varnish is an answer for caching static content to help lighten the load? Or a CDN for file hosting to lighten the bandwidth load?
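To make the Varnish idea concrete, here is a hypothetical minimal VCL sketch (Varnish 4 syntax; the backend address, file extensions, and TTL are all placeholders) that caches static assets so repeat requests never hit the origin:

```vcl
vcl 4.0;

# Placeholder backend: the real web server behind Varnish
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Strip cookies on static assets so they become cacheable
    if (req.url ~ "\.(css|js|png|jpe?g|gif|ico)(\?.*)?$") {
        unset req.http.Cookie;
    }
}

sub vcl_backend_response {
    # Cache static assets for a day
    if (bereq.url ~ "\.(css|js|png|jpe?g|gif|ico)(\?.*)?$") {
        set beresp.ttl = 24h;
    }
}
```

Even a cache like this only helps with static files; if the bot load is mostly dynamic article pages, the hardware upgrade is the more direct fix.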
Shane
-
We are hosting the site on our own hardware at a big colo. I know that we are upgrading servers but they will not be online until the end of July.
Thanks!
-
I wouldn't slow the crawl rate. A high crawl rate is good so that Google can keep their index of your website current.
The better solution is to reconsider your hardware and networking setup. Do you know how you are being hosted? From my own experience with a website of that size, a load balancer on two decent dedicated servers should handle the load without problems. Google crawling your pages shouldn't create noticeable overhead on the right setup.
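As an illustration of that setup, a hypothetical minimal nginx fragment balancing traffic across two backend servers (the upstream addresses are placeholders, not anything from the original thread):

```nginx
# Two application servers behind one load balancer
upstream app_servers {
    server 10.0.0.10;
    server 10.0.0.11;
}

server {
    listen 80;
    location / {
        proxy_pass http://app_servers;
        # Preserve the original host and client IP for the backends
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```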