Crawl rate
-
Hello,
In Google WMT my site has the following message:
"Your site has been assigned special crawl rate settings. You will not be able to change the crawl rate."
Why would this be?
A bit of background: this site was hammered by Penguin (or maybe Panda) but seems to be dragging itself back up, though it has dropped from several thousand visitors/day to 100 or so.
Cheers,
Ian
-
OK thanks.
Ian
-
There is nothing negative about the message. It is common and expected for sites which use CDNs.
-
Thanks for replying. I had been using a CDN (up until a week ago), and no, there haven't been any issues that I can put down to this. I just hadn't seen a message like that before and wondered if it was significant.
-
What is your hosting setup? Are you using some form of cloud hosting such as Amazon or have a CDN set up?
Generally speaking, that message from Google is a good thing. It means they have optimized your crawl rate based on your setup. If you feel there is a need to adjust the rate, it can be done. Are you experiencing any issue?
The Panda/Penguin issue would have no relation to your crawl rate settings.
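If you want to sanity-check the crawl rate Google is actually applying, your server access logs are the ground truth: count Googlebot requests per day and watch for changes. A minimal sketch, assuming the common Apache/Nginx combined log format (the sample lines and IPs below are made up for illustration):

```python
import re
from collections import Counter

# Hypothetical sample lines in combined log format.
LOG_LINES = [
    '66.249.66.1 - - [10/Mar/2014:06:25:24 +0000] "GET / HTTP/1.1" 200 5316 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Mar/2014:06:26:02 +0000] "GET /about HTTP/1.1" 200 4102 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Mar/2014:06:27:13 +0000] "GET / HTTP/1.1" 200 5316 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [11/Mar/2014:09:01:44 +0000] "GET /contact HTTP/1.1" 200 2210 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

def googlebot_hits_per_day(lines):
    """Count requests per day made by a user agent claiming to be Googlebot."""
    per_day = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        # Pull the dd/Mon/yyyy portion of the timestamp.
        m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if m:
            per_day[m.group(1)] += 1
    return per_day

counts = googlebot_hits_per_day(LOG_LINES)
```

If the daily totals are steady and your server load is fine, the "special crawl rate settings" message is nothing to act on.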
Related Questions
-
Site redesign makes Moz Site Crawl go haywire
I work for an agency. Recently, one of our clients decided to do a complete site redesign without giving us notice. Shortly after this happened, Moz Site Crawl reported a massive spike of issues, including but not limited to 4xx errors. However, in the weeks that followed, it seemed these 4xx errors would disappear and then a large number of new ones would appear afterward, which makes me think they're phantom errors (and looking at the referring URLs, I suspect as much because I can't find the offending URLs). Is there any reason why this would happen? Like, something wrong with the sitemap or robots.txt?
Technical SEO | | YYSeanBrady1 -
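One way to test the phantom-error theory above is to re-fetch the URLs the crawler reported and see whether they still return 4xx. A hedged sketch (the URLs and the stubbed status lookup are invented; in real use you would pass an actual HTTP fetcher such as `lambda u: urllib.request.urlopen(u).getcode()`):

```python
def recheck_errors(reported_urls, fetch_status):
    """Split reported URLs into still-broken (4xx) and recovered (likely phantom)."""
    still_broken, recovered = [], []
    for url in reported_urls:
        status = fetch_status(url)
        if 400 <= status < 500:
            still_broken.append((url, status))
        else:
            recovered.append((url, status))
    return still_broken, recovered

# Stubbed statuses standing in for live requests:
statuses = {"https://example.com/old-page": 404, "https://example.com/renamed": 200}
still, ok = recheck_errors(sorted(statuses), statuses.get)
```

URLs that consistently resolve on a re-check but keep reappearing in crawl reports usually point to stale internal links or redirects left over from the redesign rather than genuinely missing pages.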
I've had a sudden increase in crawl issues as of yesterday (around 300, up from a steady 10). Does anyone else have this issue?
The main issue is that it's now indexing both the www and non-www versions of the site. Has anyone else had this issue, or seen sudden changes in their crawl results?
Technical SEO | | beckyhy0 -
404s affecting crawl rate?
We made a change to our site and all of a sudden we are creating a large number of 404 pages. Is this affecting the crawl/indexing rate? Currently we've submitted 3.4 million pages, have over 834K indexed, but have over 330K pages not found. Since the large increase in 404s we've noticed a decrease in pages crawled per day. I found this Q&A from Google Webmasters (http://googlewebmastercentral.blogspot.com/2011/05/do-404s-hurt-my-site.html), but it seems like the 404s should not have an effect. Is this article out of date? What do you think, fellow Moz-ers? Is this a problem?
Technical SEO | | JoshKimber0 -
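A useful way to quantify the question above is to measure, from your access logs, what share of Googlebot's fetches are landing on 404s; a large share means crawl budget is being spent on dead pages. A minimal sketch, assuming combined log format (the sample lines are made up):

```python
import re
from collections import Counter

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Hypothetical sample log lines.
LOG_LINES = [
    f'66.249.66.1 - - [05/May/2014:08:01:10 +0000] "GET /old-sku-1 HTTP/1.1" 404 512 "-" "{GOOGLEBOT_UA}"',
    f'66.249.66.1 - - [05/May/2014:08:01:55 +0000] "GET /catalog HTTP/1.1" 200 9120 "-" "{GOOGLEBOT_UA}"',
    f'66.249.66.1 - - [05/May/2014:08:02:31 +0000] "GET /old-sku-2 HTTP/1.1" 404 512 "-" "{GOOGLEBOT_UA}"',
    '203.0.113.9 - - [05/May/2014:08:03:02 +0000] "GET /catalog HTTP/1.1" 200 9120 "-" "Mozilla/5.0"',
]

def googlebot_status_counts(lines):
    """Tally HTTP status codes for requests whose user agent claims to be Googlebot."""
    tallies = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        # The status code follows the closing quote of the request line.
        m = re.search(r'"\s(\d{3})\s', line)
        if m:
            tallies[m.group(1)] += 1
    return tallies

status_counts = googlebot_status_counts(LOG_LINES)
not_found_share = status_counts["404"] / sum(status_counts.values())
```

If a large fraction of Googlebot's requests are 404s, serving 410s or redirecting the removed URLs (where a genuine replacement exists) helps steer the crawl back to live pages.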
Can Googlebot crawl the content on this page?
Hi all, I've read Google's posts about Ajax and JavaScript (https://support.google.com/webmasters/answer/174992?hl=en) and also this post: http://moz.com/ugc/can-google-really-access-content-in-javascript-really. I am trying to evaluate whether the content on this page, http://www.vwarcher.com/CustomerReviews, is crawlable by Googlebot. It appears not to be. I perused the sitemap and don't see any ugly Ajax URLs included, as Google suggests doing. Also, the page is definitely indexed, but it appears the content is only indexed via its original sources (Yahoo!, Citysearch, Google+, etc.). I understand why they are using this dynamic content: it looks nice to an end user and requires little to no maintenance. But is it providing them any SEO benefit? It appears to me that it would be far better to take these reviews and simply build them into HTML. Thoughts?
Technical SEO | | danatanseo0 -
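A quick first-pass test for the question above is to look at what the server sends before any JavaScript runs: fetch the page with a plain HTTP client (or view-source) and search for a known review phrase. If the phrase is absent from the raw HTML, a basic crawler sees only an empty widget container. A hedged sketch with an invented HTML shell (the container, widget URL, and phrases are made up):

```python
# Hypothetical raw HTML: a JS-rendered review widget typically leaves only
# a container div; the review text itself is absent until scripts run.
RAW_HTML = """
<html><body>
  <h1>Customer Reviews</h1>
  <div id="reviews-widget" data-src="https://example-reviews.invalid/feed.js"></div>
</body></html>
"""

def visible_to_basic_crawler(html, phrase):
    """True only if the phrase appears in the served HTML itself (case-insensitive)."""
    return phrase.lower() in html.lower()

# A phrase that only exists in the JS-injected reviews is not in the raw HTML:
review_phrase_served = visible_to_basic_crawler(RAW_HTML, "Great service, highly recommend")
```

In real use you would fetch the live page with `urllib.request.urlopen(url).read()` and run the same check; if the review text only exists post-JavaScript, building the reviews into the HTML (as suggested) is the safer route.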
Help: crawl friendliness for a large site
After watching Rand's video, I am trying to think of the best way to make my large site more crawl friendly. Background: I have a large site with over 100k product SKUs, so when you get to a particular page of products there are tons of different refinements and options that help you sort the products. Most of these are noindex, follow, but I was wondering if I should be nofollowing the internal links as well, in order to keep bots out of those pages and send them to the pages that I want them to go to. Is this a good way to handle it? Also, does anyone have good recommendations of links to posts that deal with the crawl friendliness of a large site? Thanks!
Technical SEO | | Gordian0 -
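Before nofollowing internal links, it's worth auditing what the faceted pages currently declare: `noindex, follow` keeps them out of the index while still letting bots follow links through them to indexable product pages, which is usually the point. A hedged sketch of such an audit (the sample page is invented, and the regex-based extraction is deliberately naive, fine for a spot check but not a full HTML parser):

```python
import re

# Invented example of a faceted/filter page's head section.
FILTER_PAGE = (
    '<html><head>'
    '<meta name="robots" content="noindex, follow">'
    '</head><body>filtered product list</body></html>'
)

def meta_robots(html):
    """Extract meta-robots directives with a simple regex."""
    m = re.search(r'<meta\s+name="robots"\s+content="([^"]*)"', html, re.I)
    return [d.strip().lower() for d in m.group(1).split(",")] if m else []

directives = meta_robots(FILTER_PAGE)
# noindex keeps the page out of the index; the absence of nofollow lets
# bots keep crawling through it to the pages you do want indexed.
passes_link_equity = "noindex" in directives and "nofollow" not in directives
```

If the audit shows `noindex, follow` is already in place, adding nofollow on the internal links mostly cuts off crawl paths rather than saving budget, so it's a judgment call that depends on how deep the refinement combinations go.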
Has Google Stopped Listing URLs with Crawl Errors in Webmaster Tools?
I went to Google Webmaster Tools this morning and found that one of my clients had 11 crawl errors. However, Webmaster Tools is not showing which URLs are experiencing the errors, which it used to do. (I checked several other clients that I manage, and they also list crawl errors without showing the specific URLs.) Does anyone know how I can find out which URLs are experiencing problems? (I checked with Bing Webmaster Tools and the number of errors is different.)
Technical SEO | | TopFloor0 -
Crawl Diagnostics Report: 500 errors
How can I know what is causing my website to return 500 errors, and how do I locate and fix the problem?
Technical SEO | | Joseph-Green-SEO0 -
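For locating 500s, your server access log tells you exactly which paths are failing and how often; from there the server's error log usually names the script or misconfiguration responsible. A minimal sketch, assuming combined log format (the sample lines are made up):

```python
import re
from collections import Counter

# Hypothetical sample log lines, two of which are server errors.
LOG_LINES = [
    '198.51.100.7 - - [02/Apr/2014:10:00:01 +0000] "GET /checkout HTTP/1.1" 500 312 "-" "Mozilla/5.0"',
    '198.51.100.7 - - [02/Apr/2014:10:00:05 +0000] "GET /products HTTP/1.1" 200 8123 "-" "Mozilla/5.0"',
    '203.0.113.4 - - [02/Apr/2014:10:02:11 +0000] "GET /checkout HTTP/1.1" 500 312 "-" "Mozilla/5.0"',
]

def paths_returning_500(lines):
    """Tally request paths that produced an HTTP 500 response."""
    hits = Counter()
    for line in lines:
        # Match the request path only when the status code is 500.
        m = re.search(r'"(?:GET|POST|HEAD)\s(\S+)[^"]*"\s500\s', line)
        if m:
            hits[m.group(1)] += 1
    return hits

error_paths = paths_returning_500(LOG_LINES)
```

Once you know the failing paths, reproducing the request in a browser (or with `curl -i`) while watching the error log typically surfaces the underlying cause.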
Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster Tools. However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap, with the crawlers just picking it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
Technical SEO | | askotzko0
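For the auto-discovery half of this question: crawlers re-fetch a sitemap they already know about, and a `Sitemap:` line in robots.txt advertises its location without any manual resubmission, so regenerating the file on a schedule is enough for a dynamic site. A minimal sketch of generating one per the sitemaps.org protocol (the example.com URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# Placeholder URLs; in practice these would come from your CMS or database.
URLS = ["http://www.example.com/", "http://www.example.com/blog/latest-post"]
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal <urlset> document with one <url><loc> entry per URL."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(URLS)

# The robots.txt line that lets crawlers discover the sitemap on their own:
robots_line = "Sitemap: http://www.example.com/sitemap.xml"
```

Write `sitemap_xml` to the sitemap's URL on each regeneration; after the initial manual submission, search engines will keep re-reading it from that fixed location.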