Crawl rate
-
Hello,
In Google WMT my site has the following message:

"Your site has been assigned special crawl rate settings. You will not be able to change the crawl rate."

Why would this be? A bit of background: this site was hammered by Penguin (or maybe Panda) but seems to be dragging itself back up (maybe), though it has dropped from several thousand visitors/day to 100 or so.

Cheers,
Ian
-
OK thanks.
Ian
-
There is nothing negative about the message. It is common and expected for sites which use CDNs.
-
Thanks for replying. I had been using a CDN (up until a week ago), and no, there haven't been any issues that I can put down to this. I just hadn't seen a message like that before and wondered if it was significant.
-
What is your hosting setup? Are you using some form of cloud hosting such as Amazon or have a CDN set up?
Generally speaking, that message from Google is a good thing: it means they have optimized your crawl rate based on your setup. If you feel there is a need to adjust the rate, it can be done. Are you experiencing any issues?
The Panda/Penguin issue would have no relation to your crawl rate settings.
Related Questions
-
Huge number of crawl anomalies and 404s - non-existent URLs
Hi there,

Our site was redesigned at the end of January 2020. Since the new site was launched we have seen a big drop in impressions (50-60%) and also a big drop in total and organic traffic (again 50-60%) compared to the old site. I know in the current climate some businesses will see a drop in traffic; however, we are a tech business and some of our core search terms have increased in search volume as a result of remote working.

According to Search Console there are 82k URLs excluded from coverage. The majority of these are classed as 'crawl anomaly' and there are 250+ 404s. Almost all of the URLs are non-existent: they have our root domain with a string of random characters on the end. Here are a couple of examples:

root.domain.com/96jumblestorebb42a1c2320800306682
root.domain.com/01sportsplazac9a3c52miz-63jth601
root.domain.com/39autoparts-agency26be7ff420582220
root.domain.com/05open-kitchenaf69a7a29510363

Is this a cause for concern? I'm thinking that all of these random fake URLs could be preventing genuine pages from being indexed, or they could be having an impact on our search visibility. Can somebody advise please? Thanks!
Technical SEO | nicola-1
-
Internal link is creating duplicate content issues and generating 404s from website crawl.
Not sure of the best way to describe it, but the site is built with the Elementor page builder. We are finding that a feature included with a popup modal window renders an HTML link ("Click") whose href triggers the popup. When crawled, the crawler seems to follow that link back onto itself, so the crawl returns something like this.

What we want:

xyz.com/builder/listing/

What we don't want:

xyz.com/builder/listing/%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9/
xyz.com/builder/listing/%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9//%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9/

You'll notice how the string in the href is appended each time, and it loops a couple of times. Could I 301 this issue? What's the best way to go about handling something like this? It's causing duplicate meta description/content errors for some listing pages we have. I did add a rel="nofollow" to the anchor tag with JavaScript, but I'm not sure if that'll help.
Technical SEO | JoseG-LP
-
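One common way to consolidate URL variants like these (a hedged sketch, not Elementor-specific guidance — verify against your theme's templates) is a self-referencing canonical tag on the listing template, so every appended-string variant points search engines back to the clean URL:

```html
<!-- In the <head> of the listing template; the URL below is illustrative. -->
<link rel="canonical" href="https://xyz.com/builder/listing/">
```

A canonical does not stop the crawler from fetching the looping URLs, but it tells search engines which version to index, which addresses the duplicate meta description/content errors described above.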
Crawl at a standstill
Hello Moz'ers,

More questions about my Shopify migration... it seems that I'm not getting indexed very quickly (it's been over a month since I completed the migration). I have done the following:

- used an SEO app to find and complete redirects (right away)
- used the same app to straighten out title tags, metas and alt tags
- submitted the sitemap
- re-submitted my main product URLs via Fetch
- checked the Console - no reported blocks or crawl errors

I will mention that I had to assign my blog to a sub-domain because Shopify's blog platform is awful. I had a lot of 404s on the blog, but fixed those. The blog was not a big source of traffic (I'm an ecomm business). Also, I didn't have a lot of backlinks, and most of those came along anyway.

I did have a number of 8XX and 9XX errors, but I spoke to Shopify about them and they found no issues. In the meantime, those issues pretty much disappeared in the Moz reporting. Any duplicate page issues now have a 200 code since I straightened out the title tags.

So what am I missing here? Thanks in advance, Sharon
Technical SEO | Sharon2016 | www.zeldassong.com
-
How to stop crawls for product review pages? Volusion site
Hi guys,

I have a new Volusion website. The template we are using has its own product review page for EVERY product I sell (1,500+). When a customer purchases a product, a week later they receive a link back to review it. This link sends them to my site, but to an individual page strictly for reviewing the product (as opposed to a page like Amazon, where you review the product on the same page as the actual listing).

This is creating countless "duplicate content" and missing "title" errors. What is the most effective way to block a bot from crawling all these pages? Via robots.txt? A meta tag?

Here's the catch: I do not have access to every individual review page, so I think it will need to be blocked by a robots.txt file. What code will I need to implement? Do I need to do this on the admin side of my site? Do I also have to do something on the Google Analytics side to tell Google about the crawl block?

Note: the individual URLs for these pages end with: *****.com/ReviewNew.asp?ProductCode=458VB

Can I create a block for all URLs that end with /ReviewNew.asp etc.?

Thanks! Pardon my ignorance. Learning slowly, loving the Moz community 😃
Technical SEO | Jerrion
-
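Since the example URL shows all review pages sharing the /ReviewNew.asp path, the robots.txt rule could look like the sketch below (an assumption based on that one example URL — test it in Search Console's robots.txt tester before relying on it):

```text
User-agent: *
Disallow: /ReviewNew.asp
```

robots.txt rules are prefix matches, so this blocks every URL beginning with /ReviewNew.asp, including all ?ProductCode= query strings. Nothing needs to change in Google Analytics; crawlers read robots.txt on their own. Note that a disallowed URL can still appear in the index if other sites link to it — a meta robots noindex tag on the review template is the stricter alternative, if the template can be edited.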
Cloaking? Best Practices Crawling Content Behind Login Box
Hi,

I'm helping out a client who publishes sale information (fashion sales etc.). In order for visitors to view the sale details (date, percentage off etc.), they need to register for the site. If I allow Googlebot to crawl the content (identifying it by user agent) but serve up a registration lightbox to anyone who isn't Google, would this be considered cloaking? Does anyone know what the best practice for this is? Any help would be greatly appreciated. Thank you, Nopadon
Technical SEO | nopadon
-
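The pattern generally considered safer than user-agent switching (a sketch of the idea, not a guarantee against a cloaking assessment) is to serve the same HTML to everyone, with the sale details present in the markup, and draw the registration lightbox over it client-side; the sale copy and cookie name below are purely illustrative:

```html
<div id="sale-details">
  <!-- Sale content is in the HTML for all visitors and bots alike. -->
  <p>Sample sale details would appear here.</p>
</div>

<div id="signup-overlay" hidden>
  <!-- Registration form markup goes here. -->
</div>

<script>
  // Illustrative cookie check: reveal the overlay only for
  // visitors who have not registered. No user-agent sniffing.
  if (!document.cookie.includes("registered=1")) {
    document.getElementById("signup-overlay").hidden = false;
  }
</script>
```

Serving different HTML specifically when the user agent is Googlebot is the textbook definition of cloaking, which is why the overlay approach keeps the served markup identical for everyone.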
Crawl Diagnostics - How to find where broken links are located?
Hi,

One of my sites has a 4xx error that has been picked up in the Crawl Diagnostics section. It is a broken link. Does anybody know if it is possible for me to find out which page the broken link was found on? I have checked all of the pages on the site that I thought were linking to the page that seems to have a problem, but all of these links are fine / not broken. Any ideas? Thanks
Technical SEO | CherryK
-
Site Crawl
I was wondering if there is a way to use SEOmoz's tool to quickly and easily find all the URLs on your site, not just the ones with errors. The site that I am working on does not have a sitemap. What I am trying to do is find all the URLs along with their titles and description tags. Thank you very much for your help.
Technical SEO | pakevin
-
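No answer survives in this thread, but the inventory the question describes — every URL with its title and description tag — can be sketched with a small stdlib-only crawler. This is a hypothetical illustration, not a feature of SEOmoz's tool:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class PageMeta(HTMLParser):
    """Collects the <title>, meta description, and same-site links of one page."""
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.title = ""
        self.description = ""
        self.links = set()
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")
        elif tag == "a" and attrs.get("href"):
            url = urljoin(self.base, attrs["href"])
            # Keep only links on the same host, with fragments stripped.
            if urlparse(url).netloc == urlparse(self.base).netloc:
                self.links.add(url.split("#")[0])

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def crawl(start_url, limit=500):
    """Breadth-first crawl from start_url, yielding (url, title, description)."""
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) <= limit:
        url = queue.pop(0)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to fetch
        page = PageMeta(url)
        page.feed(html)
        yield url, page.title.strip(), page.description
        for link in page.links - seen:
            seen.add(link)
            queue.append(link)
```

Each yielded tuple can be written to a CSV to get the URL/title/description inventory the question asks for; a real run should also respect robots.txt and rate-limit its requests.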
I have 15,000 pages. How do I have the Google bot crawl all the pages?
I have 15,000 pages. How do I get the Google bot to crawl all of them? My site is 7 years old, but only about 3,500 pages are being crawled.
Technical SEO | Ishimoto
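A common first step for a large site where only a fraction of pages are being crawled is to submit an XML sitemap listing every canonical URL; the sketch below follows the standard sitemap protocol, with example.com as a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-1</loc>
  </url>
  <url>
    <loc>https://www.example.com/page-2</loc>
  </url>
  <!-- …one <url> entry per page; the protocol allows up to 50,000 per file… -->
</urlset>
```

Submitting the sitemap in Webmaster Tools tells Google which URLs exist, but whether all 15,000 actually get crawled also depends on internal linking and on how much crawl attention Google allots the site.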