Crawl rate dropped to zero
-
Hello, I recently moved my site at GoDaddy from cPanel to Managed WordPress. I bought this transfer directly from GoDaddy customer service. In the process they accidentally changed my domain from www to non-www. I changed it back after the migration, but since then the site's crawl rate in Search Console has fallen to zero and has not risen at all.
Apart from this, the website does not display any other errors. I can ask Google to manually fetch my pages and it works as before; only the crawl rate seems to have dropped permanently. GoDaddy customer service also claims they do not see any errors, but I think they somehow caused this during the migration when the URL changed, since the timing matches perfectly. Also, when they accidentally removed the www, the crawl rate of my site's non-www version went up, but it fell back to zero when I changed it back to the www version. Now the crawl rate of both the www and non-www versions is zero. How do I get it to rise again? Customer service also said the problem may be related to FTP data in Search Console, but they were not able to help any further. Would someone here be able to help me with this in any way, please?
-
Hello, my answers to your questions are below:
- At this rate, how long would it take Google to crawl all of your pages, (maybe it feels 10-15 is fast enough)? Over 50 days. I still cannot believe it would be just a coincidence that the crawl rate dropped so suddenly, only because Google suddenly decided my pages should not be crawled that often. After all, the amount of new content, the quality of new links and all the other factors keep improving on my site, and before the drop the crawl rate increased steadily. It has to be some technical issue?
- Has the average response time increased? If so, maybe Google feels it's overloading the server & backing off. No, it has actually gone down a little bit (not much, though).
-
Interesting. I have 2 more thoughts:
- At this rate, how long would it take Google to crawl all of your pages, (maybe it feels 10-15 is fast enough)?
- Has the average response time increased? If so, maybe Google feels it's overloading the server & backing off.
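If you want a rough number outside of Search Console, a few timed requests will show whether the server has actually slowed down since the migration. This is just a quick sketch using Python's standard library - the URLs are placeholders, so swap in real pages from your own site:

```python
import time
import urllib.request

# Placeholder URLs - replace with real pages from your own site.
urls = [
    "https://www.example.com/",
    "https://www.example.com/sample-page/",
]

for url in urls:
    start = time.time()
    with urllib.request.urlopen(url) as response:
        response.read()  # download the whole body, the way a crawler would
    elapsed_ms = (time.time() - start) * 1000
    print(f"{url}  {elapsed_ms:.0f} ms")
```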
-
The crawl rate is still extremely slow, an average of 10-15 pages per day, except when I send pages to be manually crawled; then it crawls those pages. Before the drop the crawl rate was never under 200 per day and was usually over 1000. Is there anything more I can do? It seems to have no effect on my rankings or anything else as far as I can see, but I would still like this to be fixed. It has to be something to do with the fact that I changed my hosting to GoDaddy Managed WordPress hosting, but they have no clue what could cause this. The robots.txt change seemed to have no effect, or only a very minimal one.
-
Not that I'm aware of, unfortunately. Patience is an important skill when dealing with Google.
-
Thanks! I will try that. I see that Search Console shows crawl rates with a few days' delay; is there somewhere I could check whether it's working instantly?
-
I thought of one other possibility: Your sitemap.xml is probably auto-generated, so this shouldn't be a problem, but check to make sure that the URLs in the sitemap.xml have the www.
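If the sitemap is large, a short script can scan it for you instead of checking by hand. A rough sketch with Python's standard library - the sitemap URL is a placeholder, so point it at your own:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder - replace with your site's actual sitemap URL.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# <loc> elements live in the standard sitemap namespace.
# Note: this reads only the one file; if it's a sitemap index, run it on each child sitemap too.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

for loc in tree.iter(SITEMAP_NS + "loc"):
    url = (loc.text or "").strip()
    if "://www." not in url:
        print("Missing www:", url)
```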
Other than that I'm out of ideas - I would wait a few days to see what happens, but maybe someone else with more experience watching Google will have seen this before. If it does resolve, I'd like to know what worked.
-
I'm not convinced that robots.txt is causing your problem, but it can't hurt to change it back. In fact, while looking for instructions on how to change it I came across this blog post by Joost de Valk, (aka Yoast), that pretty much says you should remove everything that's currently in your robots.txt - and his arguments apply to everything that's in yours:
- Blocking wp-content/plugins will stop Google from loading JS and/or CSS resources that it might need to render the page properly.
- Blocking wp-admin is redundant, because if wp-admin is linked it can still be found anyway, and the important pages already send an X-Robots-Tag HTTP header that says not to index them.
If you're using Yoast SEO, here are instructions on how to change the robots.txt file.
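Following that advice literally, the cleaned-up file ends up close to empty - something like the version below, (just an illustration, not necessarily what Yoast's instructions generate; an empty Disallow line means nothing is blocked):
User-agent: *
Disallow: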
-
Hi, one more thing. Are you 100% sure that the robots.txt file has nothing to do with this? It changed at the same time the problems started to occur. It used to be:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
But now it is:
User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
At the same time, "blocked resources" notifications started to appear in Search Console:
Blocked Resources > Rendering without certain resources can impair the indexing of your web pages. Learn more.
Status: 3/19/16 - 152 pages with blocked resources
This has to have something to do with it, right?
-
Thank you for your answer; my answers are below each question.
- Do you see any crawl errors in the Google Search Console? **Nothing new after the crawl rate dropped, just some old soft 404 errors and old not-found errors.**
- If you search for your site on Google, what do you see, (does your snippet look normal)? Yes, everything looks perfectly normal, just like before the crawl rate dropped.
- How many pages does Google say it has indexed? Is it possible it's indexed everything and is taking a break, (does it even do that?) I don't think this is possible, since the crawl rate dropped almost instantly from an average of 400 to zero after the site migration.
One theory is: When you moved to the non-www version of the site, Google started getting 301s redirecting it from www to non-www, and now that you've gone back to www it's getting 301s redirecting it from non-www to www, so it's got a circular redirect. If this is the problem, how should I start getting it fixed?
Here's what I would do to try to kick-start indexing, if you haven't already:
- Make sure you have the "Preferred Domain" set to the www version of your site in _both the www and non-www versions of your site_ in Google Search Console. Yes, that is how it has been the whole time.
- In the Search Console for the www-version of your site, re-submit your sitemap. Done
- In the Search Console for the www-version of your site, do a Fetch as Google on your homepage, and maybe a couple of other pages, and when the Fetch is done use the option to submit those pages for indexing, (there's a monthly limit on how much of this you can do). I have done this many times since I noticed the problem; Fetch as Google works normally without any issues.
Is there anything more I can do? If I want to hire someone to fix this, are there any recommendations? I am not a tech guy, so this is quite a difficult task for me.
-
I don't know why this is happening, but this is what I would check:
- Do you see any crawl errors in the Google Search Console?
- If you search for your site on Google, what do you see, (does your snippet look normal)?
- How many pages does Google say it has indexed? Is it possible it's indexed everything and is taking a break, (does it even do that?)
One theory is: When you moved to the non-www version of the site, Google started getting 301s redirecting it from www to non-www, and now that you've gone back to www it's getting 301s redirecting it from non-www to www, so it's got a circular redirect.
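A quick way to test that theory is to request both versions of the homepage without following redirects and see where each one points. Here's a rough sketch using Python's standard library - example.com is a placeholder for your own domain:

```python
import urllib.error
import urllib.request

# Placeholder domain - substitute your own site.
URLS = ["http://example.com/", "http://www.example.com/"]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow redirects, so we can see each hop

opener = urllib.request.build_opener(NoRedirect)

for url in URLS:
    try:
        response = opener.open(url)
        print(url, "->", response.getcode(), "(no redirect)")
    except urllib.error.HTTPError as err:
        # A 301/302 surfaces as an HTTPError when redirects aren't followed.
        print(url, "->", err.code, "Location:", err.headers.get("Location"))
```

If each version keeps redirecting to the other, that's the loop; ideally the non-www homepage should answer with a single 301 to the www version, and the www version should answer 200.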
Here's what I would do to try to kick-start indexing, if you haven't already:
- Make sure you have the "Preferred Domain" set to the www version of your site in both the www and non-www versions of your site in Google Search Console.
- In the Search Console for the www-version of your site, re-submit your sitemap.
- In the Search Console for the www-version of your site, do a Fetch as Google on your homepage, and maybe a couple of other pages, and when the Fetch is done use the option to submit those pages for indexing, (there's a monthly limit on how much of this you can do).
Good luck!
-
That's not so horrible - it just says not to crawl the plugins directory or the admin, and to delay a second between requests. You probably don't want your plugins or admin directories being indexed, and according to this old forum post Google ignores the crawl-delay directive, so the robots.txt isn't the problem.
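If it helps to double-check, Python's built-in robots.txt parser can test individual URLs against those rules the same way a crawler would - a small sketch, with example.com standing in for your domain and the paths being made-up examples:

```python
import urllib.robotparser

# Placeholder - point this at your own site's robots.txt.
parser = urllib.robotparser.RobotFileParser("https://www.example.com/robots.txt")
parser.read()

# Hypothetical paths - normal pages should be allowed, only the listed directories blocked.
for path in ["/", "/sample-post/", "/wp-content/plugins/some-plugin/style.css", "/wp-admin/"]:
    url = "https://www.example.com" + path
    print(url, "->", "allowed" if parser.can_fetch("Googlebot", url) else "blocked")
```

Your ordinary pages and posts should come back as allowed; only the plugins and admin paths should show as blocked.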
-
Hi, my robots.txt file looks like this:
User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
This is not how it's supposed to look, right? Could this cause the problem?