Google Crawling Issues! How Can I Get Google to Crawl My Website Regularly?
-
Hi Everyone!
My website is not being crawled regularly by Google - some weeks it's fine, but for the past month or so it goes seven to eight days without being crawled. There are some specific pages that I want to get ranked, but lately they are not being crawled AT ALL unless I use the 'Fetch as Google' tool. That's not normal, right? I have checked and re-checked the on-page metrics for these pages (and for the website as a whole), backlinking is a regular and ongoing process, and a sitemap is in place - I've resubmitted it once too. This issue is hurting the website's traffic and rankings. Would really appreciate insights from you guys! Thanks a lot!
-
Hi Farhan
Yes, sure - we've definitely found this helps bring the crawlers back to the site more regularly!
Hope it helps
Dan
-
Hi Dan! Thanks a lot for your views! This might be it! Content on the website is not updated very regularly owing to the nature of the site. Perhaps updating the blog more regularly, or making tweaks to the page content and meta tags, could help as per your suggestions?
-
Hi Anders! They are all available to Googlebot! I'm afraid it won't be possible for me to share the URL on a public forum - please check your inbox! Thank you so much for your insights and concern!
-
Hi Umar! There's been no change in web hosting, and no penalty either! We regularly monitor Search Console, and the speed is good too! Thanks a lot for your time though!
-
Hi
How regularly are you updating content on the site?
We've found that regularly updating and tweaking the content on pages, to improve the page content and the user's experience, helps with indexing and crawl frequency.
Looking at analytics and seeing what customers are searching for on your site can help you improve the content on those pages; by going back and amending and improving page content, we've found Google returns more often to re-index.
Dan
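One concrete way to act on Dan's advice is to make sure the sitemap's `<lastmod>` dates actually reflect when pages were edited, so crawlers can see that something changed. A minimal sketch in Python, using hypothetical URLs and edit dates:

```python
from datetime import date
from xml.etree import ElementTree as ET

# Hypothetical pages and the date each was last edited.
PAGES = [
    ("https://www.example.com/", date(2016, 5, 1)),
    ("https://www.example.com/blog/", date(2016, 5, 20)),
]

def build_sitemap(pages):
    """Build a minimal sitemap.xml whose <lastmod> dates reflect real
    edits, so crawlers can see which pages changed recently."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(PAGES)
```

Regenerating this whenever a page is amended (rather than leaving stale dates) gives Google an honest signal to come back for the changed URLs.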
-
Hi!
Are all pages available for GoogleBot, or are they perhaps blocked by robots.txt or a meta tag?
Do you have a URL for the website so we can take a look?
Anders
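A quick way to answer Anders's first question without waiting for a crawl is to run the site's robots.txt through Python's built-in parser. A sketch, using a hypothetical robots.txt; a real check would fetch the live file from the site, and would also look for a `<meta name="robots" content="noindex">` tag on the affected pages:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real check would fetch it from
# https://www.yoursite.com/robots.txt instead.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot matches the "*" group here, so /private/ is off limits
# while the rest of the site stays crawlable.
blocked = not parser.can_fetch("Googlebot", "https://www.example.com/private/page")
allowed = parser.can_fetch("Googlebot", "https://www.example.com/products/")
```

If `can_fetch` returns False for a page you want ranked, that alone explains why it never gets crawled.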
-
Hello Farhan,
Yes, it's not normal. Did you change your web hosting in this period, or have you faced any kind of penalty in the recent past? Usually this happens when you've just recovered from one.
You also need to check your Search Console for crawl problems - are there any crawling or server issues you're facing? Sometimes your .htaccess creates crawling problems, so make sure to check it again. Also, see how fast your pages are loading; if the speed is low, it can also delay crawling.
Hope this helps!
Umar
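Umar's points about server errors and slow pages can be folded into a rough triage of what a crawler sees when it fetches a page. The thresholds below are illustrative, not Google's published numbers:

```python
def crawl_health_issues(status_code, load_seconds):
    """Flag conditions on a fetched page that commonly slow or block
    crawling. Thresholds are illustrative, not Google's numbers."""
    issues = []
    if status_code >= 500:
        issues.append("server error - crawlers back off while these persist")
    elif status_code in (401, 403):
        issues.append("access denied - the page cannot be crawled")
    elif status_code == 404:
        issues.append("not found - check internal links and the sitemap")
    if load_seconds > 3.0:
        issues.append("slow response - can reduce crawl rate")
    return issues

# A healthy page: 200 OK, served in under a second.
healthy = crawl_health_issues(200, 0.8)
# A struggling page: server error and a slow response.
struggling = crawl_health_issues(503, 6.0)
```

Feeding this the status codes and timings from your server logs (or from manual fetches of the problem pages) narrows down whether the crawl gaps are self-inflicted.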
Related Questions
-
Google search console image indexing issue
Google Search Console tells me that only '58 out of the 3553' images in the image sitemap are indexed. But if I search "site:example.com" in Google Images there seem to be lots of images. There are no errors in the sitemap and I am still getting a reasonable number of image search hits daily. Are the Webmaster Tools stats for indexed images accurate? When I click on Sitemap Errors & Index Errors this is what I get - Error details: No errors found. https://www.screencast.com/t/pqL62pIc
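One way to sanity-check a discrepancy like "58 out of 3553" is to parse the image sitemap yourself, count the image entries, and compare that figure with what Search Console reports. A sketch against a tiny sample sitemap (hypothetical URLs):

```python
from xml.etree import ElementTree as ET

# A tiny image-sitemap sample with hypothetical URLs; the real file
# would hold thousands of <image:image> entries.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/page1</loc>
    <image:image><image:loc>https://www.example.com/a.jpg</image:loc></image:image>
    <image:image><image:loc>https://www.example.com/b.jpg</image:loc></image:image>
  </url>
</urlset>"""

NAMESPACES = {"image": "http://www.google.com/schemas/sitemap-image/1.1"}

root = ET.fromstring(SITEMAP)
image_locs = [el.text for el in root.findall(".//image:loc", NAMESPACES)]
# Compare len(image_locs) with the "indexed" figure Search Console shows.
```

If the parsed count and the sitemap's claimed count disagree, the reporting gap may be on the sitemap side rather than in Google's index.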
Technical SEO | 21centuryweb
-
Google's ability to crawl AJAX rendered content
I would like to make a change to the way our main navigation is currently rendered on our e-commerce site. Currently, all of the content that appears when you click a navigation category is rendered on page load. This is a large portion of every page visit's bandwidth, and even the images are downloaded if a user doesn't choose to use the navigation. I'd like to change it so the content appears and is downloaded only IF the user clicks on it; I'm planning on using AJAX. In that case the content wouldn't automatically be on the site (which may or may not mean Google would crawl it). As we already provide a sitemap.xml for Google, I want to make sure this change would not adversely affect our SEO. As of October this year, the Webmaster AJAX crawling doc's suggestions have been deprecated. While the new version does say that its crawlers are smart enough to render AJAX content, something I've tested, I'm not sure if that only applies to content injected on page load as opposed to on click, like I'm planning to do.
Technical SEO | znotes
-
What's going on with the Google index - JavaScript and Googlebot
Hi all, Weird issue with one of my websites. The website URL: http://www.athletictrainers.myindustrytracker.com/ Let's take 2 different article pages from this website. 1st: http://www.athletictrainers.myindustrytracker.com/en/article/71232/ As you can see, the page is indexed correctly on Google: http://webcache.googleusercontent.com/search?q=cache:dfbzhHkl5K4J:www.athletictrainers.myindustrytracker.com/en/article/71232/10-minute-core-and-cardio&hl=en&strip=1 (that's the "text only" version, indexed on May 19th) 2nd: http://www.athletictrainers.myindustrytracker.com/en/article/69811 As you can see, the page isn't indexed correctly on Google: http://webcache.googleusercontent.com/search?q=cache:KeU6-oViFkgJ:www.athletictrainers.myindustrytracker.com/en/article/69811&hl=en&strip=1 (that's the "text only" version, indexed on May 21st) They both have the same code, and about the dates: there are pages that were indexed before the 19th and they are also problematic. Google can't read the content consistently - it reads it only when it wants to. Can you think what the problem is here? I know that Google can read JS and crawl our pages correctly, but it happens only with a few pages and not all of them (as you can see above).
Technical SEO | cobano
-
Google Cache can't keep up with my 403s
Hi Mozzers, I hope everyone is well. I'm having a problem with my website and 403 errors shown in Google Webmaster Tools. The problem comes because we "unpublish" one of the thousands of listings on the site every few days - this then creates a link that gives a 403. At the same time we also run some code that takes away any links to these pages. So far so good. Unfortunately Google doesn't notice that we have removed these internal links and so tries to access these pages again. This results in a 403. These errors show up in Google Webmaster Tools, and when I click on "Linked From" I can verify that there are no links to the 403 page - it's just Google's cache being slow. My questions are: a) How much is this hurting me? b) Can I fix it? All suggestions welcome, and thanks for any answers!
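For listings that are unpublished on purpose, a 410 Gone is usually a clearer signal than a 403: a 403 reads as a possibly temporary access problem, so crawlers tend to keep retrying it, while a 410 says the page was removed deliberately. A small illustrative helper (not part of any framework), sketching that decision:

```python
def status_for_listing(is_published, permanently_removed=True):
    """Choose an HTTP status for a listing page. A 403 reads as a
    possibly temporary access problem, so crawlers keep retrying it;
    a 410 Gone says the page was removed on purpose and can be
    dropped from the index. Illustrative helper only."""
    if is_published:
        return 200
    return 410 if permanently_removed else 404
```

Serving 410 (or 404) instead of 403 for unpublished listings should let Google retire those URLs faster instead of re-requesting them.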
Technical SEO | HireSpace
-
Can I speed up removal of cache for 301'd page on unverified website?
I recently asked another website to remove a page from their website (I have no control over this website) and they have now 301'd this old URL to another - this is just what I wanted. My only aim now is to see the Google cache removed for that page as quickly as possible. I'm not sure that asking the website to remove the URL via WMT is the right way to go, and assume I should just be waiting for Google to pick up the 301 and naturally remove the cache. But are there any recommended methods I can use to speed this process up? The old URL was last cached on 3 Oct 2014, so not too long ago. I don't think the URL is linked from any other page on the Internet now, but I guess it would still be in Google's list of URLs to crawl. Should I sit back and wait (who knows how long that would take?) or would adding a link to the old URL from a website I manage speed things up? Or would it help to submit the old URL to Google's Submission tool?
Technical SEO | Mark_Reynolds
-
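Before waiting on Google, it's worth confirming that the other site's redirect really is a single clean 301 chain ending at a live page, since that is what the crawler needs to see before it retires the cached copy. A sketch that checks a list of observed (status, location) hops; the hop data here is hypothetical:

```python
def is_clean_301(hops):
    """Check that the (status, location) hops observed when fetching
    the old URL form one or more 301s ending at a live 200 page -
    the state a crawler needs to see to retire the cached copy."""
    if len(hops) < 2:
        return False
    *redirects, final = hops
    return all(status == 301 for status, _ in redirects) and final[0] == 200

# Hypothetical hops for the moved page: one 301, then the new page.
hops = [(301, "https://example.com/new-page"), (200, None)]
ok = is_clean_301(hops)
```

A 302 or an error at the end of the chain would explain a cache that refuses to go away, so checking this first can save weeks of waiting.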
How to improve ranking of a website again, after being penalized by Google?
The ranking of our website has gone down in the past 2 months. The reason, I believe, is that we had more than 300,000 spammy comments posted on it (the website is based on WordPress), so Google treated it as an un-monitored forum and penalized it. We have deleted the older comments and new comments can no longer be posted. Need suggestions on what else we should do to rank better. Any advice would be very welcome.
Technical SEO | TGA123
-
How can I increase my website traffic?
Hello, my boss has decided to build a website. We have more than 12,500 products on our website, mtscellular.com. I'm new to SEO, and I'm confused and need help - I want to know how I can increase my website traffic.
Technical SEO | jimmylora
-
I just found something weird I can't explain, so maybe you guys can help me out.
I just found something weird I can't explain, so maybe you guys can help me out. In Google: http://www.google.nl/#hl=nl&q=internet. The number 3 result is a big telecom provider in the Netherlands called Ziggo. The ranking URL is https://www.ziggo.nl/producten/internet/. However, if you click on it you'll be directed to https://www.ziggo.nl/#producten/internet/. HttpFox in FF, however, is not showing any redirects - just a 200 status code. The URL https://www.ziggo.nl/#producten/internet/ contains a hash, so the canonical URL should be https://www.ziggo.nl/. I can understand that. But why is Google showing the title and description of https://www.ziggo.nl/producten/internet/, when the canonical URL clearly is https://www.ziggo.nl/? Can anyone confirm my guess that Google is using the bulk SEO value (link juice/authority) of the homepage at https://www.ziggo.nl/ because of the hash, but using the relevant content of https://www.ziggo.nl/producten/internet/, resulting in a top position for the keyword "internet"?
Technical SEO | NEWCRAFT