Pages Crawled Per Day Has Dropped Drastically. Is It a Google Issue?
-
Hello Expert,
In Search Console's Crawl Stats, the pages crawled per day have been going down day by day: from about 4 lac (400,000) pages per day, it has dropped to about 2 lac (200,000) in the last 15 days. So where is the issue? Am I doing something wrong, or is it an issue on Google's end?
Thanks!
-
It's almost certainly not an issue on Google's side. Usually it's changes on your side that influence this, and even the most minimal change can have an impact. I wouldn't worry about it too much; instead, use your sitemaps to verify what percentage of your pages is submitted versus indexed.
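To compare submitted versus indexed pages, a first step is counting the URLs your sitemap actually submits. Here is a minimal sketch that parses a standard XML sitemap and lists its `<loc>` entries; the sample sitemap and `example.com` URLs are hypothetical, and in practice you would fetch your real sitemap and compare the count against the indexed figure Search Console reports.

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in an XML sitemap string."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical sitemap content for illustration only.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/page-1</loc></url>
  <url><loc>https://example.com/page-2</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls))  # number of submitted URLs to compare against the indexed count
```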
-
Maybe it has something to do with URL parameters or robots.txt? When you exclude a lot of URLs at the same time, there are a lot fewer URLs to crawl. I experienced this myself: the number of URLs in Google's index went down from 3,600 to 120.
Hope this helps. Good luck!
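If robots.txt is the suspect, you can check which URLs your rules actually block using Python's built-in `robotparser`. The rules and URLs below are hypothetical examples, standing in for a situation like the one described, where a broad Disallow quietly removes a whole section from the crawl:

```python
from urllib import robotparser

# Hypothetical rules: everything under /filters/ was recently disallowed.
rules = """User-agent: *
Disallow: /filters/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot falls under the "*" group here, since no Googlebot-specific group exists.
print(rp.can_fetch("Googlebot", "https://example.com/products/red-shoes"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/filters/size-9"))      # False
```

Running your key URLs through a check like this makes it easy to spot whether a recent robots.txt change explains a sudden drop in crawled pages.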
Related Questions
-
Google Search Console Showing 404 errors for product pages not in sitemap?
We have some products whose URLs changed over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct NEW URLs). Is this expected? Will these errors eventually go away / stop being monitored by Google?
Technical SEO | | woshea0 -
Google Crawling Issues! How Can I Get Google to Crawl My Website Regularly?
Hi Everyone! My website is not being crawled regularly by Google. Some weeks it's regular, but for the past month or so it has gone uncrawled for seven to eight days at a time. There are some specific pages that I want to get ranked, but of late they are not being crawled AT ALL unless I use the 'Fetch as Google' tool! That's not normal, right? I have checked and re-checked the on-page metrics for these pages (and the website as a whole), and backlinking is a regular and ongoing process as well! The sitemap is in place too! I resubmitted it once as well! This issue is detrimental to website traffic and rankings! Would really appreciate insights from you guys! Thanks a lot!
Technical SEO | | farhanm1 -
Google is Still Blocking Pages Unblocked 1 Month ago in Robots
I manage a large site with over 200K indexed pages. We recently added a new vertical to the site of 20K pages. We initially blocked the pages using robots.txt while we were developing/testing, and unblocked them 1 month ago. The pages are still not indexed at this point: 1 page will show up in the index with an "omitted results" link, and upon clicking the link you can see the remaining un-indexed pages. Looking for some suggestions. Thanks.
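One common suggestion for a situation like this is to submit a dedicated sitemap covering just the newly unblocked section, so Google has an explicit list of those URLs to rediscover and you can track their indexing separately in Search Console. A minimal sketch of generating such a sitemap, using hypothetical `example.com/new-vertical/` URLs:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a sitemap XML string listing the given URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        # Each URL gets its own <url><loc>...</loc></url> entry.
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical URLs for the unblocked vertical (a real run would list all 20K).
pages = [f"https://example.com/new-vertical/item-{i}" for i in range(1, 4)]
print(build_sitemap(pages))
```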
Technical SEO | | Tyler1230 -
Crawl Diagnostics: Duplicate Content Issues
The Moz crawl diagnostic is showing that I have some duplicate content issues on my site. For the most part, these are variations of the same product that are listed individually (i.e., size/color). What would be the best way to deal with this? Choose one variation of the product and add a canonical tag? Thanks
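For the canonical-tag approach the question describes, the tag goes in the `<head>` of each variant page and points at the chosen primary version. A minimal sketch, with hypothetical URLs:

```html
<!-- On a variant page, e.g. /product/widget?color=red (hypothetical URL) -->
<head>
  <link rel="canonical" href="https://example.com/product/widget" />
</head>
```

This tells search engines to consolidate the size/color variants under the one primary product URL rather than treating each variant as duplicate content.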
Technical SEO | | inhouseseo0 -
Can Googlebot crawl the content on this page?
Hi all, I've read Google's posts about Ajax and JavaScript (https://support.google.com/webmasters/answer/174992?hl=en) and also this post: http://moz.com/ugc/can-google-really-access-content-in-javascript-really. I am trying to evaluate whether the content on this page, http://www.vwarcher.com/CustomerReviews, is crawlable by Googlebot. It appears not to be. I perused the sitemap and don't see any ugly Ajax URLs included as Google suggests doing. Also, the page is definitely indexed, but it appears the content is only indexed via its original sources (Yahoo!, Citysearch, Google+, etc.). I understand why they are using this dynamic content: it looks nice to an end-user and requires little to no maintenance. But is it providing them any SEO benefit? It appears to me that it would be far better to take these reviews and simply build them into the HTML. Thoughts?
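A quick way to evaluate this yourself is to check whether the review text appears in the raw server-rendered HTML at all, since content injected purely client-side by Ajax may not be there for a crawler that doesn't execute JavaScript. A rough sketch, with a hypothetical page and review snippet (a real check would fetch the live URL):

```python
def content_in_static_html(html_text, snippet):
    """Rough check: does the text appear in the raw HTML,
    or is it only injected client-side by JavaScript?"""
    return snippet.lower() in html_text.lower()

# Hypothetical server response where reviews are loaded via Ajax:
raw_html = (
    "<html><body><div id='reviews'></div>"
    "<script src='reviews.js'></script></body></html>"
)
print(content_in_static_html(raw_html, "Great service"))  # False: nothing to crawl without JS
```

If the check comes back False on the real page source, that supports the instinct above: building the reviews directly into the HTML would make them crawlable.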
Technical SEO | | danatanseo0 -
Too Many Links on Page: Big or Small Issue for eCommerce?
On my site I have around 3k pages and about 90 categories. Most have a sensible number of products, but some have only a few and some have loads. If I have, say, 40 links on the page not counting the products, is it a big problem to have more than 60 products on the page, assuming a limit of 100 links per page? User-wise we have filters and sorts for them to find what they need without issue. But purely from an SEO point of view, how damaging would it be to have the 23 "too many links on page" issues? Is it worth fixing by splitting a category in two and dividing the products, even though it would hinder the user?
Technical SEO | | mark_baird0 -
Sending signals to Google to rank the correct page for a set of keywords.
Hi All, Out of all our keywords, there are 3 that are showing our home page in the SERPs rather than the specific product page URL on Google.co.za (Google.com ranks the correct URL). I'm not sure why this is happening, as most links built using the anchor text point to the correct page. Why would Google prefer ranking our home page on local search and rank the correct page on Google.com? (Only 3 keywords have this problem.) I have tried to correct this by creating links from strong internal pages with anchor text pointing to the correct URL. I have also concentrated on building links from .co.za domains using the anchor text and correct URL, but to no avail. It has been 2 weeks now since I tried to sort it out, but I'm not sure what else I can do to tell Google to rank the correct page. Any ideas? Regards Greg
Technical SEO | | AndreVanKets0 -
Same Video on Multiple Pages and Sites... Duplicate Issues?
We're rolling out quite a bit of pro video, hosting on a third-party platform/player (likely BrightCove) that also allows us to have the URL reside on our domain. Here is a scenario for a particular video asset: A. It's on a product page that the video is relevant for. B. We have an entry on our blog with the video. C. We have a separate section of our site, a "Video Library," that provides a centralized view of all videos. It's there too. D. We eventually give the video to other sites (bloggers, industry educational sites, etc.) for outreach and link building. A through C on our domain are all for user experience, as every page is very relevant, but are there any duplicate video issues here? We would likely only have the transcript on the product page (though we're open to suggestions). Any related feedback would be appreciated. We want to make this scalable and done properly from the beginning (we will be rolling out 1,000+ videos in 2010).
Technical SEO | | SEOPA0