Reasons for a sharp decline in pages crawled
-
Hello!
I have a site I've been tracking using Moz since July. The site is mainly stagnant with some on page content updates. Starting the first week of December, Moz crawler diagnostics showed that the number of pages crawled decreased from 300 to 100 in a week.
So did the number of errors, though: crawl issues went from 275 to 50, and total pages crawled went from 190 to 125 in a week, and that number has stayed the same for the last 5 weeks.
Are the drops a red flag? Or is it ok since errors decreased also? Has anyone else experienced this and found an issue?
FYI: a sitemap exists and is submitted via Webmaster Tools. GWT shows no crawl errors or blocked URLs.
-
Google is indexing just over 80 URLs, although about 40% of them are developer test URLs (they do lead to live pages of the site, though). Nothing in robots.txt. No errors.
Googlebot is still crawling, but it's crawling half the pages. What would make the number of pages crawled decrease? I'm wondering if there is a broken link or something on the home page that's pointing away from the site... although it's unlikely, I'll check.
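One way to check the home page for links that point away from the site is to parse its HTML and flag any anchor whose host differs from your own domain. A minimal sketch using only Python's standard library; the sample markup and `example.com` domain below are placeholders, not the poster's actual site:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def external_links(html, site_domain):
    """Return links whose host is set and differs from site_domain."""
    parser = LinkCollector()
    parser.feed(html)
    return [
        link for link in parser.links
        if urlparse(link).netloc not in ("", site_domain)
    ]

# Placeholder markup standing in for the real home page HTML.
sample = (
    '<a href="/about">About</a>'
    '<a href="https://example.com/contact">Contact</a>'
    '<a href="https://other-site.test/out">Partner</a>'
)
print(external_links(sample, "example.com"))
# ['https://other-site.test/out']
```

In practice you would fetch the live home page HTML and pass your own domain; relative links and same-host links are treated as internal.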
-
If you fixed a problem such as duplicate content, that would explain why we're showing fewer errors and crawling fewer pages, since the duplicates no longer get crawled. Might that be the case?
-
How many URLs are indexed in Google if you use site:yourdomain.com? Has that figure dropped too?
Have you got anything in your robots.txt that could be blocking crawlers?
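An easy way to verify that is to run the robots.txt rules through Python's standard-library parser and test a few representative paths. A sketch with made-up rules; substitute the site's real robots.txt contents and URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; replace with the site's real file.
rules = """
User-agent: *
Disallow: /dev/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Check whether Googlebot may fetch each path under these rules.
for path in ("/", "/products/widget", "/dev/test-page"):
    print(path, rp.can_fetch("Googlebot", path))
# / True
# /products/widget True
# /dev/test-page False
```

If every path you care about comes back `True`, robots.txt isn't the culprit and the drop lies elsewhere.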