Reasons for a sharp decline in pages crawled
-
Hello!
I have a site I've been tracking in Moz since July. The site is mostly static, with occasional on-page content updates. Starting the first week of December, Moz's crawl diagnostics showed the number of pages crawled drop from 300 to 100 in a week.
The number of errors dropped as well, though: crawl issues went from 275 to 50, and total pages crawled went from 190 to 125 in a week, and those numbers have stayed the same for the last five weeks.
Are these drops a red flag, or is it OK since errors decreased as well? Has anyone else experienced this and found an underlying issue?
FYI: a sitemap exists and is submitted via Webmaster Tools. GWT shows no crawl errors and no blocked URLs.
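If it helps with the comparison, here's a minimal sketch (Python standard library only; the sitemap URL is a placeholder) that counts the URLs listed in a sitemap, so the figure can be set against the crawl totals above:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder URL -- substitute your own sitemap location.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Sitemaps use this namespace for <urlset>/<url>/<loc> elements.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.fromstring(resp.read())

# Count every <loc> entry; for a sitemap index this counts child sitemaps instead.
locs = [loc.text for loc in tree.findall(".//sm:loc", NS)]
print(f"{len(locs)} URLs listed in {SITEMAP_URL}")
```

A big gap between the sitemap count and the crawl totals at least narrows down where to look.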
-
Google is indexing just over 80 URLs, although about 40% of them are developer test URLs (they lead to live pages of the site though). Nothing in robots.txt. No errors.
Googlebot is still crawling, but it's only crawling half the pages. What would make it crawl fewer pages? I'm wondering if there is a broken link or something on the home page that's pointing away from the site... although it's unlikely, I'll check.
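For anyone running the same check, a rough sketch along these lines (Python standard library only; the domain is a placeholder) lists the links found on the home page, flags which ones point off-site, and reports their status codes:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

HOME = "https://www.example.com/"  # placeholder domain

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags, resolved against the home page URL."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(HOME, href))

with urllib.request.urlopen(HOME) as resp:
    parser = LinkCollector()
    parser.feed(resp.read().decode("utf-8", errors="replace"))

home_host = urlparse(HOME).netloc
for link in sorted(set(parser.links)):
    offsite = urlparse(link).netloc != home_host
    try:
        status = urllib.request.urlopen(link).status
    except Exception as exc:  # broken or unreachable link
        status = exc
    print(f"{'OFF-SITE' if offsite else 'internal':8} {status}  {link}")
```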
-
If you fixed a problem such as duplicate content, we would show fewer errors and crawl fewer pages, since that problem is now resolved. Might that be the case?
-
How many URLs are indexed in Google if you search site:yourdomain.com? Has that figure dropped too?
Have you got anything in your robots.txt that could be blocking the crawler?
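One quick way to sanity-check that, if it's useful, is something like the sketch below (Python standard library; the domain and sample paths are placeholders), which asks the live robots.txt whether specific URLs are allowed for a given user agent:

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"                        # placeholder domain
SAMPLE_PATHS = ["/", "/products/", "/blog/some-post/"]  # hypothetical paths to test

rp = RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for agent in ("Googlebot", "rogerbot"):  # rogerbot is Moz's crawler
    for path in SAMPLE_PATHS:
        allowed = rp.can_fetch(agent, f"{SITE}{path}")
        print(f"{agent:10} {'allowed' if allowed else 'BLOCKED'}  {SITE}{path}")
```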
Related Questions
-
Google ranking penalty: Limited to specific pages or complete website?
Hi all, Let's say a few pages on the website dropped in the rankings due to poor on-page optimisation or were hit by algorithm updates. Does Google limit the ranking drop to just those pages, or will the entire website be impacted? I mean, will this cause a ranking drop for the homepage on its primary keyword? Will Google apply the penalty to other pages on the website if a few pages drop in the rankings? Thanks
Algorithm Updates | vtmoz
-
Page Rank on Moz compared to Ahrefs
So there seems to be a huge philosophical difference behind how Moz and Ahrefs calculate page authority (PA). On Moz, PA is very dependent on a site's DA. For instance, any new page, or page with no backlinks, on a 90 DA site will have around 40 PA on Moz. However, on a site with around 40 DA, any new page or page with no backlinks will have around 15 PA. Now if one were to get tons of backlinks to this 40 DA / 15 PA page, that would raise its PA slightly, but it will likely never go beyond 40 PA... which hints that one would rather acquire a backlink from a page on a high-DA site, even if that page has 0 links pointing to it, than a backlink from a page on a low-DA site with many, many backlinks to it.

This is very different from how Ahrefs calculates PA. For Ahrefs, any new page or page with no backlinks will have a PA of around 8-10ish... no matter what the DA of the site is. When a page on a 40 DA site begins acquiring a few links, it will quickly reach a higher PA than a page on a 90 DA site with no links to it. The big difference is that for Ahrefs, the PA of a given page depends far more on how many inbound links that page has, whereas for Moz, it depends far more on the DA of the site the page is on.

If we trust Moz's PA calculations, SEOs should emphasize getting links from high-DA sites... whereas if we trust Ahrefs' PA calculations, SEOs should focus less on that and more on building links to whatever page they want to rank (even if that page is on a low-DA site). So what do you guys think? Do you agree more with Moz's or Ahrefs' valuation of PA? Is the PA of a page more dependent on the DA, or more dependent on its total inbound links?
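Neither Moz nor Ahrefs publishes its exact formula, so purely to illustrate the link-based view described above, here is a toy PageRank-style power iteration over a small, entirely hypothetical link graph (Python). It shows how a page's score grows with the number and strength of the pages linking to it, with no domain-level signal in the mix at all:

```python
# Toy PageRank power iteration -- NOT Moz's or Ahrefs' actual algorithm,
# just the classic link-based formulation for illustration.
DAMPING = 0.85
ITERATIONS = 50

# Hypothetical link graph: page -> pages it links out to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A", "D"],
    "D": ["C"],   # C ends up with the most inbound links (A, B, D)
    "E": [],      # E is a brand-new page with no inbound links
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(ITERATIONS):
    new_rank = {p: (1.0 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        if not outlinks:
            # Dangling page: spread its rank evenly across all pages.
            for p in pages:
                new_rank[p] += DAMPING * rank[page] / len(pages)
        else:
            for target in outlinks:
                new_rank[target] += DAMPING * rank[page] / len(outlinks)
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")  # C (most inbound links) scores highest; E (none) lowest
```

In this toy version only the link graph matters; the Moz-versus-Ahrefs question above is essentially about how much domain-wide authority should be blended into a score like this.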
Algorithm Updates | ButtaC
-
Does an increase in non-relevant backlinks drop page ranking?
Hi community, Let's say there is a page with 50 backlinks, where 40 are non-relevant and only 10 are relevant in terms of the content around the link. Will these non-relevant backlinks impact the ranking of the page by diluting its backlink profile? Thanks
Algorithm Updates | vtmoz
-
Link reclamation: What happens when backlinks point to a page other than the most related page? Any risks?
Hi all, We have started the link reclamation process, as we failed to redirect our old website links to the newly created pages. Unfortunately most of the backlinks point to a page which already has lots of backlinks. Just wondering if I can redirect the old pages to pages other than the most closely related page, to make sure a single page doesn't take away all the backlinks. And what happens if Google finds that a backlink points to a different page than the one it should actually point to? Thanks
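As a side note for anyone auditing this kind of reclamation work, a small sketch like the following (Python standard library; the URLs are placeholders) follows each old URL and reports where it finally lands, which makes it easy to see which pages are collecting the redirected backlinks:

```python
import urllib.request

# Hypothetical old URLs that still have backlinks pointing at them.
OLD_URLS = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for old in OLD_URLS:
    try:
        # urlopen follows redirects automatically; .url is the final destination.
        with urllib.request.urlopen(old) as resp:
            print(f"{old}\n  -> {resp.url}  (status {resp.status})")
    except Exception as exc:
        print(f"{old}\n  -> ERROR: {exc}")
```

To see whether each hop is a 301 or a 302 you would need to disable automatic redirect handling, but for a quick audit the final destination is usually what matters.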
Algorithm Updates | vtmoz
-
Duplicate Product Pages On Niche Site
I have a main site, and a niche site that has products for a particular category. For example, Clothing.com is the main site and formalclothing.com is the niche site. The niche site has about 70K product pages with the same content as the main site (except for navigation links, which are similar but not duplicated). I have been considering shutting down the niche site and doing a 301 to the relevant category of the main site. Here are some more details: The niche site ranks fairly well on Yahoo and Bing, much better than the main site for keywords relevant to that category. The niche site was hit by Penguin, but doesn't seem to have been affected much by Panda. When I analyze a product page on the main site using Copyscape, 1-2 pages of the niche site do show up, but NOT the exact matching product page on the niche site. Questions: Given the information above, how can I gauge the impact the duplicate content is having, if any? Is it a bad idea to put a canonical tag on the product pages of the niche site, citing the main site as the original source? Any other considerations aside from the duplicate content or Penguin issues when deciding whether to 301? Would you 301 if this was your site? Thanks in advance.
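On the cross-domain canonical idea raised above, here is a rough sketch (Python standard library; both domains and the product URL are placeholders) that fetches a niche-site product page and reports whether it declares a rel="canonical" pointing at the main site:

```python
import re
import urllib.request
from urllib.parse import urlparse

# Hypothetical product page on the niche site and the main domain it should point to.
NICHE_PAGE = "https://www.formalclothing.example/product/black-tuxedo"
MAIN_DOMAIN = "www.clothing.example"

with urllib.request.urlopen(NICHE_PAGE) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# Find a <link ... rel="canonical" ...> tag, then pull its href (attribute order varies).
tag_match = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*>', html, re.I)
if not tag_match:
    print("No canonical tag found.")
else:
    href_match = re.search(r'href=["\']([^"\']+)["\']', tag_match.group(0), re.I)
    canonical = href_match.group(1) if href_match else ""
    cross_domain = urlparse(canonical).netloc == MAIN_DOMAIN
    print(f"Canonical: {canonical or '(no href)'}")
    print("Points at the main site." if cross_domain else "Points elsewhere (same site or another domain).")
```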
Algorithm Updates | inhouseseo
-
Google not crawling click to expand content - suggestions?
It seems like Google confirmed this week in a G+ hangout that content behind click-to-expand elements, e.g. 'read more' dropdowns and tabbed content, will be discounted. The suggestion was that if you have content, it needs to be visible on page load. Here's more on it https://www.seroundtable.com/google-index-click-to-expand-19449.html and the actual hangout, circa 11 mins in https://plus.google.com/events/cjcubhctfdmckph433d00cro9as. From a UX and usability point of view, having a lot of content visible that would otherwise sit in tabs or click-to-expand divs can be terrible, especially on mobile. Does anyone have workable solutions, or can you think of examples of really great landing pages (I'm mostly thinking ecommerce) that also have a lot of visible content? Thanks Andy
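One practical first check, sketched below (Python standard library; the URL and text snippet are placeholders): fetch the raw HTML as a crawler would first see it and confirm whether the 'read more' text is present on page load rather than injected by JavaScript after a click:

```python
import urllib.request

# Hypothetical landing page and a sentence that lives inside a "read more" section.
PAGE = "https://www.example.com/landing-page"
HIDDEN_SNIPPET = "our full returns policy"

with urllib.request.urlopen(PAGE) as resp:
    html = resp.read().decode("utf-8", errors="replace")

if HIDDEN_SNIPPET.lower() in html.lower():
    print("Snippet is in the initial HTML (crawlable, though it may be discounted if hidden).")
else:
    print("Snippet is NOT in the initial HTML -- likely loaded via JavaScript after the click.")
```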
Algorithm Updates | AndyMacLean
-
Ranking dropped with no page changes
My rank for a keyword went from #1 to #22. The page grade for this keyword is A, and there were no site structure changes. The only thing I can see is that Tumblr and Reddit and other sources are now listed for this keyword, and its difficulty went from the low-to-mid teens to 28%. However, even given that, I do not see a reason for this keyword alone to fall so far. It was giving us a ton of traffic; in fact, most of our organic search traffic came from this term for nearly two months. And two weeks ago, for no apparent reason, we were pushed to page 3. Has anyone else had similar experiences? How do you counter it, and what can we do?
Algorithm Updates | mozmemberanon
-
Bing's indexed pages vs pages appearing in results
Hi all, We're trying to increase our efforts in ranking for our keywords on Bing, and I'm discovering a few unexpected challenges. Namely, Bing is reporting 16,000+ pages have been crawled... yet a site:mywebsite.com search on Bing shows fewer than 1,000 results. I'm aware that Duane Forrester has said they don't want to show everything, only the best. If that's the case, what factors must we consider most to encourage Bing's engine to display most, if not all, of the pages it crawls on my site? I have a few ideas of what may be turning Bing off, so to speak (some duplicate content issues, 301 redirects due to URL structure updates), but if there's something in particular we should monitor and/or check, please let us know. We'd like to prioritize 🙂 Thanks!
Algorithm Updates | brandonRT