Number of Pages Crawled Dropped Significantly
-
I am just wondering if something changed with the SEOMoz crawler. I was always getting 10,000 or nearly 10,000 pages crawled, but after the last two crawls I am ending up around 2,500 pages. Has anything changed that I would need to look at to see if I am blocking the crawler, or is it something else?
-
Hi! This is probably a question for the help desk team. If you send them an email at help@seomoz.org and let them know your campaign name, they'll take a look at it and help you figure out what happened.
-
I also want to add that I ran a test crawl and it actually returned 3,200 pages, so I am not sure why the regular crawl has only covered 2,504 pages for the last two weeks.
-
I did a site update about 6 weeks ago, but the site was crawled a few times after that with about 9,000-10,000 pages crawled. It was only in the last two crawls that it dropped to 2,504 pages. I don't use .htaccess and there is nothing in robots.txt, so I am not sure what could be causing it.
-
You may be blocking mozbot. Do you remember the exact date when it started happening? If so, you can identify and revert the changes made around that time. Also check your robots.txt and .htaccess files for any abnormal changes during this time period; a quick check like the sketch below will confirm whether robots.txt is blocking the crawler.
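For anyone who wants a quick way to test this, here is a minimal sketch using only Python's standard library to check whether a site's live robots.txt blocks a given user agent. The domain www.example.com is a placeholder, and rogerbot is an assumption based on the name Moz uses for its crawler, so substitute your own site and verify the current user-agent string:

import urllib.robotparser

# Placeholder domain and assumed crawler name; substitute your own
# site and confirm the user-agent string with Moz before relying on it.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for agent in ("rogerbot", "*"):
    for path in ("/", "/products/"):
        url = "http://www.example.com" + path
        verdict = "allowed" if rp.can_fetch(agent, url) else "BLOCKED"
        print(agent, path, verdict)

If anything you expect to be crawled shows BLOCKED, the drop is a robots.txt rule rather than anything on Moz's side.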
Related Questions
-
Should I noindex user-created fundraising pages?
Hello Moz community! I work for a nonprofit where users are able to create their own fundraising pages on the website for supporters to donate directly. Some of them are rarely used; others get updated frequently by the host. There are likely a ton of these on our site. Moz crawl says we have ~54K pages, and when I do a "site:[url]" search on Google, 90% of the first 100 results are fundraising pages. These are not controlled by our staff members, but I'm wondering if meta noindexing these pages could have a big effect on our SEO rankings. Has anyone tried anything similar, or does anyone know if this strategy could have legs for our site? My only concern is that users might not be able to find their fundraising pages in the Google CSE implemented on our website. Any insight you fine folks could provide would be greatly appreciated!
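For readers unfamiliar with the term, "meta noindexing" a page just means adding a robots meta tag to its HTML head. A minimal sketch (the surrounding markup is illustrative; only the meta tag matters):

<head>
  <!-- Ask compliant crawlers not to index this page while still
       following its links; must appear in the <head> of each page. -->
  <meta name="robots" content="noindex, follow">
  <title>Example fundraising page</title>
</head>

One caveat worth flagging: a Google CSE that draws results from Google's index will stop returning pages Google has dropped, so the asker's concern about on-site search is a real trade-off.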
Moz Pro | Adam_Paris1 -
Why am I getting all these duplicate pages?
This is the case for basically all my pages, but my website has 3 'duplicates' while the rest just have 2 (no index). Why are these 3 variations counted as duplicate pages?
http://www.homepage.com
http://homepage.com
http://www.homepage.com/index.php
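For what it's worth, these three variations count as duplicates because the bare host, the www host, and the /index.php path can all serve the same content at different URLs, and the usual fix is a 301 redirect to one canonical form. A minimal sketch for an Apache .htaccess file, assuming mod_rewrite is enabled and using the question's placeholder domain:

# Send the bare domain to the www host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^homepage\.com$ [NC]
RewriteRule ^(.*)$ http://www.homepage.com/$1 [R=301,L]

# Send direct requests for /index.php to the root URL. THE_REQUEST is
# checked so Apache's internal DirectoryIndex lookup of index.php does
# not trigger a redirect loop.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\ ] [NC]
RewriteRule ^index\.php$ http://www.homepage.com/ [R=301,L]

Adding a rel=canonical tag that points at the preferred URL is a common companion fix.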
Moz Pro | W2GITeam0 -
Why did my rankings drop so fast?
Hi Mozzers, One of my clients has suffered a sudden ranking loss that started at the beginning of August. A little background: my client is a big cleaning company that has many franchises nationwide and had very good rankings for the past few months. Generally the busiest time of the year is summer and the 4th quarter is the slowest; organic traffic generally drops by 5 to 10% between Q3 and Q4 each year. Two things have caught my attention while investigating these ranking losses:
1. The most recent Moz index update put the website's DA at 48, when in the summer we had a DA of 56. I've never seen such a drop before! My first question is: how accurate is the Moz index, and how can I make sure this data is accurate? I am very surprised, because we have been working hard on getting good-quality links for the past year and a half. If the DA is accurate, then that could explain the ranking loss.
2. Google updating its search engine to the Hummingbird version. Do you think this has anything to do with the drop in rankings? The content on the website is not the best but is fairly decent, and I am still not sure how the ranking drop could be related to Hummingbird.
I have also checked Webmaster Tools to see if I somehow got hit by Penguin, and it seems fine. Thanks, mozzers
Moz Pro | Ideas-Money-Art0 -
Crawl Diagnostics
My site was crawled last night and the crawl found 10,000 errors due to a robots.txt change implemented last week, in between Moz crawls. This is obviously very bad, so we corrected it this morning. We do not want to wait until next Monday (6 days) to see if the fix has worked. How do we force a Moz crawl now? Thanks
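As an illustration only (the question does not say what the actual change was), the classic robots.txt mistake that floods a crawl report with errors is a blanket disallow, and the fix is removing a single character:

# Broken: blocks every compliant crawler from the entire site.
User-agent: *
Disallow: /

# Fixed: an empty Disallow value allows everything.
User-agent: *
Disallow: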
Moz Pro | Studio330 -
Crawl Diagnostics Summary Problem
We added a robots.txt file to our website, and it blocks some pages. The Crawl Diagnostics Summary page, however, shows no pages blocked by robots.txt. Why?
Moz Pro | iskq0 -
On-Page SEO Fixes - Are They Relative?
So, I'm implementing on-page fixes for a site that my company runs SEO services for (www.ShadeTreePowersports.com). However, I was wondering if there is a way to rank a page's SEO quality in general. As of now, it seems like the only way your recommendations can be consumed and acted on is on a per-keyword basis, and this seems to be the reason for a good number of my F-grades. Since my website sells powersports apparel and accessories, we cover a variety of applicable (but different) keywords like 'motorcycle parts' or 'snow tubes', because we sell so many different types of products. But when I look at my F-grades, SEOMoz is telling me my homepage ranks poorly for a multitude of those pertinent keywords, but only because the page isn't catered specifically to each of them (e.g. 'snowmobile parts' vs. 'water sport apparel'). With so many different types of products, catering to a specific one is impossible and would be detrimental. Is there a way to see how a page ranks without factoring in those keywords, or a better way to use these recommendations more efficiently? Thanks guys!
Moz Pro | BrandLabs0 -
Need to find all pages that link to a list of pages/PDFs
I know I can do this in OSE page by page, but is there a way I can do it in a large batch? There are 200+ PDFs for which I need to figure out what pages (if any) link to each PDF. I'd rather not do this page by page; I'd rather copy-paste the entire list of URLs I'm looking for. Are there any tools you know of that can do this?
Moz Pro | ryanwats0 -
Page and Domain Authority and other bits
Hi, I am in the process of finding blogs to have a few articles published on, with a couple of links in each. The articles will all be unique, relevant to the link I drop in, and relevant in some way to the reader. However, I have a few questions. My site is a designer menswear site, so I have picked fashion and sports sites first and foremost to have the articles published on. Now, I have found a guy who owns about 30 different websites; 2 of them are sports-based and about 10 are fashion-based, at around $10-$15 an article. I have run them all through the Open Site Explorer tool and picked out the best-ranked ones. My problem is: how do I know if a site is good enough not only to publish an article on, but to pay for as well? The sites' page authority is around the 30-45 range and their domain authority around the 35-45 range. What is a good range to have? I know the higher the better, but is 30-45 good enough to pay for? (I don't mind paying the $10 (£7 in my money) for each one.) Also, as he has quoted me in dollars, I assume they're all USA-based, so the majority of their users are USA-based. I am UK-based and only ship to the UK. Will this matter much if I am trying to gain backlinks? Obviously a UK-based site would be ideal, but is it a case of getting more external links on the web for Google to find, as long as they are relevant to the user? Any help would be great. Thanks, Will
Moz Pro | WillBlackburn0