Crawl Diagnostics only crawling 250 pages, not 10,000
-
My crawl diagnosis has suddenly dropped from 10,000 pages to just 250. I've been tracking and working on an ecommerce website with 102,000 pages (www.heatingreplacementparts.co.uk) and the history for this was showing some great improvements. Suddenly the CD report today is showing only 250 pages! What has happened? Not only is this frustrating to work with as I was chipping away at the errors and warnings, but also my graphs for reporting to my client are now all screwed up. I have a pro plan and nothing has (or should have!) changed.
-
Hey Scott,
I just checked out your campaigns and everything looks good right now. We are really sorry about any inconvenience this may have caused. Let me update you on what happened and what we have done to make sure it doesn't happen again.
Over the weekend our server hosting provider experienced temporary power outages that lasted a few hours. Some of the databases that contain user membership status went offline, and our crawlers assumed the affected campaigns had been archived; when the database servers came back online, the crawlers treated those campaigns as having just been unarchived.
In the past, our practice when a campaign was unarchived has been to kick off a 250-page starter crawl and schedule the full crawl for 7 days out; your campaign would have received a full crawl at its next scheduled crawl regardless. This is much like what happens when you first create a campaign. It isn't ideal, though, for a couple of reasons: one is a scenario like what happened over the weekend, and another is that it can skew your historical data by leaving a 250-page crawl stuck in the middle, even when the archiving was intentional.
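The old unarchive behavior described above boils down to simple scheduling logic. Here is a hedged sketch in Python: the 250-page and 7-day figures come from this thread, but the function name and data shape are illustrative assumptions, not Moz's actual code.

```python
from datetime import datetime, timedelta

# Figures from the thread; everything else is an illustrative sketch.
STARTER_CRAWL_PAGES = 250
FULL_CRAWL_DELAY_DAYS = 7

def schedule_on_unarchive(now):
    """Old behavior: a 250-page starter crawl kicks off right away,
    with the full crawl scheduled 7 days out."""
    return {
        "starter_crawl": {"pages": STARTER_CRAWL_PAGES, "when": now},
        "full_crawl_at": now + timedelta(days=FULL_CRAWL_DELAY_DAYS),
    }

plan = schedule_on_unarchive(datetime(2012, 4, 16))
print(plan["starter_crawl"]["pages"])   # 250
print(plan["full_crawl_at"].date())     # 2012-04-23
```

This is why a lone 250-page crawl can appear in the middle of a campaign's history: the starter crawl overwrites the latest report until the full crawl runs a week later.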
Moving forward, we are implementing a change so that when you unarchive a campaign, your full crawl is scheduled and you won't receive a starter crawl. If you need more immediate crawl data, I recommend our crawl test tool, which can crawl up to 3,000 pages. The only difference is that the results come as a CSV file rather than through the pretty web interface.
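Since the crawl test results arrive as a CSV file, they are straightforward to process with standard tooling. A minimal sketch, assuming hypothetical column names (`URL`, `Status Code`) for illustration — check the header row of your actual export, as the real columns may differ:

```python
import csv
import io

# Stand-in for an exported crawl test CSV; column names are assumed.
sample = io.StringIO(
    "URL,Status Code,Title\n"
    "http://www.example.com/,200,Home\n"
    "http://www.example.com/missing,404,Not Found\n"
)

rows = list(csv.DictReader(sample))
# Collect URLs whose status code indicates a client or server error.
errors = [r["URL"] for r in rows if r["Status Code"].startswith(("4", "5"))]
print(errors)  # ['http://www.example.com/missing']
```

For a real export you would open the downloaded file with `csv.DictReader` instead of the in-memory sample.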
Let me know if you have any additional questions. Also, if you experience any issues with your service in the future, go ahead and let our support team know: at seomoz.org/help you can generate a help ticket quite easily. With a customer support ticket open, our Help Team will keep you up to date on any issues with your account and work with you to resolve them as quickly as possible.
Again, my sincere apologies for this issue with your crawl.
Have a great day!
Kenny
-
Many thanks, Keri
-
Hi Scott,
We have rolled out a fix for this! I'm waiting to hear how long it will take to get through the backlog of crawls, but did want to let you know that your campaign is being worked on.
Keri
-
Thanks Keri. If you could please keep me informed, that will help me explain this to my clients.
regards,
Scott.
-
I think we've had a bug, Scott. A couple of SEOmoz staff also got emails saying that the starter crawl had finished. We're looking into this to figure out what happened, and we really apologize. I'm assigning this to the help desk, and they'll comment when we have more information.
-
If you ran the crawler today, then yes: SEOmoz runs a 250-page starter crawl by default, and the crawler then takes about 7 days to scan all of your website's pages.