Crawl diagnostics taking too long
-
I started a crawl 2 days ago, and it was still going after almost 48 hours, so I deleted the entire campaign and resubmitted it. It's been 13 hours and it's still going. What happened to getting initial results in 2 hours? I've never had this problem before, and I've run several campaign crawls here.
Just wondering if there's a known issue that I can't seem to find any mention of?
Thank you
-
Hi Francisco,
Thank you for the advice.
-
I have heard of this happening. Normally SEOmoz steps in. Contact them at help@seomoz.com. They will probably have to go in and manually reset that campaign.
Related Questions
-
Why is my Moz report only crawling 1 page?
Just got this week's Moz report and it states that it has only crawled: Pages Crawled: 1 | Limit: 10,000. It was over 1,000 a couple of weeks ago. We have moved servers recently, but is there anything I have done wrong here? indigocarhire.co.uk. Thanks.
Moz Pro | RGOnline
-
Find a 4xx or 5xx link referenced in an SEO Crawl Report
So I just got the Crawl Diagnostics report for a client site and it came back with a number of 4xx errors and even one 5xx error. While I can find the URL that has the problem, I cannot find the pages that have the links pointing to these non-existent or problematic pages. Normally I would just search the site's database, but in this case I don't have access to it, as the site is on a proprietary platform with no access other than to the CMS. Is there any way to get the linking URL from the report? Thanks! (A possible do-it-yourself workaround is sketched below.)
Moz Pro | farlandlee
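One possible workaround when there is no database access: crawl the site yourself and record which pages link to the URLs the report flagged. A minimal sketch in Python, assuming the requests and beautifulsoup4 packages are installed; the start URL and broken-URL list below are placeholders to replace with the client site's real values.

```python
from collections import deque
from urllib.parse import urljoin, urldefrag

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"          # placeholder: the client site's home page
BROKEN_URLS = {                                  # placeholder: URLs flagged 4xx/5xx in the report
    "https://www.example.com/old-page.html",
}

def find_linking_pages(start_url, broken_urls, max_pages=500):
    """Breadth-first crawl of internal pages, recording which page links to each broken URL."""
    seen, queue = {start_url}, deque([start_url])
    referrers = {}  # broken URL -> set of pages that link to it

    while queue and len(seen) <= max_pages:
        page = queue.popleft()
        try:
            resp = requests.get(page, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            # resolve relative hrefs and drop fragments so URLs compare exactly
            link, _ = urldefrag(urljoin(page, a["href"]))
            if link in broken_urls:
                referrers.setdefault(link, set()).add(page)
            # only follow internal links we haven't queued yet
            if link.startswith(start_url) and link not in seen:
                seen.add(link)
                queue.append(link)
    return referrers

if __name__ == "__main__":
    for broken, pages in find_linking_pages(START_URL, BROKEN_URLS).items():
        print(broken, "is linked from:", ", ".join(sorted(pages)))
```

The max_pages cap keeps a quick check from turning into a full crawl of a large site.
-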
It's been over a month and rogerbot hasn't crawled the entire website yet. Any ideas?
Rogerbot stopped crawling the website at 308 pages this past week and has not crawled the rest of the site, which has over 1,000 pages. Any ideas on what I can do to get this fixed and crawling again?
Moz Pro | TejaswiNaidu
-
When will the 250-page crawl limit be eliminated?
Hi, I signed up yesterday for an SEOmoz Pro account and would like to know, please, when the 250-page crawl limit will be eliminated? 🙂 Thanks in advance for your help!
Moz Pro | Andarilho
-
Crawl Diagnostics Report
I'm a bit concerned about the results I'm getting from the Crawl Diagnostics report. I've updated the site with canonical URLs to remove duplicate content, and when I check the site it all displays the right values, but the report, which has just finished crawling, is still showing a lot of pages as duplicate content. Simple example: http://www.domain.com and http://www.domain.com/ are both in the duplicate content section, although both have the canonical URL set. Does each crawl check the entire site from the beginning, or just the pages it didn't have a chance to crawl the last time? This is just one of 333 duplicate content pages that have a canonical URL pointing to the right page. Can someone please explain? (A quick way to double-check the canonical tags on both variants is sketched below.)
Moz Pro | coremediadesign
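One way to verify the fix is to fetch both URL variants and compare the canonical tag each one actually serves. A minimal sketch in Python, assuming requests and beautifulsoup4; http://www.domain.com stands in for the real pages, as in the question.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs from the question; substitute the pages being reported as duplicates.
VARIANTS = ["http://www.domain.com", "http://www.domain.com/"]

def canonical_of(url):
    """Return the href of the page's rel=canonical link, or None if it has none."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag["href"] if tag and tag.has_attr("href") else None

if __name__ == "__main__":
    canonicals = {url: canonical_of(url) for url in VARIANTS}
    for url, canon in canonicals.items():
        print(f"{url} -> canonical: {canon}")
    if len(set(canonicals.values())) == 1 and None not in canonicals.values():
        print("Both variants declare the same canonical URL.")
    else:
        print("The variants disagree, or a canonical tag is missing - worth fixing before the next crawl.")
```
-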
How to remove duplicate content due to URL parameters from SEOmoz Crawl Diagnostics
Hello all, I'm currently getting back over 8,000 crawl errors for duplicate content pages. It's a Joomla site with VirtueMart, and 95% of the errors are for parameters in the URL that the customer can use to filter products. Google is handling them fine under the Webmaster Tools parameter settings, but it's pretty hard to find the other duplicate content issues in SEOmoz with all of these in the way. All of the problem parameters start with ?product_type_. Should I try to use robots.txt to stop them from being crawled, and if so, what would be the best way to include them in robots.txt? Any help greatly appreciated. (A possible robots.txt pattern is sketched below.)
Moz Pro | dfeg
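As an illustration of the robots.txt approach asked about: rogerbot obeys robots.txt, and Google's robots.txt parser supports the * wildcard, so a pattern-based Disallow can match every URL carrying the filter parameter. A minimal sketch, assuming the parameter can appear either as the first query parameter (?) or after others (&); whether rogerbot's parser honours wildcards is worth confirming in Moz's documentation before relying on this for the crawl report.

```
User-agent: *
# Block faceted-filter URLs generated by the VirtueMart product filters.
# Covers the parameter whether it is the first one (?) or a later one (&).
Disallow: /*?product_type_
Disallow: /*&product_type_
```

Blocking these in robots.txt stops them from being crawled, but URLs that are already indexed may still appear in search results; since Google is already handling the parameters via Webmaster Tools, the main benefit here is decluttering the crawl report.
-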
Wild fluctuation in number of pages crawled
I am seeing huge fluctuations in the number of pages discovered by the crawl each week. Some weeks the crawl discovers more than 10,000 pages, and other weeks I am seeing 400-500. This week, for example, I was hoping to see some changes reflected for warnings from last week's report (which discovered more than 10,000 pages); however, the entire crawl this week was 448 pages. The number of pages discovered each week seems to go back and forth between these two extremes. The more accurate count would be nearer the 10,000 mark than the 400 range. Thanks. Mark
Moz Pro | MarkWill
-
The Site Explorer crawl shows errors for files/folders that do not exist.
I'm fairly certain there is ultimately something amiss on our server, but the Site Explorer report for my website (www.kpmginstitutes.com) is showing thousands of folders that do not exist. Example: for my "About Us" page (www.kpmginstitutes.com/about-us.aspx), the report shows a link: www.kpmginstitutes.com/rss/industries/404-institute/404-institute/about-us.aspx. We do have "rss", "industries", and "404-institute" folders, but they are parallel in the architecture, not nested sequentially as the error URL indicates. Has anyone else seen these types of errors in your Site Explorer reports? (One mechanism that can produce paths like this is sketched below.)
Moz Pro | dturkington
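Not a confirmed diagnosis for the site above, but one common source of phantom nested folders like this is relative links being resolved against whatever directory the current page lives in, so the path compounds hop by hop. A small illustration with urllib.parse.urljoin, using hypothetical hrefs modelled on the folder names in the question.

```python
from urllib.parse import urljoin

# Hypothetical relative link, modelled on the folder names in the question.
relative_href = "404-institute/about-us.aspx"

# Resolved from the site root, the link points where the author expects.
print(urljoin("http://www.kpmginstitutes.com/", relative_href))
# -> http://www.kpmginstitutes.com/404-institute/about-us.aspx

# Resolved from a page that already sits under /rss/industries/404-institute/,
# the same href compounds into the phantom nested path the report shows.
print(urljoin("http://www.kpmginstitutes.com/rss/industries/404-institute/", relative_href))
# -> http://www.kpmginstitutes.com/rss/industries/404-institute/404-institute/about-us.aspx
```

If that is what is happening, switching the affected links (often RSS or template links) to root-relative or absolute URLs usually stops the crawler from inventing these folders.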