Re-running Crawl Diagnostics
-
I have made a bunch of changes thanks to the Crawl Diagnostics tool, but now I need to re-run it, as I have lost track of where I started and what still needs to be done. How do I re-run the Crawl Diagnostics tool?
-
Thanks everyone!
-
Thanks Roberto, you beat me to that answer. The one limitation is that it's capped at 3,000 URLs per crawl, but it is the best way to crawl your site outside of the regular cycle.
-
The dashboard report only runs every 7 days, so you will have to wait until then. If you want the information sooner, you can generate the raw data with the Crawl Test tool (http://pro.seomoz.org/tools/crawl-test) or use Screaming Frog.
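If you just want a rough list of your pages and their status codes without waiting for the weekly report, a do-it-yourself crawl is only a few lines of Python. A minimal sketch of the link-discovery step, using only the standard library (the example.com URLs are placeholders, and a real crawl would fetch each page with urllib.request and respect robots.txt):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects absolute, same-host links from anchor tags on one page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative hrefs against the page URL, keep internal links only
        absolute = urljoin(self.base_url, href)
        if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
            self.links.add(absolute)

def extract_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return sorted(parser.links)
```

Feed each fetched page through `extract_links`, push the results onto a queue, and stop at whatever page cap you like (the 3,000-URL cap mentioned above is the Crawl Test tool's limit, not anything inherent).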
Related Questions
-
Why might my website's crawl rate... explode?
Hi Mozzers, I have a website with approx. 110,000 pages. According to Search Console, Google will usually crawl, on average, anywhere between 500 and 1,500 pages per day. However, lately the crawl rate seems to have increased rather drastically:
9/5/16 - 923
9/6/16 - 946
9/7/16 - 848
9/8/16 - 11072
9/9/16 - 50923
9/10/16 - 60389
9/11/16 - 17170
9/12/16 - 79809
I was wondering if anyone could offer any insight into why this may be happening and whether I should be concerned? Thanks in advance for all advice.
Reporting & Analytics | | Silkstream
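One way to put a number on "drastically" is to flag any day whose crawl count jumps well past the recent baseline. A minimal sketch using the poster's figures (the `flag_crawl_spikes` helper and its thresholds are hypothetical, not anything from Search Console):

```python
from statistics import median

def flag_crawl_spikes(daily_counts, window=3, factor=3.0):
    """Flag days whose crawl count exceeds `factor` times the
    median of the preceding `window` days."""
    spikes = []
    for i in range(window, len(daily_counts)):
        day, count = daily_counts[i]
        baseline = median(c for _, c in daily_counts[i - window:i])
        if count > factor * baseline:
            spikes.append((day, count, baseline))
    return spikes

# Crawl stats quoted in the question above
counts = [("9/5/16", 923), ("9/6/16", 946), ("9/7/16", 848),
          ("9/8/16", 11072), ("9/9/16", 50923), ("9/10/16", 60389),
          ("9/11/16", 17170), ("9/12/16", 79809)]
```

On these figures the jump clearly starts on 9/8; after that the spike itself becomes the baseline, which is why a rolling median is a more honest comparison than the all-time average.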
On Google Analytics, pages that were 301-redirected are still being crawled. What's the issue here?
URLs that we redirected are still being reported in Google Analytics. Since they don't exist, they have high bounce rates. What could the issue be?
Reporting & Analytics | | prestigeluxuryrentals.com
Re-Launched Website: Developer Forgot to Remove noindex Tags
Our company's website has maintained decent rankings for our primary keywords over the 12 years we've been in business. We recently had our website rebuilt from the ground up, and the developers left the noindex tags on all 400+ pages when we launched it. I didn't catch the error for 6 days, during which time I used the Fetch feature in Google, submitting a site-wide fetch as well as manual submissions for our top 100 URLs. In addition, every page that was indexed previously had a 301 set up for it, pointing to a destination with a noindex.
I caught the error today, and the developer removed the tags. Does anyone have any experience with a situation similar to this? In the SERPs, we are still ranking at this moment; our old URLs are displayed, and they are 301 redirecting just fine. But what happens now? For 6 full days, we told Google not to index any of our pages while also using the Fetch feature, contradicting ourselves.
Any words of wisdom or advice as to what I can do at this point to avoid potential fallout? Thanks
Reporting & Analytics | | yogitrout1
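Whatever Google does next, it's worth scripting a check so a stray noindex never sits unnoticed for 6 days again. A minimal sketch of the meta-tag check, using only the standard library (in practice you'd fetch each URL with urllib.request and also inspect the `X-Robots-Tag` response header, which this sketch does not cover):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Detects a noindex directive in <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

def has_noindex(html):
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex
```

Run it over your top URLs after every deploy and alert on any `True`; catching this on day one instead of day six is the whole value of the check.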
Webmaster Tools crawl errors
Hi there, I've been tracking my Webmaster Tools crawl errors for a while now (6 months), and I'm noticing that some pages that have long been 404 are still popping up in the crawl errors. Those pages have no data for XML linking, and the remote links pointing at them come from pages that are also long-gone 404s. Those pages return a 404 error page plus a redirect to the homepage, and Google still notices them with old cached content. Does anyone have a clue why this is happening?
Reporting & Analytics | | Or.Shvartz
Get a list of robots.txt-blocked URLs and tell Google to crawl and index them
Some of my key pages got blocked by the robots.txt file. I have made the required changes to the robots.txt file, but how can I get the list of blocked URLs? My Webmaster Tools page (Health > Blocked URLs) shows only a number, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs from, and how can I get them back into the search results? One other interesting point I see is that the blocked pages are still showing up in searches. The title appears fine, but the description shows "blocked by robots.txt". I need an urgent recommendation, as I do not want to see my traffic drop any further.
Reporting & Analytics | | csfarnsworth
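Since Webmaster Tools only shows a count, you can rebuild the blocked-URL list yourself by replaying your robots.txt rules against a list of your URLs (e.g. from your sitemap) with the standard library's `urllib.robotparser`. A minimal sketch; the rules and example.com URLs here are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Stand-in for your site's actual robots.txt contents
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

# Stand-in for the URL list you'd pull from your sitemap
candidate_urls = [
    "https://www.example.com/",
    "https://www.example.com/private/page.html",
    "https://www.example.com/tmp/file.html",
    "https://www.example.com/products/widget",
]

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Everything Googlebot is not allowed to fetch under these rules
blocked = [u for u in candidate_urls if not rp.can_fetch("Googlebot", u)]
```

This also explains the second observation: a robots.txt block stops Google from fetching a page, not from indexing its URL, so known pages can still appear in results with a "blocked by robots.txt" description until they are recrawled.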
Run Crawl Diagnostics
Hi, I have fixed some errors by referring to the error list. How do I re-run Crawl Diagnostics immediately to check for the errors again? Thanks
Reporting & Analytics | | AlfredLim
I have two campaigns that are only crawling one page. Why is this?
I have a total of three campaigns running right now, and two of them are only crawling one page. I set the campaigns up the same way; what is the problem?
Reporting & Analytics | | SiteVamp
Subdomain and relative link paths cause crawl errors
I have a WordPress blog on our subdomain, and we use relative paths on our domain. It appears that Googlebot is crawling from the subdomain categories back to the domain's relative paths, which of course results in hundreds of 404 pages. Any suggestions as to how to resolve this issue without changing the relative path structure of our domain? I can provide more information if need be. While I realize these issues are not that pressing, I'd obviously like to remove as many errors as possible. If anyone has encountered this problem, especially in WordPress, I'd really like to hear your solution or lack thereof. Thank you in advance.
Reporting & Analytics | | BethA
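The mechanics of this bug are easy to demonstrate: a root-relative href resolves against whatever host the page was served from, so a path that exists on the main domain 404s when Googlebot follows it from the blog subdomain. A minimal sketch with `urllib.parse.urljoin` (the example.com hosts and paths are placeholders):

```python
from urllib.parse import urljoin

# A blog post served from the subdomain
page_on_subdomain = "https://blog.example.com/category/seo/"

# A root-relative link intended for the main www site
resolved = urljoin(page_on_subdomain, "/services/pricing")
# Resolves under blog.example.com, where that path does not exist,
# so Googlebot following it from the subdomain gets a 404.
```

This is why the usual fix is to make cross-property links absolute (full scheme and host) in the templates that are shared between the subdomain and the main domain, without touching relative links that stay within one host.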