Re-running Crawl Diagnostics
-
I have made a bunch of changes thanks to the Crawl Diagnostics tool, but now I need to re-run it, as I have lost track of where I started and what still needs to be done. How do I re-run the Crawl Diagnostics tool?
-
Thanks everyone!
-
Thanks Roberto, you beat me to that answer. The one limitation is that it's capped at 3,000 URLs per crawl, but it is the best way to crawl your site outside of the regular cycle.
-
The dashboard report only runs every 7 days, so you will have to wait until then. If you want the information sooner, you can generate the raw data with the Crawl Test tool (http://pro.seomoz.org/tools/crawl-test) or use Screaming Frog.
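If you'd rather script a quick spot-check yourself between those weekly runs, here's a rough sketch of a tiny DIY crawler (assuming Python 3 with the requests and beautifulsoup4 packages installed; the start URL and page limit are placeholders to swap for your own site). It follows same-domain links and flags anything that doesn't return a 200:

```python
# Rough DIY spot-check between weekly crawl reports.
# Assumes Python 3 with requests and beautifulsoup4 installed;
# the start URL and page limit below are placeholders.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
MAX_PAGES = 500  # keep the crawl small and polite


def crawl(start_url, max_pages):
    domain = urlparse(start_url).netloc
    to_visit = [start_url]
    seen = set()
    problems = []

    while to_visit and len(seen) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)

        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            problems.append((url, f"request failed: {exc}"))
            continue

        if resp.status_code != 200:
            problems.append((url, f"status {resp.status_code}"))
            continue

        # Queue same-domain links found on the page.
        soup = BeautifulSoup(resp.text, "html.parser")
        for link in soup.find_all("a", href=True):
            target = urljoin(url, link["href"])
            if urlparse(target).netloc == domain and target not in seen:
                to_visit.append(target)

    return problems


if __name__ == "__main__":
    for url, issue in crawl(START_URL, MAX_PAGES):
        print(f"{issue}: {url}")
```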
Related Questions
-
641 Crawl Errors In My Moz Report - 190 are high priority Duplicate Content
Hi everyone, there are high and medium level errors. I was surprised to see any, especially since Google Analytics shows no errors whatsoever. 190 of the errors are duplicate content. A lot of images are showing in the Moz Crawl Report as errors, and when I click on one of these links in the report, it goes to the image displayed on its own blog post on the site, which is odd since I haven't started blogging yet. So it looks like all those errors exist because the images are appearing on posts of their own. For example, a picture of a mountain would be referred to as www.domain.com/mountains; the image would be included in the content of a page, but why give an image a page/post all of its own when that was not my intention? Is there a way I can change this?
Reporting & Analytics | SEOguy1
These are the things I see first at the top of the Moz Report. There are 2 similar home URLs at the top of the report: the HTTP status code is 200 for both (1) and (2), the link count for (1) is 71, the link count for (2) is 60, and there are no client or server errors.

Rel Canonical    Rel-Canonical Target
Yes              http://domain.co.uk/home
Yes              http://domain.co.uk/home/

Does this mean that the home page is being seen as a duplicate by Google and the search engines? The HTTP status code on every page is 200. Your help would be appreciated. Best Regards
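One quick way to double-check how those two home-page variants behave is to fetch each one and print its status code and rel-canonical target. A rough sketch, assuming Python 3 with the requests and beautifulsoup4 packages installed and using the placeholder domain from the question:

```python
# Compare the two home-page variants: status code and rel-canonical target.
# Assumes Python 3 with requests and beautifulsoup4; the domain is a placeholder.
import requests
from bs4 import BeautifulSoup

VARIANTS = [
    "http://domain.co.uk/home",
    "http://domain.co.uk/home/",
]


def canonical_href(soup):
    # rel is a multi-valued attribute in BeautifulSoup, so check each <link> explicitly.
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            return link.get("href")
    return None


for url in VARIANTS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    canonical = None
    if resp.status_code == 200:
        canonical = canonical_href(BeautifulSoup(resp.text, "html.parser"))
    print(f"{url} -> status {resp.status_code}, rel-canonical: {canonical}")
```
-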
On Google Analytics, Pages that were 301 redirected are still being crawled. What's the issue here?
URLs that we redirected are still being crawled according to Google Analytics. Since they don't exist, they have high bounce rates. What could the issue be?
Reporting & Analytics | prestigeluxuryrentals.com
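One way to start narrowing this down is to check what the old URLs actually return: a true 301, a 302, or a 200 "soft" redirect. A rough sketch, assuming Python 3 with the requests package installed; the URLs are placeholders:

```python
# Spot-check what the old (redirected) URLs actually return.
# Assumes Python 3 with requests installed; the URLs below are placeholders.
import requests

OLD_URLS = [
    "https://www.example.com/old-listing-1",
    "https://www.example.com/old-listing-2",
]

for url in OLD_URLS:
    # allow_redirects=False so we see the first response, not the final target.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location")
    print(f"{url} -> status {resp.status_code}, Location: {location}")
```
-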
Question re Google Analytics and its more accurate alternatives
Hi guys, there are two main issues we have with Google Analytics, and I'd really appreciate it if anyone has the time to answer them. We are completely missing organic traffic data from before 7/22/2013, although our account has been active since 2005. Any thoughts on that? Is it the "not provided" change that wiped out all the data, or something else? Even the data we do have contains lots of inaccuracies, and we are thinking of switching to, or at least adding, another analytics package. Any recommendations? (FYI, it turns out we do not keep access logs on the server for more than 2 months; we might fix that for future reference, but for now we are looking for an external solution.) Any help will be much appreciated. Thanks, Lily
Reporting & Analytics | wspwsp
-
Google Crawl Stats
Hi all, wondering if anyone could help me out here. I am seeing massive variations in Google crawl stats in WMT on a site I run. Just wondering if this is normal (see attached image). The site is an ecommerce site and gets a handful of new products added every couple of weeks; the total number of products is about 220k, so this is only a very small percentage. I notice in WMT I have an amber warning under Server connectivity. About 10 days back I had warnings under DNS, Server, and Robots. This was due to bad server performance. I have since moved to a new server, and the other two warnings have gone back to green; I expect the Server connectivity one to update any day now. I've included the graph for this in case it is relevant here. Many thanks for assistance. Carl (Attachments: crawlstats.png, connect.png)
Reporting & Analytics | daedriccarl
-
Webmaster tools crawl errors
Hi there, I've been tracking my Webmaster Tools crawl errors for a while now (6 months), and I'm noticing that some pages that have long been 404 are still popping up in the crawl errors. Those pages have no data for XML linking, and the remote links to them come from pages that have long been 404 as well. Those pages return a 404 error page plus a redirect to the homepage, and Google still shows them with old cached content. Does anyone have a clue why this is happening?
Reporting & Analytics | Or.Shvartz
-
How to get crawled pages indexed?
Hi, I've got over 1k pages crawled but only approximately 100 pages indexed. Although I submit them via Google Fetch and the links are indexable, they are not indexed. What should I do to get the maximum number of pages indexed? Any input highly appreciated. Thanks!
Reporting & Analytics | Rubix
-
URL Restructure - Tracking Its Success
Hi guys, I was wondering what would be the best approach to track the success of a URL restructure? What we plan to do is implement the restructure slowly by only using it on new property listing pages as they go live; any previous listings will keep the old URL structure. I thought it would be best to limit any potential problems by testing it on a smaller number of pages. So my question really is: what metrics should I be looking at to determine the success of this, given that we remove any property listings once they get rented or sold?
Reporting & Analytics | MarkScully
-
SEOMoz & GWT crawl error conflicting info
The site I'm working on has zero crawl errors according to SEOmoz (it previously had lots, since ironed out), but Google Webmaster Tools is now reporting around 5,000 errors. The dates on those are not that recent, but the Webmaster Tools error line graph is still showing approximately 5,000 up to yesterday. There is an option to bulk-tick them all as fixed, so I'm thinking/hoping GWT is just keeping a historical record that can now be deleted since it is no longer applicable. However, I'm not confident this is the case, since they are still showing on the line graph. Any ideas about this anomalous info (can I delete and forget it in GWT)? Also, a side question: I take it it's not possible to link a GA property with a GWT account if they were created under different logins/accounts? Many thanks, Dan
Reporting & Analytics | Dan-Lawrence