Re-running Crawl Diagnostics
-
I have made a bunch of changes thanks to the Crawl Diagnostics Tool, but now I need to re-run it, as I've lost track of where I started and what still needs to be done. How do I re-run the Crawl Diagnostics tool?
-
Thanks everyone!
-
Thanks Roberto, you beat me to that answer. The one limitation with that is it's a max of 3,000 URLs per crawl, but it is the best way to crawl your site outside of the regular cycle.
-
The dashboard report only runs every 7 days, so you will have to wait until then. If you need the information sooner, you can generate the raw data with the Crawl Test tool (http://pro.seomoz.org/tools/crawl-test) or use Screaming Frog.
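If the 3,000-URL cap on Crawl Test is limiting, rolling a small crawler yourself is another option. As a minimal sketch of the core step such tools perform — extracting and absolutizing the links on each fetched page — using only Python's standard library (the function names here are illustrative, not any tool's actual API):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

class LinkExtractor(HTMLParser):
    """Collect absolute, fragment-free hrefs from one fetched page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL, drop #fragments
                    absolute, _fragment = urldefrag(urljoin(self.base_url, value))
                    self.links.add(absolute)

def extract_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return sorted(parser.links)
```

Pairing this with `urllib.request`, a queue, and a visited set gives a basic breadth-first crawler; in practice you would also honor robots.txt and rate-limit your requests.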
Related Questions
-
Seeing very few pages analysed re: Mobile Usability in Google Search Console - why?
Hi Mozzers, Under Mobile Usability in Google Search Console, I am seeing very few website pages getting analysed - 10 out of 40 static pages on the website in question. Is this to be expected, or does this indicate an indexing problem on mobile?
Reporting & Analytics | McTaggart
-
Are these Search Console crawl errors a major concern to new client site?
We recently (4/1) went live with a new site for a client of ours. The client site was originally Point2 before they made the switch to a template site with Real Estate Webmasters. Now when I look into the Search Console I am getting the following crawl errors: 111 server errors (photos), 104 soft 404s (blogs, archives, tags), and 6,229 not found (listings). I have a few questions. I don't know much about the server errors, so I generally ignore them; my main concerns are the 404s and the not-found errors. The 404s are mostly tags and blog archives, and I wonder if I should leave them alone or set up a 301 for each to /blog. The not-found errors are all the previous listings from the IDX. My assumption is these will naturally fall away after some time, as the new ones have already been indexed, but I wonder what I should be doing here and which errors are affecting me. When we launched the new site there was a large spike in clicks (a 250% increase), which has now tapered off to an average of ~85 clicks versus ~160 at time of launch. Not sure if the crawl errors have any effect; I'm guessing not much right now. I'd appreciate your insights, Mozzers!
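If the tag/archive 404s do get pointed at /blog, the mapping can be expressed as a simple rule and sanity-checked against the Search Console export before any server rewrite rules are written. A hedged sketch (the path prefixes are assumptions for illustration, not the client's actual URL structure):

```python
from urllib.parse import urlparse

# Path prefixes that, per the question, could consolidate to /blog.
# These patterns are illustrative assumptions, not the client's real URLs.
REDIRECT_PREFIXES = ("/tag/", "/blog/archives/", "/blog/tag/")

def redirect_target(url):
    """Return the 301 destination for a soft-404 URL, or None to leave it alone."""
    path = urlparse(url).path
    if any(path.startswith(prefix) for prefix in REDIRECT_PREFIXES):
        return "/blog"
    return None
```

Running the exported error URLs through `redirect_target` shows exactly which ones a blanket tag/archive rule would catch (and, just as usefully, which it would not, like the expired IDX listings).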
Reporting & Analytics | localwork
-
Google Crawl Stats
Hi all, Wondering if anyone could help me out here. I am seeing massive variations in the Google crawl stats in WMT on a site I run. Just wondering if this is normal (see attached image). The site is an ecommerce site and gets a handful of new products added every couple of weeks. The total number of products is about 220k, so this is only a very small %. I notice in WMT I have an amber warning under Server connectivity. About 10 days back I had warnings under DNS, Server, and Robots. This was due to bad server performance. I have since moved to a new server and the other two warnings have gone back to green. I expect the Server connectivity one to update any day now. I've included the graph for this in case it is relevant here. Many thanks for assistance. Carl crawlstats.png connect.png
Reporting & Analytics | daedriccarl
-
How can I see what Google sees when it crawls my page?
In other words, how can I see the text and everything else it sees, from start to finish, on each page? I know there was a site for this, but I can't remember it.
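One rough approximation is fetching the page yourself while identifying as Googlebot, which at least reveals whether the server returns different markup to crawlers. A standard-library sketch (the UA string is Google's published Googlebot token; the function name is made up here, and this only shows what the server sends — it won't reflect how Google renders JavaScript):

```python
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_googlebot_request(url):
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

# To actually fetch (network call, so commented out here):
# req = build_googlebot_request("http://www.example.com/")
# html = urllib.request.urlopen(req).read()
```

Comparing that response with a fetch under a normal browser UA is a quick cloaking check.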
Reporting & Analytics | tiffany1103
-
Re-Launched Website: Developer Forgot to Remove noindex Tags
Our company's website has maintained decent rankings for our primary keywords over the 12 years we've been in business. We recently had our website rebuilt from the ground up, and the developers left the noindex tags on all of our 400+ pages when we launched it. I didn't catch the error for 6 days, during which time I used the Fetch feature in Google, submitting a site-wide fetch as well as manual submissions for our top 100 URLs. In addition, every page that was indexed previously had a 301 set up for it, which was pointing to a destination with a noindex.
Reporting & Analytics | yogitrout1
I caught the error today, and the developer removed the tags. Does anyone have any experience with a situation similar to this? In the SERPs we are still ranking at this moment; Google is displaying our old URLs, and they are 301 redirecting just fine. But what happens now? For 6 full days we told Google not to index any of our pages while also using the Fetch feature, contradicting ourselves.
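For future launches, a small pre-flight check for stray noindex directives can catch this class of mistake on day one. A minimal sketch using only the standard library (fetching the HTML is left to the caller so this stays self-contained, and it only checks the meta robots tag, not the X-Robots-Tag header):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flag <meta name="robots"> tags whose content includes 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = dict(attrs)
        name = (attr.get("name") or "").lower()
        content = (attr.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html):
    detector = NoindexDetector()
    detector.feed(html)
    return detector.noindex
```

Running this over the sitemap's URLs right after launch would have flagged all 400+ pages immediately.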
Any words of wisdom or advice as to what I can do at this point to avoid potential fallout? Thanks!
-
Unexplained Crawl Diagnostic Errors & Opencart
Hi, I've been looking at the crawl diagnostics for my site and trying to fix the errors that are showing up, but SEOmoz is producing some strange results. It's saying pages are duplicated up to 16 times, but those pages don't exist. It's adding "page=3", "page=4", etc. to the end of the product URL, but I don't see how it's finding those pages; nothing on the site (as far as I can tell) is linking to them. There is no "page=3", just the one product page. Again on the duplicate content, under "other URLs" it's listing URLs like "http:///product-a", but I don't see where it's finding these either, and obviously those URLs don't work. Those three slashes aren't a typo, either. So far I've reduced the number of errors from 2,005 to 543, but the rest of them I can't make sense of. Also, what does one do when you have two products, e.g. "product-a-white" and "product-a-black", to prevent SEOmoz from seeing duplicates? Canonical links won't work because there's no parent item, just those two. Google Webmaster Tools doesn't seem to have a problem, though. Using Opencart 1.5, if it helps. Cheers,
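To see which reported duplicates are really the same product page, one approach is to normalize the crawler's URLs by stripping the pagination parameter and comparing the results. A sketch (the `page` key matches the URLs in the question; adjust the set for your cart's actual query keys):

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Pagination key seen in the question's duplicate URLs; an assumption
# about which parameters are safe to drop, not an Opencart constant.
PAGINATION_PARAMS = {"page"}

def canonical_form(url):
    """Strip pagination parameters so duplicate variants collapse to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in PAGINATION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

As for the white/black pair: even without a parent product, the usual approach is a rel=canonical from both variants to whichever one you designate as primary, if treating them as one page is acceptable.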
Reporting & Analytics | AsOneDesign
-
I have two campaigns that are only crawling one page, why is this?
I have a total of three campaigns running right now, and two of them are only crawling one page. I set the campaigns up the same way; what is the problem?
Reporting & Analytics | SiteVamp
-
Never Crawled!
This page has not been crawled for five months: http://www.flowerpetal.com/index.jsp?info=13 For half of that time it was linked to from the homepage of a PR5 site: http://flowerpetal.com/ Why is this the case?
Reporting & Analytics | tylerfraser