Crawl Diagnostics Report Lacks Information
-
When I look at the crawl diagnostics, SEOMoz tells me there are 404 errors.
This is understandable, because some pages were removed.
What this report doesn't tell me is how those pages were discovered.
This is a very important piece of information, because it would tell me there are links pointing to those pages, either internal or external. I believe the internal links have been removed.
If the report told me how it found the link, I would be able to take immediate action. Without that information, I have to do a lot of investigation. And when you have a million pages, that isn't easy.
Some possibilities:
- The crawler remembered the page from the previous crawl.
- There was a link from an index page - i.e. it is still in the database
- There was an individual link from another story - so now there are broken links
- Ditto, but it is on a static index page
- The link was from an external source - I need to make a redirect
Am I missing something, or is this a feature the SEOMoz crawler doesn't have yet?
What can I do (other than check all my pages) to discover this?
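To give a concrete picture of the investigation involved, here is a rough sketch (Python standard library only, with placeholder URLs) that takes the removed URLs plus a list of candidate pages - index pages, known referrers - and reports which candidates still link to a removed URL:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkCollector(HTMLParser):
    """Collect the absolute href targets of every <a> tag on a page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(urljoin(self.base_url, value))


def pages_still_linking(candidate_pages, removed_urls):
    """Return {candidate page: [removed URLs it still links to]}."""
    removed = set(removed_urls)
    report = {}
    for page in candidate_pages:
        try:
            with urllib.request.urlopen(page, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that cannot be fetched
        parser = LinkCollector(page)
        parser.feed(html)
        hits = sorted(parser.links & removed)
        if hits:
            report[page] = hits
    return report


if __name__ == "__main__":
    # Placeholder inputs - swap in your own index pages and removed URLs.
    candidates = ["https://www.example.com/", "https://www.example.com/archive/"]
    removed = ["https://www.example.com/old-story.html"]
    for page, hits in pages_still_linking(candidates, removed).items():
        print(page, "still links to:", ", ".join(hits))
```

On a million-page site this only helps once you have a shortlist of candidate pages, which is exactly why knowing how the crawler found each broken link matters.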
-
OK thank you, Ralph
I can work on that.
-
I think it's the SEOMoz crawler, but what I have found is that the error reports are limited here, whereas GWT is much more extensive and shows the links leading to each error. My guess is that SEOMoz limits the number of crawl errors it shows due to limits set on its crawler, i.e. while its crawl is comprehensive, it's not going to capture everything Google does.
-
Thank you Ralph.
Yes, had it for years. So is this a GWT report? I thought it was SEOMoz!
No, not IIS - Linux.
-
If you download the CSV file for the crawl, you can sort it by HTTP status to get all of the 404 errors together. There is also a column that contains the referrer, which provides the information you are after.
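A minimal sketch of doing that programmatically, assuming the export's column labels are "URL", "HTTP Status Code" and "Referrer" - check the header row of your own export, since the actual labels may differ:

```python
import csv
from collections import defaultdict

# Assumed column names - verify against the header row of your Moz crawl export.
STATUS_COL = "HTTP Status Code"
URL_COL = "URL"
REFERRER_COL = "Referrer"


def broken_links_by_referrer(csv_path):
    """Group the 404 URLs in the crawl export by the page that referred to them."""
    by_referrer = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get(STATUS_COL, "").strip() == "404":
                by_referrer[row.get(REFERRER_COL, "")].append(row.get(URL_COL, ""))
    return by_referrer


if __name__ == "__main__":
    for referrer, urls in broken_links_by_referrer("crawl_export.csv").items():
        print(referrer or "(no referrer recorded)")
        for url in urls:
            print("   ->", url)
```

Grouping by referrer makes it easier to see at a glance whether the broken links come from index pages, individual stories, or external sources.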
-
This may be a silly question, but have you got Google Webmaster Tools set up? That will show you the source of the errors.
If your site is on IIS, then you should also use the awesome IIS SEO Toolkit provided by Microsoft for free.