Why Do Different Tools Report 404s Differently?
-
Hi Mozers,
How come Moz reports just six 404 errors, whereas Google Search Console reports 250 and Screaming Frog only reports a dozen? It seems to me that these results are all over the place. Shouldn't these reports be more consistent?
I do understand that Search Console includes historical data and that URLs or issues need to be "marked as fixed" in order for them to go away. However, even if I do this, Google still ends up reporting far more errors than anything else.
Do 404s reported by Moz and Screaming Frog NOT include external links? It seems to me that this could be partially responsible for the issue.
Also, is there a way to efficiently track the source of the 404s besides clicking on "Linked From" within Search Console 250 times? I was looking for something like this in Moz or SF, but no luck.
Any help is appreciated.
Thanks a bunch!
-
One more sidenote - you can also use inbound link auditing tools like Moz Open Site Explorer, Ahrefs, or especially CognitiveSEO to collect as many of your incoming links as possible, then filter them for which ones 404; that way you'll know which external pages contain the broken incoming links. CognitiveSEO is especially well set up for this, but it's not a cheap tool (it does have a free trial, though).
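If your backlink tool can export the target URLs of your inbound links, re-checking which targets now 404 is easy to script. A minimal sketch, assuming a plain-text export with one target URL per line (the filename and function names are illustrative, not from any tool's API):

```python
import urllib.error
import urllib.request

def status_of(url, timeout=10):
    """Return the HTTP status code for a HEAD request to url."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses arrive as exceptions; the code is what we want.
        return e.code

def broken_targets(urls, check=status_of):
    """Filter link targets down to those that now return 404."""
    return [u for u in urls if check(u) == 404]

# Usage (assumes a plain-text export, one target URL per line):
# urls = [line.strip() for line in open("backlink_targets.txt") if line.strip()]
# print("\n".join(broken_targets(urls)))
```

Each URL that comes back broken points you at the external pages (per your backlink export) holding the dead inbound links.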
Hope that helps?
Paul
-
While essential to maintaining the quality of the website, remember this process will only show missing URLs as detected from within the website. It can't show any info about all the broken links that may exist externally on other websites.
P.
-
These tools are designed to report very different things, Eric. SF and Moz can only tell you what they discover happening within your site, since that's all they have access to. (Moz's link index isn't tied into their site crawl reports.) In other words, they report your own pages that link to other pages which no longer exist.
Search Console, on the other hand, also shows what's happening on the rest of the web outside your website. It picks up and reports on links on other websites that it tries to follow but that end in a 404 on your site.
So, for example, Search Console is the only one that's going to be able to find and report on URLs that you used to have, but that haven't been correctly redirected after the URLs changed, like in a redesign. Assuming you cleaned up your internal linking after the redesign, the only way to know that these other URLs still haven't been properly redirected is via GSC's external visibility.
If GSC is finding a lot of these, and you want to proactively find as many more as possible without waiting for GSC to discover them, there are a couple of methods for addressing this in bulk. One way is to pull the URLs from the previous version(s) of the site out of the Wayback Machine, then run those through Screaming Frog in list mode to discover which ones now 404. Or, if you have an old sitemap, you can upload that into SF, crawl it, and check for 404s.
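The Wayback Machine step can be automated: its CDX API will list every URL it has archived for a domain. A minimal sketch of building and parsing that query (the helper names are my own; the endpoint and parameters are the CDX API's):

```python
import json
import urllib.parse

CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def cdx_query_url(domain):
    """Build a Wayback Machine CDX API query that lists every
    archived URL on a domain, deduplicated by URL key."""
    params = {
        "url": domain + "/*",
        "output": "json",
        "fl": "original",
        "collapse": "urlkey",
    }
    return CDX_ENDPOINT + "?" + urllib.parse.urlencode(params)

def parse_cdx(payload):
    """CDX JSON output is a list of rows; the first row is the
    header, and each later row holds one original URL."""
    rows = json.loads(payload)
    return [row[0] for row in rows[1:]]

# Usage: fetch cdx_query_url("yourdomain.com"), run the response
# through parse_cdx(), then feed the URL list into Screaming Frog's
# list mode (or any status checker) to see which ones now 404.
```

Any URL on that list that now 404s without a redirect is a candidate for cleanup.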
A partial alternative is to run a custom report in GA that shows the referrer to the 404 page. But that will only track actual clickthroughs, whereas GSC reports on external broken links whether they've been clicked recently or not, so it's much broader and a bit more proactive. Eoghan at Rebelytics has a great post on assembling these useful reports, including links to import them directly from the Solutions Gallery. Do note, though, that this data can often be pretty "dirty", with direct/(not set) referrers or very vague ones like "Facebook".
But bottom line - use your Screaming Frog crawl to find and fix the broken links within your own site. Use the downloaded GSC list to write appropriate redirects for the URLs it finds that haven't been covered by your internal fixes. Since you can't do anything about the broken links on third-party sites, the only control you have over them is writing redirects inside your own system. Put another way - it's not worth the effort to find where on the external web the broken inbound links live, since you can't change them anyway.
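Once you've downloaded the GSC list and decided where each broken URL should point, the redirects themselves can be generated mechanically. A minimal sketch that turns an old-path/new-path CSV into Apache `Redirect 301` directives (the CSV format and filename are assumptions; adapt the output to your server's redirect syntax):

```python
import csv
import io

def redirect_lines(csv_text):
    """Turn 'old_path,new_path' CSV rows into Apache
    'Redirect 301' directives, skipping malformed rows."""
    lines = []
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) != 2:
            continue
        old, new = (field.strip() for field in row)
        lines.append(f"Redirect 301 {old} {new}")
    return lines

# Usage (hypothetical file built from your downloaded GSC list):
# print("\n".join(redirect_lines(open("redirect_map.csv").read())))
```

Paste the output into your .htaccess or server config; only map each old URL to a genuinely equivalent page, per the soft-404 caveat below.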
Hope that all makes sense?
Paul
P.S. Marking the 404s as fixed in GSC will suppress them for about 90 days, but if the broken links still exist out on the web or in the index and are re-crawled, they will show up in your GSC list again until they've been crawled often enough that Google decides to trust the 404 and may stop crawling them. Once you apply redirects, Google will stop reporting them as 404s. Do note, though, that if the equivalent content really is gone, there's nothing wrong with letting the broken links 404. Trying to artificially redirect them to unrelated pages (like the home page) will just result in Google treating them as soft 404s anyway.
-
The answer to this is actually quite simple: Moz's and Google's data isn't as up to date as Screaming Frog's, so they can be reporting historical issues that you've already resolved, which leaves them with larger numbers. The other issue is sample size - Moz, for example, may not have your complete site in its index, while Google does.
"Also, is there a way to efficiently track the source of the 404s besides clicking on "Linked From" within Search Console 250 times? I was looking for something like this is Moz or SF but no luck."
As for that question, the easiest way is through Screaming Frog, since you get "live" checks on your site (every time you crawl it again). If you look at the attached image, you'll see that I'm on the 404 response code filter, I've clicked on the URL I want to trace, opened the Inlinks tab, and I've got a list of linking pages plus other information such as anchor/alt text.
Hopefully this helps you out.
Tom