Why Do Different Tools Report 404s Differently?
-
Hi Mozers,
How come Moz reports just six 404 errors, whereas Google Search Console reports 250 and Screaming Frog only reports a dozen? It seems to me that these results are all over the place. Shouldn't these reports be more consistent?
I do understand that Search Console includes historical data and that URLs or issues need to be "marked as fixed" in order for them to go away. However, even when I do this, Google still ends up reporting far more errors than any other tool.
Do 404s reported by Moz and Screaming Frog NOT include external links? It seems to me that this could be partially responsible for the issue.
Also, is there a way to efficiently track the source of the 404s besides clicking on "Linked From" within Search Console 250 times? I was looking for something like this in Moz or SF but had no luck.
Any help is appreciated.
Thanks a bunch!
-
One more side note - you can also use inbound link auditing tools like Moz Open Site Explorer, Ahrefs, or especially CognitiveSEO to collect as many of your incoming links as possible, then filter them for which ones 404 - then you'll know which external pages contain the broken incoming links. CognitiveSEO is especially well set up to do this, but it's not a cheap tool (it does have a free trial, though).
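If you'd rather do that 404 filtering yourself on an exported list of link targets, here's a rough sketch of the kind of script that could do it - plain Python using the requests library. The CSV filename and "Target URL" column name are just placeholders for whatever your link tool actually exports:

```python
# Rough sketch: check which of your exported backlink target URLs return 404.
# Assumes a CSV export from your link tool with a "Target URL" column --
# adjust the filename and column name to match your actual export.
import csv
import requests

def find_404_targets(csv_path, url_column="Target URL"):
    broken = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get(url_column, "").strip()
            if not url:
                continue
            try:
                # HEAD is lighter; fall back to GET if the server rejects it.
                resp = requests.head(url, allow_redirects=True, timeout=10)
                if resp.status_code in (403, 405):
                    resp = requests.get(url, allow_redirects=True, timeout=10)
                if resp.status_code == 404:
                    broken.append(url)
            except requests.RequestException:
                broken.append(url)  # unreachable; worth a manual look
    return broken

if __name__ == "__main__":
    for url in find_404_targets("backlink_export.csv"):
        print(url)
```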
Hope that helps?
Paul
-
While this process is essential to maintaining the quality of the website, remember it will only show missing URLs as detected from within the website itself. It can't show any info about the broken links that may exist externally on other websites.
P.
-
These tools are designed to report very different things, Eric. SF and Moz are only able to tell you what they can discover about what's happening within your site, since that's all they have access to (Moz's link index isn't tied into their site crawl reports). So they report your own pages that are linking to other pages which no longer exist.
Search Console, on the other hand, also shows what's happening on the rest of the web outside your website. So it's picking up and reporting links on other websites that it tries to follow but that end in a 404 on your site.
So, for example, Search Console is the only one that's going to be able to find and report on URLs that you used to have, but that haven't been correctly redirected after the URLs changed, like in a redesign. Assuming you cleaned up your internal linking after the redesign, the only way to know that these other URLs still haven't been properly redirected is via GSC's external visibility.
If GSC is finding a lot of these, and you want to proactively find as many more as possible without waiting for GSC to surface them, there are a couple of methods available to address them in bulk. One is to run a Screaming Frog crawl of the previous version(s) of the site in the Wayback Machine, then rerun those URLs in list mode to discover which ones now 404. Or, if you have an old sitemap, you can upload that into SF, run it, and check for 404s.
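(If you don't want to tie up Screaming Frog for the old-sitemap check, a short script can do much the same thing. This is just a minimal sketch assuming a standard XML sitemap - the sitemap URL is a placeholder:)

```python
# Minimal sketch: pull URLs from an old XML sitemap and report which ones now 404.
# The sitemap URL below is a placeholder -- point it at your archived/old sitemap.
import requests
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(sitemap_url):
    xml = requests.get(sitemap_url, timeout=10).content
    root = ET.fromstring(xml)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc") if loc.text]

def report_404s(sitemap_url):
    for url in urls_from_sitemap(sitemap_url):
        try:
            status = requests.get(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None  # unreachable; check manually
        if status == 404:
            print(f"404  {url}")

if __name__ == "__main__":
    report_404s("https://www.example.com/old-sitemap.xml")  # placeholder URL
```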
A partial alternative is to run a custom report in GA that shows the referrer to the 404 page. But it will only track actual clickthroughs, whereas GSC reports on external broken links whether they've been clicked on recently or not - so GSC is much broader and a bit more proactive. Eoghan at Rebelytics has a great post on assembling these useful reports, including links to import them directly from the Solutions Gallery. Do note, though, that this data can often be pretty "dirty", with direct/(not set) referrers or very vague ones like "Facebook".
But bottom line - use your Screaming Frog crawl to find and fix the broken links within your own site. Use the downloaded GSC list to write appropriate redirects for the URLs it finds that haven't been covered by your internal fixes. Since you can't do anything about the broken links on third-party sites, the only control you have over them is writing redirects inside your own system. Put another way - it's not worth the effort to find where on the external web the broken inbound links are coming from, since you can't change them anyway.
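If that downloaded GSC list turns into a long set of redirects to write, a small helper script can at least generate the rules for you. This is just a sketch - it assumes Apache-style "Redirect 301" lines and a hypothetical redirect_map.csv you'd build by matching each old URL to its best new equivalent; swap in whatever syntax your server or CMS actually uses:

```python
# Sketch: turn an old->new URL mapping (e.g. built from your downloaded GSC 404 list)
# into 301 redirect rules. Output here is Apache-style "Redirect 301" lines --
# adapt the format to your actual server or CMS.
import csv
from urllib.parse import urlparse

def build_redirect_rules(mapping_csv, old_col="Old URL", new_col="New URL"):
    rules = []
    with open(mapping_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            old_path = urlparse(row[old_col]).path or "/"
            rules.append(f"Redirect 301 {old_path} {row[new_col]}")
    return rules

if __name__ == "__main__":
    # "redirect_map.csv" is a placeholder: one row per old URL you've matched
    # to its closest new equivalent (leave out the ones that should stay 404).
    for rule in build_redirect_rules("redirect_map.csv"):
        print(rule)
```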
Hope that all makes sense?
Paul
P.S. Marking the 404s as fixed in GSC will suppress them for about 90 days, but if the broken links still exist out on the web or in the index and are re-crawled, they will start showing up in your GSC list again until they've been crawled often enough that Google decides to trust the 404 and may stop crawling them. Once you apply redirects, Google will stop reporting them as 404s. Do note, though, that if the equivalent content really is gone, there's nothing wrong with letting the broken links 404. Trying to artificially redirect them to unrelated pages (like the home page) will just result in Google treating them as soft 404s anyway.
-
The answer to this is actually quite simple: Moz and Google have data that's not as up to date as Screaming Frog's, so they can be reporting historical issues that you've already resolved, which is why their numbers come out larger. The other issue is sample size - Moz, for example, may not have your complete site in its index, while Google does.
"Also, is there a way to efficiently track the source of the 404s besides clicking on "Linked From" within Search Console 250 times? I was looking for something like this is Moz or SF but no luck."
As for that question, the easiest way is through Screaming Frog, since you get "live" checks on your site every time you crawl it again. If you look at the attached image, you'll see I've filtered to the 404 response code, clicked on the URL I want to trace, and under Inlinks I've got a list of the pages linking to it, along with other information such as anchor/alt text.
Hopefully this helps you out.
Tom