Find all external 404 errors/links?
-
Hi All,
We recently discovered that another site was linking to ours, but with an incorrect URL, resulting in a 404 error. We only found this by pure chance, and wondered if there is a tool out there that will tell us when a site is linking to an incorrect URL on our site?
Thanks
-
If you don't have access to the logs, that could be an issue - there aren't really any automated tools out there for this, as a tool would need to crawl every external website to find the 404 errors.
I haven't tried this, so it's just an idea: go into GSC and download all the links pointing to your site (and pull more from places like Moz, Ahrefs, and Majestic), then chuck that list of URLs into Screaming Frog or URL Profiler, look at the external links, and see if any are returning a 404. Not sure if this would work, but it might be worth a shot.
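If you'd rather script the bulk check than run it through Screaming Frog, a rough Python sketch along these lines could work - assumptions flagged: the export file name and the injectable `fetch` parameter are just placeholders for illustration, not part of any tool mentioned above.

```python
import urllib.error
import urllib.request


def check_status(url):
    """Return the HTTP status code for a URL using a HEAD request."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 404s (and other error statuses) arrive as HTTPError;
        # the code attribute is exactly what we want to inspect.
        return e.code


def find_404s(urls, fetch=check_status):
    """Return the de-duplicated subset of urls that respond with a 404.

    fetch is injectable so the filtering logic can be tested without
    hitting the network.
    """
    return [u for u in dict.fromkeys(urls) if fetch(u) == 404]


# Hypothetical usage, feeding in the target URLs from a backlink export:
# broken = find_404s(["https://example.com/old-page",
#                     "https://example.com/about"])
# for url in broken:
#     print(url)
```

Running that against a few thousand exported backlink targets should surface any that point at dead pages on your site.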
Thanks
Andy
-
Great, I'll take a look. Maybe I'll run a trial to see if it does exactly what I need.
Thanks for the info!
-
Good idea!
Although some of the clients we do SEO for aren't hosting their websites on our server, so we don't have access to their server logs.
I was hoping for an automated dashboard like Moz, Screaming Frog, or Ahrefs as mentioned above. Given the number of clients we have, opening up and running through all their log files could be time-consuming.
Cheers for the info though - it may come in useful in the future, or to someone else reading this.
-
Hi
The best way I have found is to look in your server logs - it's the only true place to find out what Google is doing on your site.
Download the logs and look at all the 404 errors. It's quite simple and, depending on the size of your logs, can take around five minutes of work - the longer the time period you can analyse, the better.
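To put that log approach in concrete terms, here's a minimal Python sketch that pulls out the 404s together with the referrers that sent them (the referrer is the part that tells you *who* is linking to the wrong URL). It assumes the standard Apache/Nginx combined log format, and `own_host` is a placeholder you'd swap for your own domain so internal referrals get filtered out:

```python
import re

# Combined log format ends: "REQUEST" STATUS SIZE "REFERER" "USER-AGENT"
LINE = re.compile(
    r'"(?:GET|HEAD|POST) (?P<path>\S+) [^"]*"'
    r' (?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"'
)


def broken_inbound(log_lines, own_host="example.com"):
    """Yield (requested_path, referer) pairs for 404s referred by other sites.

    own_host is a placeholder - use your own domain so that 404s
    reached via your own internal links are excluded.
    """
    for line in log_lines:
        m = LINE.search(line)
        if not m or m.group("status") != "404":
            continue
        ref = m.group("referer")
        # "-" means no referrer was sent (direct hit or bot).
        if ref and ref != "-" and own_host not in ref:
            yield m.group("path"), ref
```

Feed it your downloaded access log and you get a short list of broken inbound links plus the external pages responsible, which you can then chase up with the linking sites.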
Thanks
Andy
-
Hi David.
Ahrefs.com offers that service: a broken-links report.
Another way to do that search: download the historic backlinks list and, with a mass checker, check where they point nowadays. I've used GScraper and its option to crawl outbound links.
Best of luck.
GR.