How can I find all broken links pointing to my site?
-
I help manage a large website with over 20M backlinks and I want to find all of the broken ones. What would be the most efficient way to go about this, besides exporting and checking each backlink's response code?
Thank you in advance!
-
To find all broken links pointing to your site, you can use various online tools such as Google Search Console, Ahrefs, or SEMrush. These tools allow you to analyze your website's backlink profile and identify any links that lead to pages returning 404 errors or other status codes indicating broken or inaccessible content. Additionally, you can manually check for broken links by reviewing your website's referral traffic, monitoring social media mentions, and conducting periodic audits of your site's content and backlinks.
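If you do end up going the export route, checking response codes doesn't have to be done one URL at a time; it can be scripted with a concurrent checker. Here's a minimal Python sketch using only the standard library. The CSV column name "Target URL" is an assumption — adjust it to match whatever your backlink tool actually exports.

```python
# Sketch: concurrently check the HTTP status of link-target URLs pulled
# from a backlink export. Assumes a CSV export with a "Target URL"
# column -- rename to match your tool's format.
import csv
import urllib.request
import urllib.error
from concurrent.futures import ThreadPoolExecutor

def is_broken(status):
    """Treat any 4xx/5xx response as a broken link target."""
    return status >= 400

def load_targets(csv_path, column="Target URL"):
    """Read the export and dedupe target URLs before checking them."""
    with open(csv_path, newline="") as f:
        return sorted({row[column] for row in csv.DictReader(f)})

def fetch_status(url, timeout=10):
    """Return the HTTP status for a URL, mapping errors to codes."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "link-audit/0.1"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code            # 404, 410, 500, ...
    except (urllib.error.URLError, OSError):
        return 599               # DNS failure, timeout, unreachable host

def broken_targets(urls, workers=20):
    """Check URLs in parallel and keep only the broken ones."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        statuses = pool.map(fetch_status, urls)
    return [u for u, s in zip(urls, statuses) if is_broken(s)]
```

At 20M backlinks you'd want to dedupe target URLs first (many links point at the same page), throttle per host, and respect robots.txt — this is just the skeleton.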
-
To find all broken links pointing to your site, you can use online tools like Google Search Console's "Links to Your Site" report, which lists external pages linking to your site. Additionally, you can utilize website crawling tools such as Screaming Frog or Ahrefs' Site Explorer to identify broken links from external sources. Regularly monitoring and fixing broken links helps maintain website health, improves user experience, and enhances SEO performance.
-
You can find broken links pointing to your website by using website crawl tools like Screaming Frog or Ahrefs, checking crawl errors in Google Search Console, and monitoring your backlinks with tools like Ahrefs or SEMrush. Regularly checking your referral traffic and using online broken link checkers can also help you identify broken links.
-
We often use Moz Pro; it's a fantastic SEO tool. We also use Screaming Frog, which we use to find any broken internal links.
This has helped improve the on-page SEO for our garden office company.
-
Ha, I feel silly. I do use Ahrefs, but somehow the broken backlinks tool escaped me. This is perfect, thank you!
-
Hi Steven,
I assume many of these backlinks will be broken because pages were removed from your site without being properly redirected. If that is the case, Open Site Explorer's Link Opportunities (Link Reclamation) tool should be a big help. It will show all 404 URLs with inbound links that you can recapture by 301 redirecting. Additionally, you can look up the backlinks to each of these 404 pages and reach out to each webmaster requesting they update the URL of their link.
I've also had success exporting Top Pages reports (Moz or Majestic are my preferred tools for this), running any URL with a backlink to it through Screaming Frog and pulling 404 pages/broken links (or even 302 redirects) that way. I usually find additional opportunities that do not show up in the Link Reclamation report.
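Once you have a mapping of dead paths to the live pages they should point at, generating the 301 rules can be scripted rather than written by hand. A minimal sketch that emits Apache `Redirect` directives — the paths below are hypothetical placeholders, not anything from your site:

```python
# Sketch: turn a mapping of dead paths -> live replacement URLs into
# "Redirect 301" lines suitable for .htaccess (Apache mod_alias).
# The example paths are hypothetical placeholders.
def redirect_rules(mapping):
    """Emit one sorted 'Redirect 301 old new' line per entry."""
    lines = []
    for old_path, new_url in sorted(mapping.items()):
        lines.append(f"Redirect 301 {old_path} {new_url}")
    return "\n".join(lines)

rules = redirect_rules({
    "/old-blog/post-1": "https://example.com/blog/post-1",
    "/old-blog/post-2": "https://example.com/blog/post-2",
})
print(rules)
```

For nginx or other servers the output format changes, but the same old-to-new mapping drives it.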
Hope this helps!
-
Use Ahrefs and split the crawls across the main folders of the website. Prioritize those folders so you don't have to process all 20M at once: start with the most important ones and work through them step by step until you've covered the majority.
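That folder-splitting step can also be scripted from any URL export before you crawl. A minimal Python sketch that buckets link-target URLs by their top-level folder (it assumes your priority sections map to first path segments, which may not hold for every site):

```python
# Sketch: bucket link-target URLs by their top-level folder so the
# biggest/most important sections can be checked first. Assumes site
# sections correspond to first path segments.
from collections import defaultdict
from urllib.parse import urlparse

def top_folder(url):
    """First path segment of a URL as '/segment/', or '/' for the root."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return "/" + segments[0] + "/" if segments else "/"

def bucket_by_folder(urls):
    """Group URLs by top-level folder; sort buckets by size to prioritize."""
    buckets = defaultdict(list)
    for u in urls:
        buckets[top_folder(u)].append(u)
    return buckets
```

Sorting the resulting buckets by length gives you the priority order Kevin describes: biggest sections first.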
-
I agree with Kevin. Ahrefs has that capability, assuming you don't run into size constraints. Here's a quick post that explains where to find it. (See https://ahrefs.com/blog/turning-broken-links-site-powerful-links-ahrefs-broken-link-checker/.)
-
Have you looked into Ahrefs? I know there's a ton of horsepower behind it, but I don't know whether it can handle checking 20M. Good luck!