How can I find all broken links pointing to my site?
-
I help manage a large website with over 20M backlinks, and I want to find all of the broken ones. What would be the most efficient way to go about this, besides exporting the backlinks and checking each one's response code?
Thank you in advance!
-
To find all broken links pointing to your site, you can use various online tools such as Google Search Console, Ahrefs, or SEMrush. These tools allow you to analyze your website's backlink profile and identify any links that lead to pages returning 404 errors or other status codes indicating broken or inaccessible content. Additionally, you can manually check for broken links by reviewing your website's referral traffic, monitoring social media mentions, and conducting periodic audits of your site's content and backlinks.
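If you do end up scripting the status-code check yourself, the 20M figure is less daunting than it looks: dedupe the target URLs first, since many backlinks point at the same pages, and then check each unique URL once, concurrently. A minimal sketch in Python, assuming a hypothetical CSV export with a `target_url` column (adjust the column name to whatever your backlink tool actually exports):

```python
# Sketch: dedupe backlink *target* URLs from a CSV export, then check each
# unique URL once with a HEAD request. The "target_url" column name is an
# assumption -- rename it to match your tool's export format.
import csv
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def load_unique_targets(path):
    """Return the sorted set of unique target URLs from the export."""
    with open(path, newline="") as f:
        return sorted({row["target_url"] for row in csv.DictReader(f)})

def check(url):
    """Return (url, status_code); status is None on network failure."""
    req = Request(url, method="HEAD", headers={"User-Agent": "link-audit/1.0"})
    try:
        with urlopen(req, timeout=10) as resp:
            return url, resp.status
    except HTTPError as e:
        return url, e.code       # 404, 410, 500, etc.
    except OSError:
        return url, None         # DNS failure, timeout, connection refused

def find_broken(urls, workers=20):
    """Check URLs concurrently and keep only errors and unreachable hosts."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(check, urls)
    return [(u, s) for u, s in results if s is None or s >= 400]
```

Some servers reject HEAD requests, so you may want to retry 405 responses with GET before counting them as broken.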
-
To find all broken links pointing to your site, you can use online tools like Google Search Console's "Links to Your Site" report, which lists external pages linking to your site. Additionally, you can utilize website crawling tools such as Screaming Frog or Ahrefs' Site Explorer to identify broken links from external sources. Regularly monitoring and fixing broken links helps maintain website health, improves user experience, and enhances SEO performance.
-
You can find broken links pointing to your website by using website crawl tools like Screaming Frog or Ahrefs, checking crawl errors in Google Search Console, and monitoring your backlinks with tools like Ahrefs or SEMrush. Regularly checking your referral traffic and using online broken link checkers can also help you identify broken links.
-
We often use Moz Pro, which is a fantastic SEO tool, and we also use Screaming Frog to find any broken internal links.
This has helped improve the on-page SEO for our garden office company.
-
Ha, I feel silly. I do use Ahrefs, but somehow the broken backlinks tool escaped me. This is perfect, thank you!
-
Hi Steven,
I assume many of these backlinks are broken because pages were removed from your site without being properly redirected. If that is the case, Open Site Explorer's Link Opportunities (Link Reclamation) tool should be a big help. It shows all 404 URLs with inbound links that you can recapture by 301 redirecting them. Additionally, you can look up the backlinks to each of these 404 pages and reach out to each webmaster requesting they update the URL of their link.
I've also had success exporting Top Pages reports (Moz or Majestic are my preferred tools for this), running any URL with a backlink to it through Screaming Frog and pulling 404 pages/broken links (or even 302 redirects) that way. I usually find additional opportunities that do not show up in the Link Reclamation report.
Hope this helps!
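Once you've mapped each reclaimed 404 URL to its replacement page, the 301 rules don't all have to be written by hand. A minimal sketch, assuming a hypothetical CSV with `old_url` and `new_url` columns, that emits Apache `Redirect` directives:

```python
# Sketch: turn a CSV mapping of dead URLs to their replacements into Apache
# "Redirect 301" rules, so inbound link equity is recaptured without writing
# each rule by hand. The "old_url"/"new_url" column names are assumptions --
# match them to your Link Reclamation / Screaming Frog export.
import csv
from urllib.parse import urlsplit

def redirect_rules(csv_path):
    """Return one 'Redirect 301 /old-path https://new-url' line per row."""
    rules = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            old_path = urlsplit(row["old_url"]).path  # Apache matches on path
            rules.append(f"Redirect 301 {old_path} {row['new_url']}")
    return rules
```

For nginx you'd emit `rewrite ... permanent;` lines instead; either way, spot-check a sample of the generated rules before deploying.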
-
Use Ahrefs and split the crawls across the main folders of the website. Also, consider your priorities so you don't have to process all 20M at once: start with the most important sections and work through them step by step until you've covered the majority.
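To decide which folders deserve priority before splitting the crawls, a rough sketch that buckets backlink target URLs by their first path segment (the bucketing heuristic and input format are assumptions for illustration, not anything Ahrefs itself does):

```python
# Sketch: count backlink target URLs per top-level site section, so the
# sections carrying the most backlinks can be audited first rather than
# crawling all 20M rows in one pass.
from collections import Counter
from urllib.parse import urlsplit

def section_counts(urls):
    """Return (section, backlink_count) pairs, busiest section first."""
    counts = Counter()
    for url in urls:
        parts = urlsplit(url).path.strip("/").split("/")
        counts[parts[0] or "(root)"] += 1  # homepage links bucket to "(root)"
    return counts.most_common()
```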
-
I agree with Kevin. Ahrefs has that capability, assuming you don't run into size constraints. Here's a quick post that explains where to find it. (See https://ahrefs.com/blog/turning-broken-links-site-powerful-links-ahrefs-broken-link-checker/.)
-
Have you looked into Ahrefs? I know there's a ton of horsepower behind it, but I don't know if it can handle checking 20M. Good luck!