Broken Inner Links - Tool Recommendations?
-
Do you have any recommendations for tools that scan an entire website and report broken inner links?
I run several UGC-centered websites, and broken links, both internal and external, are an issue.
Since these websites are several hundred thousand pages each, I'm not really all that excited about running software on my desktop (Xenu Link Sleuth, for example). Any online solutions you could recommend would be great!
-
If it happens to be a WordPress site, there is a plugin called something like "Broken Link Checker." If I recall correctly, it checks both internal and outbound links. Otherwise, not too sure.
-
Ideric, did any of these suggestions answer your questions, or have you been able to otherwise find a tool for this? I know others would find the information useful.
At a previous company, we had a custom-written solution to check external links: it followed response headers until it hit a 200 OK or got five redirects deep. What we'd often find was a 301 on an external link, say from non-www to www. We wouldn't necessarily worry about fixing that, but then we'd later realize that from there the www link was a 404, OR went to a 200 OK category landing page that said "we've reorganized our site; search here for that individual resource."
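For anyone curious what that kind of check looks like, here's a minimal sketch in Python (stdlib only). The company's actual tool isn't available, so the function name and the five-hop limit are just illustrative, mirroring the behavior described above: follow each redirect manually, record every hop, and stop at a 200 OK, a terminal error like a 404, or after too many redirects.

```python
import urllib.request
import urllib.error
import urllib.parse

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects automatically,
    so we can inspect each hop in the chain ourselves."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def follow_chain(url, max_hops=5):
    """Return a list of (status_code, url) hops for a link.

    Stops at the first 2xx response, at a non-redirect error
    (e.g. 404), or after max_hops redirects.
    """
    opener = urllib.request.build_opener(NoRedirect)
    chain = []
    for _ in range(max_hops + 1):
        try:
            # HEAD is enough: we only care about status and headers.
            resp = opener.open(urllib.request.Request(url, method="HEAD"))
            chain.append((resp.status, url))
            return chain  # 2xx: the link resolves
        except urllib.error.HTTPError as e:
            chain.append((e.code, url))
            loc = e.headers.get("Location")
            if e.code in (301, 302, 303, 307, 308) and loc:
                url = urllib.parse.urljoin(url, loc)
                continue  # follow the redirect to the next hop
            return chain  # 404 or other terminal error
    return chain  # gave up after max_hops redirects
```

A chain like `[(301, 'http://example.com/x'), (404, 'http://www.example.com/x')]` is exactly the hidden-breakage case described above: the first hop looks like a harmless www redirect, but the destination is dead.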
-
Well, you've found the best solution right here at SEOmoz! Instead of wasting time learning new systems to find out whether they'll work, just solve your problem. Sign up for PRO Elite and you can crawl 100,000 pages.
-
I have used this in the past: http://www.auditmypc.com/free-sitemap-generator.asp (click the image in the top right of the instructions). It's a free sitemap generator that reports broken internal links in the process. I don't think it has any limits, although I haven't tried it on a site as large as you're describing. Just make sure you are not logged into your site when you run it. Google Webmaster Tools is OK, but you can't verify fixes very quickly there.
-
I think Xenu is your best option here. The size of the site all but rules out an online tool being able to handle it.
Just recently on a site review I had to run Xenu on a site with 160,000 pages. It only took 4 hours running at 30 threads to complete. Any modern PC should handle it fine.
-
WMT is alright, apart from the fact that you can't force Google to crawl all your pages. I doubt even a majority of the pages have been crawled and indexed by Google (though I don't know what the site is).
Plus, as you say, it only deals with internal links and 404s coming in.
Do you know what the upper limit is on how many crawl errors WMT will display?
-
I might be wrong, but I think Google WMT can accomplish this with ease. I'm looking at 1,000 of them right now. For external links you'll probably have to use Xenu =/
-
You might be out of luck on a site that size.
I think WebCEO can do this with their online version, but getting 100,000 URLs crawled will probably cost you a bomb (enough that it'd be cheaper to buy a second PC just to run Xenu, lol).
Anyway - http://www.webceo.com/ - I think it may also be possible to install the download version to a server and run it that way.
-
I use Google Webmaster Tools: go to Diagnostics, then Crawl Errors.