Broken Inner Links - Tool Recommendations?
-
Do you have any recommendations for tools that scan an entire website and report broken inner links?
I run several UGC-centered websites, and broken links, both internal and external, are an issue.
Since these websites are several hundred thousand pages each, I am not really excited about running desktop software (Xenu Link Sleuth, for example). Any online solutions you could recommend would be great!
-
If it happens to be a WordPress site, there is a plugin called something like "Broken Link Checker." If I recall correctly, that checks internal and outbound links. Otherwise, not too sure.
-
Ideric, did any of these suggestions answer your questions, or have you been able to otherwise find a tool for this? I know others would find the information useful.
At a previous company, we had a custom-written solution to check external links: it followed response headers until it saw a 200 OK or got five levels deep. What we'd often find was a 301 on an external link going from non-www to www. That alone wouldn't necessarily be worth fixing, but we'd later realize that from there, the www URL was a 404, OR returned a 200 OK on a category landing page that said "we've reorganized our site, search here for that individual resource."
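The redirect-chain check described above can be sketched in a few lines. This is a hypothetical illustration, not the original tool: it follows 3xx hops until a final status or a depth limit of five, with the fetch function injectable so the logic can be shown without a network (a real fetch could issue HEAD requests via `urllib.request`).

```python
# Hypothetical sketch of the redirect-chain check described above:
# follow 3xx hops until a final status or a depth limit of five.
from urllib.parse import urljoin

def resolve_link(url, fetch, max_hops=5):
    """Follow redirects until a non-3xx status or max_hops is hit.

    fetch(url) must return (status_code, location_header_or_None).
    Returns (final_url, final_status, hops_taken); final_status is
    None if we gave up after too many redirects.
    """
    hops = 0
    while hops < max_hops:
        status, location = fetch(url)
        if 300 <= status < 400 and location:
            url = urljoin(url, location)  # resolve relative Location headers
            hops += 1
            continue
        return url, status, hops
    return url, None, hops  # gave up: too many redirects

# Fake chain matching the scenario above: non-www 301s to www,
# and the www URL then turns out to be a 404.
fake = {
    "http://example.com/a": (301, "http://www.example.com/a"),
    "http://www.example.com/a": (404, None),
}
print(resolve_link("http://example.com/a", lambda u: fake[u]))
```

With the fake chain above, the resolver surfaces exactly the pattern the post describes: one 301 hop landing on a 404.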
-
Well, you've found the best solution right here at SEOmoz! Instead of wasting time learning new systems to find out whether they'll work, just solve your problem: sign up for PRO Elite and you can crawl 100,000 pages.
-
I have used this in the past: http://www.auditmypc.com/free-sitemap-generator.asp (click the image in the top right of the instructions). It's a free sitemap-generation tool that reports broken internal links in the process. I don't think it has any limits, although I have not tried it on a site as large as yours. Just make sure you are not logged into your site when you run it. Google Webmaster Tools is OK, but you can't quickly verify the changes you've made.
-
I think Xenu is your best option here. At that size, it's unlikely a web-based tool could handle the site.
Just recently on a site review I had to run Xenu on a site with 160,000 pages. It only took 4 hours running at 30 threads to complete. Any modern PC should handle it fine.
-
WMT is alright, apart from the fact you can't force Google to crawl all your pages. I would doubt that even a majority of the pages were crawled and indexed by Google (though I don't know what the site is).
Plus, as you say, it only deals with internal links and 404s coming in.
Do you know what the upper limit is on how many crawl errors WMT will display?
-
I might be wrong, but I think Google WMT can accomplish this with ease. I'm looking at 1000 crawl errors right now. For external links you'll probably have to use Xenu =/
-
You might be out of luck on a site that size.
I think WebCEO can do this with their online version, but getting 100,000 URLs crawled will cost you a bomb (the sort of money where it'd be cheaper to buy a second PC to run Xenu, lol).
Anyway - http://www.webceo.com/ - I think it may also be possible to install the download version to a server and run it that way.
-
I use Google Webmaster Tools: go to Diagnostics, then Crawl Errors.