Broken Links from Open Site Explorer
-
I am trying to find broken internal links within my site. I found a page that was non-existent but had a bunch of internal links pointing to it, so I ran an Open Site Explorer report for that URL, but it's limited to 25 URLs.
Is there a way to get a report of all of my internal pages that link to this invalid URL? I tried using the link: search modifier in Google, but it returns no results.
-
Whew! Big thread.
Sometimes, when you can't find all the broken links to a page, it's easier simply to 301 redirect the page to a destination of your choice. This helps preserve link equity, even for the broken links you can't find on large sites (and for external links as well).
Not sure if this would help in your situation, but I hope you're getting things sorted out!
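In case it helps, here's a minimal sketch of that 301 approach on Apache (assuming an .htaccess file with mod_alias enabled; both paths below are placeholders, not your actual URLs):

```apache
# .htaccess -- permanently redirect the dead URL to a live destination.
# Any link pointing at the old path, internal or external, found or not,
# now lands on the new page and passes its link equity along.
Redirect 301 /old-dead-page /new-destination-page
```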
-
Jesse,
That's where I started my search, but GWMT wasn't showing this link. I can only presume that because it isn't coming back as a 404 (it's showing that "We're Sorry" message instead), they're considering that message to be content.
Thanks!
-
Lynn, that was a BIG help. I had been running that report, but was restricted to 25 responses. When I saw your suggestion to filter for only internal links, I was able to see all 127.
Big props. Thanks!
-
One more thing to add - GWMT should report all 404 links and their location/referrer.
-
Oops! I did not know this. Thanks, Irving.
-
Use the word FREE with an asterisk, because Screaming Frog is now limiting the free version to 500 pages. Xenu is better; even brokenlinkcheck.com lets you spider 3,000 pages.
500 pages makes the tool practically worthless for any site of decent size.
-
Indeed, if it's not showing a 404, that makes things a bit difficult!
You could try another way: use OSE!
Use the exact page, filter for only internal links, and boom: 127 pages that link to it. There might be more, but this should get you going!
-
Jesse:
I appreciate your feedback, but am surprised that the ScreamingFrog report found no 404s. SEOmoz found 15 in Roger's last crawl, but those aren't the ones that I'm currently trying to solve.
The problem page is actually showing up as duplicate content, which is kinda screwy. When visiting the page, our normal 404 error doesn't appear (which our developers are still trying to figure out), but instead, an error message appears:
http://www.gallerydirect.com/about-us/media-birchwood
If this were a normal 404 page, we'd probably be able to find the links faster.
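As an aside, what Darin describes is often called a "soft 404": the server answers 200 OK but the body is really an error message, so crawlers treat it as live (duplicate) content instead of a broken page. A rough sketch of how you might flag one (a hypothetical helper for illustration, not part of any tool mentioned in this thread; the error phrases are assumptions you'd tune for your own site):

```python
def looks_like_soft_404(status_code, body):
    """Flag pages that return 200 OK but whose body reads like an error page.

    A true 404 is announced by the status code alone; a soft 404 hides the
    error in the page text, which is why GWMT and crawlers can miss it.
    """
    error_phrases = ("we're sorry", "page not found", "no longer available")
    if status_code == 404:
        return False  # a real 404 -- the status code already says so
    lowered = body.lower()
    return status_code == 200 and any(p in lowered for p in error_phrases)

# The "We're Sorry" page in question would trip the check:
print(looks_like_soft_404(200, "<h1>We're Sorry</h1> That page is unavailable."))  # True
```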
-
I got tired of the confusion and went ahead and proved it. Not sure if this is the site you wanted results for, but I used the site linked in your profile (www.gallerydirect.com).
It took me about 90 seconds and I had a full list... no 404s, though.
Anyway, here's a screenshot to prove it:
http://gyazo.com/67b5763e30722a334f3970643798ca62.png
So what's the problem? Want me to crawl the FBI site next?
-
I understand. Thing is, there is a way, and the spider doesn't affect anything. Like I said, I have Screaming Frog installed on my computer, and I could run a report for your website right now and you or your IT department would never know it happened. I just don't understand the part where the software doesn't work for you, but to each their own, I suppose.
-
Jesse:
That movie was creepy, but John Goodman was awesome in it.
I started this thread because I was frustrated that OSE restricts my results to 25 links, and I simply wanted to find the rest for that particular URL. I was assuming that there was either:
a. A method for getting the rest of the links that Roger found
b. Another way of pulling these reports from someone who already spiders them (since I can't get any using the link:[URL] in Google and Webmaster Tools isn't showing them).
Thanks to all for your suggestions.
-
Run the spider-based app from outside their "precious network," then. Hell, I could run it right now for you from my computer at work if I wanted. Use your laptop or home computer. It's a simple spider; you don't have to be within any network to run it. You could run one for CNN.com if you'd like, as well...
-
How else do you expect to trace these broken links without using a "spider?"
Obviously it's the solution. And the programs take up all of like 8 megs... so what's the problem/concern?
I second the Screaming Frog solution. It will tell you exactly what you need to know and has ZERO risk involved (or whatever it is that's hanging you up). The bazooka comparison is ridiculous, because a bazooka destroys your house. Do you really think a spider crawl will affect your website?
Spiders crawl your site and report findings. This happens often whether you download a simple piece of software or not. What do you think OSE is? Or Google?
I guess what we're saying is if you don't like the answer, then so be it. But that's the answer.
PS - OSE uses a spider to crawl your site...
PPS - Do you suffer from arachnophobia? That movie was friggin' awesome; now I want to watch old Jeff Daniels films.
PPPS - Do you guys remember John Goodman being in that movie? Wow, the late '80s and early '90s were really somethin' special.
-
John, I certainly see your point, but our IT guys would not take too kindly to me running a spider-based app from inside their precious network, which is why I was looking for a less intrusive solution.
I'm planning on a campaign to revive "flummoxed" to the everyday lexicon next.
-
Hi Darin,
Both of these programs are made for exactly this kind of job, and they're not huge, system-killing programs or anything. Seriously, I use one or both almost every day. I suggest downloading them and seeing how you go; I think you'll be happy enough with the results.
-
The way I see it, it's much like missing the last flight home: you have a choice of taking the bus, which means you might take a little longer, or waiting for the next flight, which happens to be tomorrow evening. The bus will get you home that night.
I take the bus each and every time. I get home later than expected, I grant you, but a lot quicker than if I'd waited for the plane tomorrow.
Bewildered: I didn't realise it had fallen out of common usage; it's a common word (I think) in Ireland. Oh, and I am still young (ish).
-
John:
Bewildered. Now there's a good word; I'm happy to see someone keeping it alive for the younger generations.
I'm not ungrateful for your suggestions, but both involve downloading and installing a spider, which seems like overkill, much like using a bazooka to kill a housefly.
-
I am bewildered by this. I have told you one piece of free software that will do this for you, and Lynn has told you another.
Anyway, good luck with however you resolve your issues.
-
Lynn, part of the problem is definitely template-based, and one of our developers is working on that fix now. However, I also found a number of non-template-created links to this page simply due to UBD error (an old COBOL programming term meaning User Brain Dead).
I need to find all of the non-template based, UBD links that may have been created and fix them.
-
Xenu will also do a similar job and doesn't have the limit that I recall the free version of Screaming Frog has: http://home.snafu.de/tilman/xenulink.html
If you have loads of links to this missing page it sounds like you maybe have a template problem with the links getting inserted on every or lots of pages. In that case if you find the point in the template you will have fixed them all at once (if indeed it is like this).
-
Darin
It's a standalone piece of software. You run it, it crawls your website, finds broken inbound, outbound, or internal links, and tells you what they are; you go and fix them.
Enter your URL, be it a page or directory, run it, and it will give you all the bad links. And it won't limit you to 25.
You don't need to implement anything... run the software once, use it, and bin it afterwards if you wish.
But by all means, you can do as you suggest with OSE...
Regards
John
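For the curious, here is a rough sketch of what these desktop spiders do under the hood: parse each page's HTML for anchors and record which pages link to the target URL. (Illustrative stdlib-only Python fed canned HTML rather than live fetches; this is not actual Screaming Frog or Xenu code, and a real tool would crawl over HTTP and resolve relative URLs.)

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def pages_linking_to(pages, target):
    """Given {url: html}, return the URLs whose HTML links to `target`."""
    referrers = []
    for url, html in pages.items():
        parser = LinkCollector()
        parser.feed(html)
        if target in parser.hrefs:
            referrers.append(url)
    return referrers

# Tiny demo on canned HTML (a real spider would fetch these pages first):
site = {
    "/": '<a href="/about">About</a>',
    "/about": '<a href="/about-us/media-birchwood">Media</a>',
}
print(pages_linking_to(site, "/about-us/media-birchwood"))  # ['/about']
```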
-
John,
While I could look at implementing such a spider to run the check sitewide on a regular basis, I am not looking to go that far at the moment. For right now, I'm only looking for all of the pages on my site that link to a single incorrect URL. I would have to think that there's a solution available for such a limited search.
If I have to, I suppose I can fix the 25 that Open Site Explorer displays, wait a few days for the crawler to run again, then run the report again, fix the next 25, then so on and so on, but that's going to spread the fix out potentially over a number of weeks.
-
Free tool, non-SEOmoz-related:
http://www.screamingfrog.co.uk/seo-spider/ , run that, and it will find all broken links, where they are coming from, etc.
Hope I ain't breaking any rules posting it.