Broken Links from Open Site Explorer
-
I am trying to find broken internal links within my site. I found a non-existent page that had a bunch of internal links pointing to it, so I ran an Open Site Explorer report for that URL, but the report is limited to 25 URLs.
Is there a way to get a report of all of my internal pages that link to this invalid URL? I tried using the link: search modifier in Google, but that returns no results.
-
Whew! Big thread.
Sometimes, when you can't find all the broken links to a page, it's easier simply to 301 redirect the page to a destination of your choice. On large sites, this helps preserve link equity even for the broken links you can't find (and for external links as well).
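Purely as an illustration of that kind of redirect (a minimal sketch, not a recommendation for any particular stack; it assumes a Python/Flask app, and both paths below are placeholders, not real URLs):

```python
# Minimal sketch of a 301 redirect, assuming a Python/Flask app.
# Both paths below are placeholders, not the actual site's URLs.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-dead-page")
def old_dead_page():
    # A 301 (permanent) redirect tells search engines to pass link equity to the new URL
    return redirect("/replacement-page", code=301)
```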
Not sure if this would help in your situation, but I hope you're getting things sorted out!
-
Jesse,
That's where I started my search, but GWMT wasn't showing this link. I can only presume that because it isn't coming back as a 404 (it's showing that "We're Sorry" message instead), Google is treating that message as content.
Thanks!
-
Lynn, that was a BIG help. I had been running that report, but was restricted to 25 responses. When I saw your suggestion to filter for only internal links, I was able to see all 127.
Big props. Thanks!
-
One more thing to add - GWMT should report all 404 links and their location/referrer.
-
Oops! I did not know this. Thanks, Irving.
-
Use the word FREE with an asterisk, because Screaming Frog is now limiting the free version to 500 pages. Xenu is better; even brokenlinkcheck.com lets you spider 3,000 pages.
500 pages makes the tool practically worthless for any site of decent size.
-
Indeed, if it's not showing a 404, that makes things a bit difficult!
You could try another way: use OSE!
Use the exact page, filter for only internal links, and boom: 127 pages that link to it. There might be more, but this should get you going!
-
Jesse:
I appreciate your feedback, but I'm surprised that the Screaming Frog report found no 404s. SEOmoz found 15 in Roger's last crawl, but those aren't the ones that I'm currently trying to solve.
The problem page is actually showing up as duplicate content, which is kinda screwy. When visiting the page, our normal 404 error doesn't appear (which our developers are still trying to figure out); instead, an error message appears:
http://www.gallerydirect.com/about-us/media-birchwood
If this were a normal 404 page, we'd probably be able to find the links faster.
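A quick way to confirm what the server actually returns for that page is to check the HTTP status code directly. A minimal sketch, assuming Python with the requests package installed (the URL is the problem page mentioned above):

```python
# Check whether the problem page returns a real 404 or a "soft 404",
# i.e. an error message served with a 200 status. Assumes the requests package.
import requests

resp = requests.get("http://www.gallerydirect.com/about-us/media-birchwood")
print(resp.status_code)  # a 200 here means crawlers see the error page as normal content
```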
-
I got tired of the confusion and went ahead and proved it. Not sure if this is the site you wanted results for, but I used the site linked in your profile (www.gallerydirect.com).
It took me about 90 seconds and I had a full list... no 404s, though.
Anyway, here's a screenshot to prove it:
http://gyazo.com/67b5763e30722a334f3970643798ca62.png
So what's the problem? Want me to crawl the FBI site next?
-
I understand. Thing is, there is a way, and the spider doesn't affect anything. Like I said, I have Screaming Frog installed on my computer, and I could run a report for your website right now and you or your IT department would never know it happened. I just don't understand the part where the software doesn't work for you, but to each their own, I suppose.
-
Jesse:
That movie was creepy, but John Goodman was awesome in it.
I started this thread because I was frustrated that OSE restricts my results to 25 links, and I simply wanted to find the rest for that particular URL. I was assuming that there was either:
a. A method for getting the rest of the links that Roger found
b. Another way of pulling these reports from a source that already spiders them (since I can't get any results using link:[URL] in Google, and Webmaster Tools isn't showing them).
Thanks to all for your suggestions.
-
Run the spider-based app from outside their "precious network," then. Hell, I could run it right now for you from my computer at work if I wanted. Use your laptop or home computer. It's a simple spider; you don't have to be within any network to run it. You could run one for CNN.com if you'd like, as well...
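For what it's worth, the kind of "simple spider" in question really is small. A rough sketch of one that lists every internal page linking to a given URL, assuming Python with the requests and beautifulsoup4 packages (the start URL and target URL below are placeholders):

```python
# Minimal sketch of a link-tracing spider: crawl one domain and report every
# page that links to a target URL. Assumes requests and beautifulsoup4.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "http://www.example.com/"           # placeholder: site to crawl
TARGET_URL = "http://www.example.com/bad-page"  # placeholder: broken URL to trace

def find_referrers(start_url, target_url, max_pages=3000):
    """Breadth-first crawl of one domain, collecting pages that link to target_url."""
    domain = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    referrers = []
    while queue and len(seen) <= max_pages:
        page = queue.popleft()
        try:
            resp = requests.get(page, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(page, anchor["href"]).split("#")[0]
            if link == target_url and page not in referrers:
                referrers.append(page)  # this page links to the broken URL
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return referrers

if __name__ == "__main__":
    for page in find_referrers(START_URL, TARGET_URL):
        print(page)
```

Nothing in that touches the site beyond ordinary GET requests, which is the point being made here.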
-
How else do you expect to trace these broken links without using a "spider?"
Obviously it's the solution. And the programs take up all of like 8 megs... so what's the problem/concern?
I second the Screaming Frog solution. It will tell you exactly what you need to know and has ZERO risk involved (or whatever it is that's hanging you up). The bazooka comparison is ridiculous, because a bazooka destroys your house. Do you really think a spider crawl will affect your website?
Spiders crawl your site and report findings. This happens often whether you download a simple piece of software or not. What do you think OSE is? Or Google?
I guess what we're saying is if you don't like the answer, then so be it. But that's the answer.
PS - OSE uses a spider to crawl your site...
PPS - Do you suffer from arachnophobia? That movie was friggin' awesome; now I want to watch old Jeff Daniels films.
PPPS - Do you guys remember John Goodman being in that movie? Wow, the late '80s and early '90s were really somethin' special.
-
John, I certainly see your point, but our IT guys would not take too kindly to me running a spider-based app from inside their precious network, which is why I was looking for a less intrusive solution.
I'm planning a campaign to return "flummoxed" to the everyday lexicon next.
-
Hi Darin,
Both of these programs are made for exactly this kind of job, and they're not huge, system-killing applications or anything. Seriously, I use one or both almost every day. I suggest downloading them and seeing how you go; I think you will be happy enough with the results.
-
The way I see it, it's much like missing the last flight home: you have a choice of getting the bus, which means it might take a little longer, or waiting for the next flight, which happens to be tomorrow evening. The bus will get you home that night.
I get the bus each and every time. I get home later than expected, I grant you, but a lot quicker than if I'd waited for the plane tomorrow.
"Bewildered": I didn't realise it had fallen out of use; it's a common word (I think) in Ireland. Oh, and I am still young (ish).
-
John:
Bewildered. There's a good word; I'm happy to see someone is keeping it alive for the younger generations.
I'm not ungrateful for your suggestions, but both involve downloading and installing a spider, which seems like overkill, much like using a bazooka to kill a housefly.
-
I am bewildered by this. I have told you one piece of free software that will do this for you, and Lynn has told you another.
Anyway, good luck with however you resolve your issue.
-
Lynn, part of the problem is definitely template-based, and one of our developers is working on that fix now. However, I also found a number of non-template-created links to this page that exist simply due to UBD error (an old COBOL programming term meaning User Brain Dead).
I need to find all of the non-template-based, UBD links that may have been created and fix them.
-
Xenu will also do a similar job and doesn't have the limit that I recall the free version of Screaming Frog has: http://home.snafu.de/tilman/xenulink.html
If you have loads of links to this missing page, it sounds like you may have a template problem, with the links getting inserted on every page or on lots of pages. In that case, if you find the point in the template, you will have fixed them all at once (if indeed that's what's happening).
-
Darin
It's a standalone piece of software you run: it crawls your website, finds broken inbound, outbound, or internal links, tells you where they are, and you go and fix them.
Enter your URL, be it a page or a directory, run it, and it will give you all of the bad links. And it won't limit you to 25.
You don't need to implement anything... run the software once, use it, and, well, bin it afterwards if you wish.
But by all means, you can do as you suggest with OSE...
Regards
John
-
John,
While I could look at implementing such a spider to run the check sitewide on a regular basis, I am not looking to go that far at the moment. For right now, I'm only looking for all of the pages on my site that link to a single incorrect URL. I would have to think that there's a solution available for such a limited search.
If I have to, I suppose I can fix the 25 that Open Site Explorer displays, wait a few days for the crawler to run again, run the report again, fix the next 25, and so on, but that's going to spread the fix out over a number of weeks, potentially.
-
Free tool, not SEOmoz-related:
http://www.screamingfrog.co.uk/seo-spider/ . Run that and it will find all of your broken links, where they are coming from, and so on.
Hope I ain't breaking any rules by posting it.