Do 404 Pages from Broken Links Still Pass Link Equity?
-
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this.
When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost?
We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the low-traffic pages themselves; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the whole site ranks in Google, especially pages with no links pointing to them. Amazon is a perfect example: thousands of pages with no external links that rank #1 in Google for their product name.
Anyone have a clear answer? Thanks!
-
First off, thanks, everyone, for your replies.

I'm well versed in the best practices for 301 redirects, sitemaps, etc. In other words, I fully know the optimal way to handle this. But this is one of those situations where there are so many redirects involved (thousands) for a large site that I want to make sure that what we are doing is fully worth the development time.
We are migrating a large website that was already migrated to a different CMS several years ago. There are thousands of legacy 301 redirects already in place for the current site, and many of the pages being REDIRECTED TO (from the old URL versions) receive little, if any, traffic. We need to decide if the work of carrying those redirects forward is worth it.
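To make that call, a rough sketch like the one below could flag which old URLs still have external links pointing at them; those redirects are the ones most worth carrying over. It assumes only the standard-library `csv` module plus two hypothetical exports: `legacy_redirects.csv` (old/new URL pairs) and `backlinks.csv` (one external link target per row) from whatever link tool you use.

```python
# Sketch: flag legacy redirect sources that still have external backlinks.
# Assumes two hypothetical CSV exports:
#   legacy_redirects.csv -> columns: old_url, new_url
#   backlinks.csv        -> column:  target_url (one row per external link)
import csv

def load_column(path, column):
    """Read one column from a CSV file into a list of stripped strings."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row[column].strip() for row in csv.DictReader(f)]

def main():
    redirects = {}
    with open("legacy_redirects.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            redirects[row["old_url"].strip()] = row["new_url"].strip()

    linked_targets = set(load_column("backlinks.csv", "target_url"))

    keep = {old: new for old, new in redirects.items() if old in linked_targets}
    drop = [old for old in redirects if old not in linked_targets]

    print(f"{len(keep)} redirects still have external links and are worth keeping")
    print(f"{len(drop)} redirects have no known external links pointing at them")

if __name__ == "__main__":
    main()
```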
I'm not as worried about broken links for pages that don't get any traffic (although we ideally want 0 broken links). What I am most worried about, however, is losing domain authority and the whole site potentially ranking a little bit lower overall as a result.
Nakul's response (and Frederico's) is closest to what I am asking, but everyone is suggesting the same thing: that we will lose domain authority (example measurement: SEOmoz's Open Site Explorer Domain Authority score) if we don't keep those redirects in place (while, of course, avoiding double redirects).
So, thanks again to everyone on this thread.
If anyone has a differing opinion, I'd love to hear it, but this is pretty much what I expected: everyone's best educated assessment is that you will lose domain authority when 301 redirects are lifted and broken links are the end result.
-
Great question, Dan. @Jesse, you are on the right track; I think the question was misunderstood.
The question is: if seomoz.org links to Amazon.com/nakulgoyal and that page does not exist, does link juice flow? Think of it like a citation: if seomoz.org mentions amazon.com/nakulgoyal but does not actually include the hyperlink, is there citation flow?
So my question to the folks is: is there citation flow? In my opinion, the answer is yes. Some DA will get passed along. Eventually, the site owner should identify the 404 and set up a 301 redirect from Amazon.com/nakulgoyal to whatever page makes the most sense for the user, in which case there will be proper link juice flow.
So to clarify what I said:
-
Scenario 1:
SiteA.com links to SiteB.com/urldoesnotexist. There is some (maybe close to negligible) domain authority flow from SiteA.com to SiteB.com, sort of like a link citation. There may not be proper link juice flow, because the link is broken.
-
Scenario 2:
SiteA.com links to SiteB.com/urldoesnotexist and this URL is 301 redirected to SiteB.com/urlexists. In this case, there is both an authority flow and a link juice flow from SiteA.com to SiteB.com/urlexists.
**That's my opinion. Think about it: the 301 redirect from /urldoesnotexist to /urlexists might get added a year from now and might be mistakenly removed at some point, temporarily. There's going to be an effect in both cases. So in my opinion, the crux is: watch your 404s and redirect them when it makes sense for the user. That way you have a good user experience and the link juice flows where it should.**
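To put those two scenarios into practice, a rough sketch like this (assuming the third-party `requests` library and a hypothetical list of URLs that external sites link to) could check which scenario each inbound-link target currently falls into:

```python
# Sketch: classify inbound-link targets into the two scenarios by HTTP status.
# Assumes the third-party "requests" library; the URL list is hypothetical.
import requests

INBOUND_TARGETS = [
    "https://www.example.com/old-product-1",  # hypothetical URLs
    "https://www.example.com/old-product-2",
]

for url in INBOUND_TARGETS:
    # Don't follow redirects automatically, so we can see the raw status code.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print(f"{url} -> Scenario 2: permanently redirected to {resp.headers.get('Location')}")
    elif resp.status_code == 404:
        print(f"{url} -> Scenario 1: broken link, consider adding a 301")
    else:
        print(f"{url} -> status {resp.status_code}")
```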
-
Ideally you want to keep the number of 404 pages low, because a 404 tells the search engine that the page is a dead end. Ask any SEO: it's best to keep the number of 404s as low as possible.
Link equity tells Google why to rank a page or give the root domain more authority. However, Google does not want users to end up on dead pages, so 404s will not help the site; they will hurt it. My recommendation is to create a sitemap containing the pages you want the spiders to index and submit it to Google WMT.
Limit the 404s as much as possible and try to 301 them, where possible, to a relevant page (from a user's perspective).
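On the sitemap point, here is a bare-bones sketch (the URL list is a hypothetical placeholder) of generating a minimal XML sitemap that could then be submitted in Google WMT:

```python
# Sketch: write a minimal XML sitemap listing only the pages you want indexed.
# The URL list below is a hypothetical placeholder.
from xml.sax.saxutils import escape

INDEXABLE_URLS = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/product/blue-widget",
]

def build_sitemap(urls):
    """Return a minimal <urlset> document as a string."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(INDEXABLE_URLS))
```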
-
I think, and correct me if I'm wrong, Dan, you guys are misunderstanding the question.
He means that if you do actually create a 404 page for all your broken links to land on, will the juice pass from there to your domain (which houses the 404 page) and on to whatever internal links you've built into said 404 page?
The answer, I think, is no. The reason is that 404 is a status code returned before the 404 page is even produced, and link juice can only pass through links that resolve (200) or through redirects (301).
Again... I THINK.
Was this more what you were asking?
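That status-code point is easy to check directly. A quick sketch (assuming the third-party `requests` library; the URL is a hypothetical example) shows that even a nicely designed custom 404 page still answers with a 404 status code, which is what the crawler acts on rather than the rendered HTML:

```python
# Sketch: show that a custom 404 page still answers with a 404 status code.
# The URL is a hypothetical example; swap in a URL on your own site.
import requests

resp = requests.get("https://www.example.com/this-page-does-not-exist", timeout=10)

print("HTTP status:", resp.status_code)               # expected: 404
print("Bytes of HTML returned:", len(resp.content))   # the "nice" 404 page body
# A crawler sees the 404 status first; the HTML body is for human visitors.
```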
-
Equity is passed to a page that does not exist (a 404), and therefore that equity is lost.
-
Thanks, Bryan. This doesn't really answer the exact question, though: is link equity still passed (and domain authority preserved) by broken links that produce 404 error pages?
-
No, they don't. Search engine spiders follow the link just as a user would. If the pages no longer exist and you cannot forward the user to a better page, then create a good 404 page that will keep users engaged.
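If you do end up serving 404s, here is a small sketch (using Flask purely as an illustration, not necessarily your stack) of a custom 404 handler that keeps the correct 404 status code while giving visitors somewhere useful to go:

```python
# Sketch: a custom 404 page that still returns the correct 404 status code.
# Flask is used purely as an illustration; any framework works the same way.
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    html = """
    <h1>Sorry, that page is gone.</h1>
    <p>Try one of these instead:</p>
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/bestsellers">Best sellers</a></li>
      <li><a href="/search">Search the catalog</a></li>
    </ul>
    """
    # Returning 404 (not 200) keeps this from becoming a "soft 404".
    return html, 404

if __name__ == "__main__":
    app.run()
```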