Do 404 Pages from Broken Links Still Pass Link Equity?
-
Hi everyone, I've searched the Q&A section and Google for about the past hour and couldn't find a clear answer on this.
When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost?
We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the low-traffic pages; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the site ranks overall in Google, especially for pages with no links pointing to them. Amazon is a perfect example: thousands of pages with no external links that rank #1 in Google for their product name.
Anyone have a clear answer? Thanks!
-
First off, thanks, everyone, for your replies.
I'm well versed in the best practices for 301 redirects, sitemaps, and so on. In other words, I fully know the optimal way to handle this. But this is one of those situations where so many redirects are involved (thousands) for a large site that I want to make sure what we are doing is fully worth the development time.
We are migrating a large website that was already migrated to a different CMS several years ago. There are thousands of legacy 301 redirects already in place for the current site, and many of the pages being REDIRECTED TO (from the old URL versions) receive little, if any, traffic. We need to decide if the work of redirecting them is worth it.
I'm not as worried about broken links for pages that don't get any traffic (although we ideally want 0 broken links). What I am most worried about, however, is losing domain authority and the whole site potentially ranking a little bit lower overall as a result.
Nakul's response (and Frederico's) is closest to what I am asking, but everyone is suggesting the same thing: that we will lose domain authority (example measurement: SEOmoz's Open Site Explorer domain authority score) if we don't keep those redirects in place (while, of course, avoiding double redirects).
So, thanks again to everyone on this thread. If anyone has a differing opinion, I'd love to hear it, but this is pretty much what I expected: everyone's best educated assessment is that you will lose domain authority when 301 redirects are lifted and broken links are the end result.
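To make the "is it worth the dev time" call concrete, here's a rough sketch of how you might triage which legacy redirects are worth keeping. The inputs are hypothetical: a redirect map and a per-URL external backlink count, such as you might pull from a link-index export. This is just one way to slice it, not a prescribed method.

```python
def triage_redirects(redirect_map, backlink_counts, min_links=1):
    """Split redirects into 'keep' (the old URL still earns external
    links, so dropping the rule would 404 real link equity) and
    'review' (no known links; keeping the rule is optional)."""
    keep, review = {}, {}
    for old_url, new_url in redirect_map.items():
        if backlink_counts.get(old_url, 0) >= min_links:
            keep[old_url] = new_url
        else:
            review[old_url] = new_url
    return keep, review

# Made-up URLs for illustration only.
redirects = {
    "/old-category/widget-a": "/widgets/a",
    "/old-category/widget-b": "/widgets/b",
}
links = {"/old-category/widget-a": 12}  # widget-b has no known backlinks

keep, review = triage_redirects(redirects, links)
```

Anything in `review` is a candidate for dropping; anything in `keep` is a rule you'd want to carry into the new migration.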
-
Great question, Dan. @Jesse, you are on the right track. I think the question was misunderstood.
The question is: if seomoz.org links to Amazon.com/nakulgoyal and that page does not exist, is there link juice flow? Think about it like a citation. If seomoz.org mentions amazon.com/nakulgoyal but does not actually have the hyperlink, is there citation flow?
So my question to the folks is: is there citation flow? In my opinion, the answer is yes. There's some DA that will get passed along. Eventually, the site owner should identify the 404 and set up a 301 redirect from Amazon.com/nakulgoyal to whatever page makes the most sense for the user, in which case there will be proper link juice flow.
So to clarify what I said:
-
Scenario 1:
SiteA.com links to SiteB.com/urldoesnotexist - There is some (maybe close to negligible) domain authority flow from SiteA.com to SiteB.com, sort of like a link citation. There may not be proper link juice flow, because the link is broken.
Scenario 2:
SiteA.com links to SiteB.com/urldoesnotexist and this URL is 301 redirected to SiteB.com/urlexists - In this case, there is both an authority flow and a link juice flow from SiteA.com to SiteB.com/urlexists.
**That's my opinion. Think about it: the 301 redirect from /urldoesnotexist to /urlexists might get added a year from now and might be mistakenly removed at some point temporarily. There's going to be an effect in both cases. So in my opinion, the crux is: watch your 404s and redirect them when and where it makes sense for the user. That way you have a good user experience and the link juice can flow where it should.**
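One way to honor the "avoid double redirects" caveat mentioned above is to flatten every chain so each old URL points straight at its final destination. A minimal sketch, assuming a simple old-URL-to-new-URL map (all URLs hypothetical):

```python
def flatten_redirects(redirect_map, max_hops=10):
    """Rewrite each redirect to point at the final destination so no
    visitor (or crawler) ever passes through two 301s in a row.
    max_hops guards against accidental redirect loops."""
    flat = {}
    for src in redirect_map:
        target = redirect_map[src]
        hops = 0
        while target in redirect_map and hops < max_hops:
            target = redirect_map[target]
            hops += 1
        flat[src] = target
    return flat

chain = {
    "/v1/product": "/v2/product",  # rule from the first migration
    "/v2/product": "/v3/product",  # rule from the second migration
}
flat = flatten_redirects(chain)
```

After flattening, both legacy URLs 301 directly to /v3/product in a single hop.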
-
-
Ideally you want to keep the number of 404 pages low, because a 404 tells the search engine that the page is a dead end. Ask any SEO: it's best to keep the number of 404s as low as possible.
Link equity gives Google a reason to rank a page or to give the root domain more authority. However, Google does not want users to end up on dead pages, so broken links will not help the site; rather, they will hurt it. My recommendation is to create a sitemap containing the pages you want the spiders to index and submit it to Google WMT.
Limit the 404s as much as possible and try to 301 them to a relevant page (from a user's perspective) where possible.
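As a sketch of that last step, here's one way to turn an old-to-new URL mapping into Apache mod_alias `Redirect 301` rules. The paths are hypothetical, and the same mapping could just as easily drive nginx or CMS-level redirects instead:

```python
def htaccess_rules(redirect_map):
    """Emit one 'Redirect 301' line per old URL, in Apache
    mod_alias syntax, sorted for a stable diff-friendly file."""
    return "\n".join(
        f"Redirect 301 {old} {new}"
        for old, new in sorted(redirect_map.items())
    )

rules = htaccess_rules({"/discontinued/blue-widget": "/widgets/blue"})
```

The output can be pasted into an .htaccess file or the server config.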
-
I think, and correct me if I'm wrong, Dan, you guys are misunderstanding the question.
He means that if you do actually create a 404 page for all your broken links to land on, will the juice pass from there to your domain (housing the 404 page) and on to whatever internal links you've built into said 404 page.
The answer, I think, is no. The reason is that 404 is a status code returned before the 404 page is even rendered. Link juice can pass through either links (200) or redirects (301).
Again... I THINK.
Was this more what you were asking?
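The distinction drawn above (equity can pass through a 200 link or a 301 redirect, but stops at a 404) can be expressed as a toy model. To be clear, this encodes the commenter's claim, not anything Google has confirmed:

```python
PASSING_STATUSES = {200, 301}

def equity_reaches_destination(status_chain):
    """Toy model of the claim above: equity flows while each hop
    returns 200 or 301, and stops dead at the first 404 (or any
    other status)."""
    return all(status in PASSING_STATUSES for status in status_chain)

live = equity_reaches_destination([301, 200])  # old URL 301s to a live page
broken = equity_reaches_destination([404])     # broken link: flow stops
```

Under this model a redirected URL still passes equity, while a bare 404 does not, which is exactly the scenario split in Nakul's answer.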
-
Equity is passed toward a 404 page, which does not exist; therefore that equity is lost.
-
Thanks, Bryan. This doesn't really answer the exact question, though: is link equity still passed (and domain authority preserved) by broken links producing 404 Error Pages?
-
No, they don't. Search engine spiders follow a link just as a user would. If the pages no longer exist and you cannot forward the user to a better page, then create a good 404 page that will keep users intrigued.