Do 404 Pages from Broken Links Still Pass Link Equity?
-
Hi everyone, I've searched the Q&A section and Google for about the past hour and couldn't find a clear answer on this.
When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost?
We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the low-traffic pages; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the site ranks overall in Google, especially for pages with no links pointing to them. Amazon is a perfect example: thousands of pages with no external links that rank #1 in Google for their product names.
Anyone have a clear answer? Thanks!
-
First off, thanks to everyone for your replies.
I'm well versed in the best practices for 301 redirects, sitemaps, and so on. In other words, I fully understand the optimal way to handle this. But this is one of those situations where so many redirects are involved (thousands) for a large site that I want to make sure what we are doing is fully worth the development time.
We are migrating a large website that was already migrated to a different CMS several years ago. There are thousands of legacy 301 redirects already in place for the current site, and many of the pages being REDIRECTED TO (from the old URL versions) receive very little traffic, if any. We need to decide if the work of redirecting them is worth it.
I'm not as worried about broken links for pages that don't get any traffic (although we ideally want 0 broken links). What I am most worried about, however, is losing domain authority and the whole site potentially ranking a little bit lower overall as a result.
Nakul's and Frederico's responses are closest to what I am asking, but everyone is suggesting the same thing: that we will lose domain authority (example metric: SEOmoz's Open Site Explorer domain authority score) if we don't keep those redirects in place (while, of course, avoiding double redirects).
So, thanks again to everyone on this thread.
If anyone has a differing opinion, I'd love to hear it, but this is pretty much what I expected: everyone's best educated assessment is that you lose domain authority when 301 redirects are lifted and broken links are the result.
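On the practical side, if we do keep thousands of legacy redirects, one chore is collapsing chains so no URL 301s twice. A minimal sketch of that step, assuming the redirect map is available as a plain Python dict (hypothetical URLs):

```python
def flatten_redirects(redirects):
    """Resolve each legacy URL to its final destination so every
    301 points directly at a live page (no double redirects)."""
    flat = {}
    for src in redirects:
        seen = {src}
        dst = redirects[src]
        while dst in redirects:  # follow the chain to its end
            if redirects[dst] in seen:
                raise ValueError(f"redirect loop at {dst}")
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

# Two migrations ago /old-shoe moved to /shoe, which later moved to /shoes:
legacy = {"/old-shoe": "/shoe", "/shoe": "/shoes"}
print(flatten_redirects(legacy))  # both entries now point straight at /shoes
```

The flattened map is what you'd actually load into the web server, so a visitor (or crawler) hitting the oldest URL gets exactly one 301.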
-
Great question, Dan. @Jesse, you are on the right track; I think the question was misunderstood.
The question is: if seomoz.org links to Amazon.com/nakulgoyal and that page does not exist, is there link juice flow? Think of it like a citation. If seomoz.org mentions amazon.com/nakulgoyal but does not actually hyperlink it, is there citation flow?
So my question to the folks is: is there citation flow? In my opinion, the answer is yes. There's some DA that will get passed along. Eventually, the site owner should identify the 404 and set up a 301 redirect from Amazon.com/nakulgoyal to whatever page makes the most sense for the user, in which case there will be proper link juice flow.
So to clarify what I said:
-
Scenario 1:
SiteA.com links to SiteB.com/urldoesnotexist: there is some (maybe close to negligible) domain authority flow from SiteA.com to SiteB.com, sort of like a link citation. There may not be proper link juice flow, because the link is broken.
-
Scenario 2:
SiteA.com links to SiteB.com/urldoesnotexist, and this URL is 301 redirected to SiteB.com/urlexists: in this case, there is both authority flow and link juice flow from SiteA.com to SiteB.com/urlexists.
That's my opinion. Think about it: the 301 redirect from /urldoesnotexist to /urlexists might get added a year from now and might be mistakenly removed at some point. There's going to be an effect in both cases. So in my opinion, the crux is: watch your 404s and redirect them when, and where, it makes sense for the user. That way you have a good user experience, and the link juice can flow where it should.
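The two scenarios above boil down to which HTTP status the inbound link resolves to. Here's a toy sketch of that interpretation in Python; the labels are my reading of the thread, not an official Google rule:

```python
def equity_outcome(status_code):
    """Rough reading of the two scenarios: what happens to the
    equity an inbound link sends, keyed on the HTTP status the
    linked URL returns (an interpretation, not a Google spec)."""
    if status_code == 200:
        return "full link juice flow to the page"
    if status_code in (301, 308):
        return "equity consolidates on the redirect target"
    if status_code in (404, 410):
        return "citation-like signal at best; page equity lost"
    return "inspect manually"

print(equity_outcome(301))  # Scenario 2: redirect in place
print(equity_outcome(404))  # Scenario 1: broken link
```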
-
-
Ideally you want to keep the number of 404 pages low, because a 404 tells the search engine that the page is a dead end. Ask any SEO: it's best to keep 404s to a minimum.
Link equity gives Google a reason to rank a page or grant the root domain more authority. However, Google does not want users to end up on dead pages, so 404s will not help the site; rather, they will hurt it. My recommendation is to create a sitemap of the pages you want the spiders to index and submit it to Google Webmaster Tools.
Limit the 404s as much as possible, and try to 301 them to a page that is relevant from a user's perspective.
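To act on the sitemap suggestion above, you can generate one straight from your list of canonical pages. A minimal sketch using only Python's standard library (hypothetical URLs; real sitemaps often add lastmod and changefreq per the sitemaps.org protocol):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml listing only the pages you
    want crawled, following the sitemaps.org protocol."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url in urls:
        # each page gets a <url><loc>...</loc></url> entry
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/products"]))
```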
-
I think (and correct me if I'm wrong, Dan) you guys are misunderstanding the question.
He means: if you do actually create a 404 page for all your broken links to land on, will the juice pass from there to your domain (housing the 404 page) and on to whatever internal links you've built into said 404 page?
The answer, I think, is no. The reason is that 404 is a status code returned in the response headers before the 404 page itself is rendered; link juice can only pass through working links (200) or redirects (301).
Again... I THINK.
Was this more what you were asking?
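The distinction above (the status code travels in the response headers, separate from whatever HTML the 404 page shows the visitor) is easy to verify locally. A sketch using only Python's standard library, with a throwaway server standing in for the site:

```python
import http.server
import threading
import urllib.error
import urllib.request

class NotFoundHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # A friendly, fully rendered page can still ship with status 404:
        self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Sorry, that page is gone. Try our homepage!</h1>")

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), NotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def fetch(url):
    """Return (status, body), even when the status is an error code."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status, resp.read()
    except urllib.error.HTTPError as err:
        return err.code, err.read()

status, body = fetch(f"http://127.0.0.1:{server.server_port}/missing")
print(status)  # what crawlers key on: the 404 status
print(body)    # what users see: the custom page's HTML
server.shutdown()
```

The crawler's decision is driven by that status line, no matter how helpful the page body is.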
-
Equity passed to a 404 page is pointed at a URL that does not exist; therefore, that equity is lost.
-
Thanks, Bryan. This doesn't really answer the exact question, though: is link equity still passed (and domain authority preserved) by broken links producing 404 Error Pages?
-
No, they don't. Search engine spiders follow a link just as a user would. If the pages no longer exist and you cannot forward the user to a better page, then create a good 404 page that will keep users engaged.