Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Do 404s really 'lose' link juice?
-
It doesn't make sense to me that a 404 causes a loss in link juice, although that is what I've read. What if you have a page that is legitimate (think of a merchant-oriented page where you sell an item for a given merchant), and then the merchant closes their doors? It makes little sense five years later to still have their merchant page, so why would removing it from your site in any way hurt your site? I could redirect forever, but that makes little sense either. What makes sense to me is keeping the page for a while with an explanation and options for 'similar' products, and then eventually serving a 404. I would think the page eventually dropping out of the index actually REDUCES the overall link juice (i.e. fewer pages), so there is no harm in using a 404 this way. It's also a way to keep the site from just getting bigger and bigger, with more and more 'bad' user experiences over time.
Am I looking at it wrong?
PS: I've included this in 'link building' because it's related in a sense: link 'paring'.
-
Thanks Amelia!
-
I may be being pedantic here, but I think the correct status code is 410, not 404, if the page has gone for good and you don't have a relevant place to redirect traffic to, as per the scenario described.
I believe that if Google finds a 410 page, it will be removed from the index, whereas because 404 means 'file not found' the page may stay in the index, potentially giving the bad user experience outlined by Matt Williamson.
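To make the distinction concrete, here's a minimal sketch (the paths and lookup tables are hypothetical, not any real site's setup) of how a server-side handler might choose between the two statuses for a retired merchant page versus a URL that never existed:

```python
# Hypothetical sketch: choosing between 404 and 410.
# RETIRED_MERCHANTS and LIVE_PAGES are assumed lookups for illustration.
from http import HTTPStatus

RETIRED_MERCHANTS = {"/merchants/acme-closed"}  # pages removed on purpose
LIVE_PAGES = {"/merchants/active-shop"}

def status_for(path: str) -> int:
    """Return 200 for live pages, 410 for deliberately retired ones,
    and 404 for paths that never existed."""
    if path in LIVE_PAGES:
        return HTTPStatus.OK         # 200
    if path in RETIRED_MERCHANTS:
        return HTTPStatus.GONE       # 410: gone for good, signal de-indexing
    return HTTPStatus.NOT_FOUND      # 404: unknown, may have been a typo

print(int(status_for("/merchants/acme-closed")))  # 410
```

The idea is that 410 is an explicit "removed on purpose" signal, while 404 leaves open the possibility that the page might come back.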
However, I would always redirect if you can; even if you just send traffic to the homepage, it has to be a better user experience than sending people to a 404 page. I think, anyway!
More info here: http://moz.com/learn/seo/http-status-codes
You mention a concern over too many redirects; I think this page may help allay your fears: http://www.stateofdigital.com/matt-cutts-there-is-no-limit-to-direct-301-redirects-there-is-on-chains/
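One practical takeaway from that article is avoiding redirect *chains* (old → newer → newest) in favour of direct redirects. As a rough illustration (the URLs are made up), here's a sketch that flattens a redirect map so every retired URL points straight at its final destination:

```python
# Hypothetical sketch: flattening a 301 redirect map so every old URL
# redirects directly to its final destination, avoiding chains.

def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Resolve each source URL to the end of its redirect chain.
    A 'seen' set guards against accidental redirect loops."""
    flat = {}
    for src in redirects:
        target, seen = redirects[src], {src}
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flat[src] = target
    return flat

chain = {"/old": "/newer", "/newer": "/newest"}
print(flatten_redirects(chain))  # {'/old': '/newest', '/newer': '/newest'}
```

Running something like this over your redirect rules whenever a target URL itself moves keeps every hop a single 301.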
Thanks,
Amelia
-
Matt, thanks, good points for sure. My concern is that something like 50% of new businesses close their doors within 5 years, so the list of redirected URLs will just keep getting bigger over time. Is that a concern? I guess over time fewer people will link to the defunct businesses, but I will still have to track them. Maybe at some point, when the number of links to a page is small, it would make sense to 404 it? Of course, I'd still need to track which ones to 404, so I'm now wondering whether a 404 ever makes sense on previously legitimate pages.
Just to be clear: redirecting does remove the old URL from the index, right?
-
404s can lose link juice, and they cause the most issues when a page that had lots of links pointing to it (passing authority) becomes a 404. As the page no longer exists, the authority being passed to it from those links will be lost when Google eventually de-indexes the page. Remember too that the page is likely still in Google's index, and if people click through and it is not found, they are more likely to bounce from your site. You will also lose whatever terms the page was ranking for once it is de-indexed. Redirecting the page to its new location, or to a similar/relevant page, will help keep most of the authority it has earned, helping your rankings and keeping human visitors happy.
You also need to think about this from a crawl point of view: lots of 404s don't make your site very crawl-friendly, as Googlebot wastes time trying to fetch pages that don't exist. Ultimately, making sure you don't have 404 pages, and keeping on top of redirecting them, is important, particularly if the page had authority. A big hint to the importance is the fact that Google reports these crawl issues in Google Webmaster Tools so that you can monitor and fix them.
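Beyond Webmaster Tools, you can catch these before Googlebot does. A minimal sketch of an internal-link audit (the page URLs are invented for illustration) might check every internal link against the set of URLs you know are live:

```python
# Hypothetical sketch: auditing internal links against known live URLs
# to find links that would return a 404 before a crawler hits them.

def find_broken_links(internal_links: dict[str, list[str]],
                      live_urls: set[str]) -> dict[str, list[str]]:
    """Map each page to the links on it that point at non-existent URLs."""
    broken = {}
    for page, links in internal_links.items():
        missing = [u for u in links if u not in live_urls]
        if missing:
            broken[page] = missing
    return broken

site = {"/home": ["/shop", "/merchants/acme"], "/shop": ["/home"]}
live = {"/home", "/shop"}
print(find_broken_links(site, live))  # {'/home': ['/merchants/acme']}
```

In practice you'd feed this from a crawl of your own site, then either fix the links or add 301s for the missing targets.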
On a side note, I have seen cases where sites ended up with a lot of 404s after a significant change of URL structure and didn't do any redirects; they lost the majority of their organic rankings and traffic!