Do 404 pages pass link juice? And best practices...
-
Last year Google said bad links to 404 pages wouldn't hurt your site. Could that still be the case in light of recent Google updates aimed at combating spammy links and negative SEO?
Can links to 404 pages benefit a website and pass link juice? I'd assume that, at the very least, any link juice would pass through the links FROM the 404 page?
Many websites have great 404 pages that get linked to: http://www.opensiteexplorer.org/links?site=http%3A%2F%2Fretardzone.com%2F404 - that was the first of the four I checked from the "60 Really Cool... 404 Pages" list that actually returned a 404 HTTP status! (Apologies if you find the word 'retard' offensive.) According to Open Site Explorer it has decent Page Authority and a decent number of backlinks - but it doesn't show in Google's SERPs.
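(If you want to check this yourself, a quick header request shows the status code; the URL below is just a placeholder:)

    # Fetch only the response headers and print the status line
    curl -sI http://www.example.com/some-missing-page | head -n 1
    # e.g. "HTTP/1.1 404 Not Found"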
I'd never do it, but if you have a particularly well-linked-to 404 page, is there an argument for giving it a 200 OK status?
Finally, what are the best practices regarding 404s and the URL shown in the address bar?
For example, if www.examplesite.com/3rwdfs returns a 404 error, should I redirect it to www.examplesite.com/404 or leave it as is? Redirecting to www.examplesite.com/404 might not be user-friendly, as people won't be able to correct the URL in the address bar. But if I have a great 404 page that people link to, I don't want links going to loads of random pages, do I? Is either way considered best practice?
If I did a 301 redirect, I guess it would send the wrong signal to the crawlers? Should I use a 302 redirect, or even a 304 Not Modified response?
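(To make the "leave it as is" option concrete, it would look roughly like this nginx sketch - the server software and page path are just for illustration:)

    # Inside the server block: render the custom 404 page's HTML
    # for any missing URL, but keep the requested URL in the
    # address bar and return a real 404 status - no redirect.
    error_page 404 /404.html;

    location = /404.html {
        internal;  # used only for error handling, not directly requestable
    }

That way visitors can still correct the URL in the address bar, and crawlers see an honest 404.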
-
This is a fascinating question.
Regarding your question about 404 pages getting a 200 status: Google doesn't index 404 pages, and unindexed pages don't pass on link juice. However, as you say, some people and sites do link to 404 pages, so were those pages ever to go live, you'd imagine they would carry some sort of strength/authority.
But how could you practically accomplish this? If you make the 404 page return a 200, you no longer have a 404 page for your website, which could be very bad indeed. So you'd probably want to substitute a new, fresh 404 page in its place. But if that new page sits as the 404 page and gets marked as a 404, wouldn't the links become void again?
And if you then moved the old 404 page to a new URL, it would lose the links that once pointed to it.
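If you did want to try it, the mechanics might look something like this (an nginx sketch, with the paths purely illustrative): keep the well-linked /404 URL live as an ordinary 200 page, and wire genuine errors to a separate internal template.

    # Hypothetical setup: /404 stays live as a normal page and
    # returns 200, so any links pointing at it keep resolving.
    # Genuine missing URLs are handled by a separate template.
    error_page 404 /not-found.html;

    location = /not-found.html {
        internal;  # served only for real 404s, never directly requested
    }
    # /404 needs no special rule - it is now just a regular page.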
The Hongkiat page (the "60 Really Cool... 404 Pages" roundup) is a really clever idea, as it gathers all those pages into one shareable hub, which of course then collects all the links and strength itself.
Related Questions
-
Which is the best option for these pages?
Hi Guys, We have product pages on our site which have duplicate content, and the search volume for people searching for these products is very, very small. Also, if we add unique content we could face keyword cannibalisation issues with category/sub-category pages. Based on proper SEO best practice, we should add rel=canonical tags from these product pages to the next most relevant page.
Pros: we can rank for product-oriented keywords, though search volume is very small; any link equity passed on via the rel=canonical tag would be very small, as these pages barely get any links.
Cons: time and effort involved in adding rel=canonical tags; even if we do add them, Google might not deem them relevant and could ignore them, leaving duplicate content issues; the time and effort involved in making all the content unique isn't really worth it given the minimal searchers, and if we do make it unique we face keyword cannibalisation issues.
What do you think would be the optimal solution? I'm thinking of just implementing a rel=canonical tag across all these product-based pages. Keen to hear thoughts? Cheers.
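(For reference, such a tag just sits in the <head> of each near-duplicate product page and points at the page you want to consolidate to; the URL here is made up:)

    <!-- In the <head> of the near-duplicate product page; example URL -->
    <link rel="canonical" href="https://www.example.com/category/widgets/" />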
-
Does Navigation Bar have an effect on the link juice and the number of internal links?
Hi Moz community, I am getting the "Avoid Too Many Internal Links" error from Moz for most of my pages, and Google has cited 100 internal links as the maximum. However, most of my pages can't have fewer than 100 internal links, since it is a commercial website and there are many categories I have to show to my visitors via the drop-down navigation bar. Without counting the links in the navigation bar, the number of internal links is below 100. I am wondering if the navigation bar links affect the link juice and are counted as internal links by Google. The same question also applies to the links in the footer. Additionally, what about the products? I have hundreds of products in the category pages, and even though I use pagination I still have many links on those pages (probably more than 100, without even counting the navigation bar links). Does Google count the product links as internal links, and what is their effect on the link juice? Here is the website if you want to take a look: http://www.goldstore.com.tr Thank you for your answers.
-
Links on page
Hi, I have a web page which lists about 50-60 products, each linking out either to a PDF about the product or to the manufacturer's website page containing the product detail. The site is non-e-commerce; is this site/page likely to get hit by Penguin? Would it be better to create a separate page for each product/manufacturer group, i.e. 5 or 6 pages, each linking out to the PDFs etc.?
-
Robots.txt - blocking JavaScript and CSS, best practice for Magento
Hi Mozzers, I'm looking for some feedback regarding best practices for setting up the robots.txt file in Magento. I'm concerned we are blocking bots from crawling information essential for ranking. My main concern is with blocking JavaScript and CSS: are you supposed to block JavaScript and CSS or not? You can view our robots.txt file here. Thanks, Blake
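(For what it's worth, Google's general guidance is not to block JS and CSS, since Googlebot needs them to render pages. A sketch of what the relevant robots.txt lines might look like for a Magento 1 store - the Disallow paths are only examples:)

    User-agent: *
    # Keep crawlers out of non-content areas (example paths)
    Disallow: /checkout/
    Disallow: /customer/

    # Keep rendering assets crawlable
    Allow: /skin/
    Allow: /js/
    Allow: /media/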
-
Does link juice pass along the URL or the folders? 10yr old PR 6 site
We have a website that is ~10 yrs old and a PR 6. It has a bunch of legitimate links from .edu and .gov sites. Until now the owner has never blogged or added much content to the site. We have suggested that to grow his traffic organically he should add a WordPress blog and get aggressive with his content. The IT guy is concerned about putting a WordPress blog on the same server as the main site because of security issues with WP; they have a bunch of credit card info on file. So, would it be better to just put the blog on a subdomain like blog.mysite.com, OR host the blog on another server but have the URL structure be mysite.com/blog? I want to pass as much juice as possible. Any ideas?
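(The second option - separate server, same URL structure - is typically done with a reverse proxy in front of the blog. A hedged nginx sketch, with the internal hostname invented for illustration:)

    # On the main site's server: pass /blog through to the
    # separately hosted WordPress box, so visitors and crawlers
    # only ever see mysite.com/blog URLs.
    location /blog/ {
        proxy_pass http://10.0.0.5/blog/;   # hypothetical internal blog host
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }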
-
Moving some content to a new domain - best practices to avoid duplicate content?
Hi, We are setting up a new domain to focus on a specific product and want to use some of the content from the original domain on the new site, removing it from the original. The content is appropriate for the new domain and will be irrelevant to the original domain, and we want to avoid creating completely new content. There will be a link between the two domains. What is the best practice here to avoid duplicate content and a potential Panda penalty?
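(The usual approach is to 301-redirect each moved URL on the old domain to its new home, so users, crawlers, and link equity all follow the content. An illustrative Apache .htaccess sketch - the path and domain are made up:)

    # In the old domain's .htaccess: permanently redirect the
    # moved content to the new domain (example path)
    RewriteEngine On
    RewriteRule ^product-line/(.*)$ https://www.newdomain.com/product-line/$1 [R=301,L]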
-
Too many links on home page?
I run a forum that currently has 351 links on its home page (and on p. 1, 2, etc.) due to all the tags that appear under feed items. Is simply nofollowing all of these tag links a viable way to get this number down? How else can I reduce the number of links Google records, or is it not something I should be especially worried about?
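(If you did go the nofollow route, it would just be an attribute on each tag link; the markup below is generic, not your forum software's actual template:)

    <!-- A tag link marked nofollow (generic example markup) -->
    <a href="/tags/seo" rel="nofollow">seo</a>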
-
Use rel=canonical to save otherwise squandered link juice?
Oftentimes my site has content which I'm not really interested in having included in search engine results. Examples might be a "view cart" or "checkout" page, or old products in the catalog that are no longer available in our system. In the past, I'd blocked those pages from being indexed by using robots.txt or nofollowed links. However, it seems like there is potential link juice being lost by removing these from search engine indexes. What if, instead of keeping these pages out of the index completely, I use a rel=canonical tag to reference the home page (http://www.mydomain.com) of the business? That way, even if the pages I don't care about accumulate a few links around the Internet, I'll be capturing the link juice behind the scenes without impacting the customer experience as they browse our site. Is there any downside to doing this, or am I missing any potential reasons why this wouldn't work as expected?