Do 404 pages pass link juice? And best practices...
-
Last year Google said bad links to 404 pages wouldn't hurt your site. Could that still be the case in light of recent Google updates intended to combat spammy links and negative SEO?
Can links to 404 pages benefit a website and pass link juice? I'd assume that, at the very least, any link juice would pass through links FROM the 404 page?
Many websites have great 404 pages that get linked to: http://www.opensiteexplorer.org/links?site=http%3A%2F%2Fretardzone.com%2F404 - that was the first of four I checked from the "60 Really Cool... 404 Pages" list that actually returned the 404 HTTP status (apologies if you find the domain name offensive). According to Open Site Explorer it has decent Page Authority and a fair number of backlinks, but it doesn't show in Google's SERPs.
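If you want to repeat that check at scale, a short script beats clicking through each URL. Here is a minimal sketch using only Python's standard library; the function name and example URL are placeholders, not from any particular tool:

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def http_status(url: str) -> int:
    """Return the HTTP status code for url, including 4xx/5xx codes."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        # urllib raises for 4xx/5xx responses; the code is on the exception
        return err.code

# e.g. http_status("http://example.com/some-missing-page")
# should be 404 on a correctly configured site
```

Link metrics from a tool like Open Site Explorer won't tell you what status the page serves today; only a live request like this will.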
I'd never do it, but if you have a particularly well-linked-to 404 page, is there an argument for giving it a 200 OK status?
Finally, what are the best practices regarding 404s and address bar links?
For example, if www.examplesite.com/3rwdfs returns a 404 error, should I make it redirect to www.examplesite.com/404 or leave it as is? Redirecting to www.examplesite.com/404 might not be user-friendly, as people won't be able to correct the URL in the address bar. But if I have a great 404 page that people link to, I don't want links going to loads of random pages, do I? Is either way considered best practice?
If I did a 301 redirect, I guess it would send the wrong signal to the crawlers? Should I use a 302 redirect instead, or even return 304 Not Modified?
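For what it's worth, the common pattern is none of those: the server answers the unknown URL directly with the custom 404 page's content and a 404 status, with no redirect at all, so the typed URL stays editable in the address bar and crawlers see the right signal. A hedged sketch using Python's standard library (the routing table and page copy are invented for illustration):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a real routing table / CMS lookup
KNOWN_PATHS = {"/": b"<h1>Home</h1>"}

NOT_FOUND_PAGE = b"<h1>Page not found</h1><p>Check the URL or head home.</p>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PATHS:
            body, status = KNOWN_PATHS[self.path], 200
        else:
            # No redirect to /404: the browser keeps the requested URL,
            # and crawlers see the correct 404 status on that URL.
            body, status = NOT_FOUND_PAGE, 404
        self.send_response(status)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run: HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
```

Apache's `ErrorDocument /404.html` and nginx's `error_page 404 /404.html` do the same thing: the error page's content is served internally at the requested URL with a 404 status, not via an external redirect.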
-
This is a fascinating question.
Regarding your question about giving a 404 page a 200 status: Google doesn't index pages that return a 404, and de-indexed pages don't pass on link juice. However, as you say, some people and sites do link to 404 pages, so if such a page ever went live as a normal page, you'd imagine it would carry some sort of strength/authority.
But how could you practically accomplish this? If you make the 404 page return 200 OK, you no longer have a 404 page for your website, which could be very bad indeed. So you'd probably want to substitute a new, fresh 404 page in its place. But if that new page sits as the 404 page and gets marked as a 404, wouldn't the links become void again? And if you then moved the old 404 content to a new URL, it would lose the links that once pointed to it.
The Hongkiat page (the "60 Really Cool... 404 Pages" list) is a really clever idea, as it takes all those 404 pages and makes a shareable hub, which of course then attracts all the links and strength itself.
Related Questions
-
VTEX Infinite Scroll Design: What is On-Page SEO Best Practice?
We are migrating to the VTEX e-commerce platform, which is built on JavaScript, so there are no <a> tags linking product pages together when there is a long list of products. According to the Google Search Console Help document (http://support.google.com/webmasters/answer/9112205), "Google can follow links only if they are an <a> tag with an href attribute." So if there are 1,000 products, JavaScript just executes to deliver more content as you browse through the entire product list. The problem is there is no actual link for crawlers to follow. Has anyone implemented a solution to this or a similar problem?
Intermediate & Advanced SEO | ggarciabisco
What's the best practice for internal links?
Hi, our site is typically set up with a key product (money) page, 6 to 12 cluster pages, and a few associated blog pages. If, for example, the key product was "funeral plans", what percentage of the internal anchor-text links should be an exact match? Will the prominence of those links (e.g. higher up the page) have an impact on the amount of juice flowing? And do links in buttons count in the same way as on-page anchor text, e.g. "compare funeral plans"? Many thanks, Ash
Intermediate & Advanced SEO | AshShep1
Reuse an old juicy URL or create a new one following best practices?
I'm optimizing a site with all-new URLs, categories, titles, and descriptions. All the URLs will change, but the old URLs have a lot of backlinks and SEO juice. Which is better for SEO: 1) change the URLs and 301 redirect traffic to the new pages, or 2) keep the URLs and work only on the new titles, descriptions, etc.? With option 1, I understand I'll lose some SEO juice because of the redirect, but the new URL will be correct. With option 2, everything stays strong except the URL, which will make less sense than with option 1 - it won't exactly match the product name/title; it's a reuse of a strong URL.
Intermediate & Advanced SEO | Tiedemann_Anselm
What's the best way to check Google search results for all pages NOT linking to a domain?
I need to do a bit of link reclamation for some brand terms. From the little bit of searching I've done, there appear to be several thousand pages that meet the criteria, but I can already tell it's going to be impossible or extremely inefficient to save them all manually. Ideally, I need an exported list of all the pages mentioning brand terms not linking to my domain, and then I'll import them into BuzzStream for a link campaign. Anybody have any ideas about how to do that? Thanks! Jon
Intermediate & Advanced SEO | JonMorrow
What things, that we might overlook, help retain link juice on the site?
Since subscribing to Moz, I have been focusing a lot on some of the more technical aspects of SEO. The thing I currently find interesting is stopping link-juice leaks. Here is a selection of some of the things I have done:
- Cloaked my affiliate links - see http://yoast.com/cloak-affiliate-links/
- Removed some HTML-coded social share links within the theme and replaced them with a JavaScript plugin (http://wordpress.org/plugins/flare/)
- Used the Moz toolbar to view the site as Google, to see what Google is seeing
- Removed some duplicated meta links (author etc.) at the bottom of blog posts
Now, I don't intend to go over the top with this, as links to social accounts on each page are there to encourage engagement, but are there any things you may have come across, or tips people may have overlooked? For example, some things that might be interesting to discuss: Are too many tags and categories bad? Do you index your tag and date-archive pages? Does it matter?
Intermediate & Advanced SEO | Jonathan1979
How many links home on a page?
We are planning a mega menu which will have around 300 links, and a mega slider which will have around 175 links if our developer has their way. In all, I could be looking at over 500 links from the home page. The mega menu will flatten the site's link structure, but I am worried about the slider on the home page, which is our 4th most-visited page behind our 3 core category pages. What are your thoughts?
Intermediate & Advanced SEO | robertrRSwalters
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income (about 3%) lands on internal search pages, because Google has indexed them. I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google's guidelines say: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769)
- They are a bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- We got the Webmaster notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
To land page or not to land page
Hey all, I want to increase my site's rankings for a variety of keywords within sub-categories, but I'm unsure where to spend the SEO time. Here's an example of the website's page structure:
General Home Page
  > Sub Category 1 Home Page
    > Searching / Results pages
      - Sub Category 1
      - Sub Category 2
      - Sub Category 3
      - Sub Category 4
  > Sub Category 2 Home Page
    > Searching / Results pages
      - Sub Category 1
      - Sub Category 2
      - Sub Category 3
      - Sub Category 4
We've newly introduced the Sub Category Home Pages, and I was wondering whether SEO is best performed on these pages, or whether landing pages should be built - one for each of the 4 sub-categories in each section. Those landing pages would have links to the "Searching / Results pages" for that sub-category. Thanks!
Intermediate & Advanced SEO | DPSSeomonkey