404 Best Practices
-
Hello All,
So about two months ago, there was a massive spike in the number of crawl errors on my site according to Google Webmaster Tools.
I handled this by sending my webmaster a list of the broken pages along with working pages that he should 301 redirect them to.
Admittedly, when I looked back a couple of weeks later, the number had only gone down slightly, so I sent him another list (I didn't realize that you could 'Mark as fixed' in Webmaster Tools).
So when I sent him more, he 301 redirected them again (with many duplicates), as he was told, without really digging any deeper.
Today, when I talked about more redirects, he suggested that 404s do have a place: if the pages genuinely don't exist anymore, then a ton of 301 redirects may not be the answer.
So my two questions are:
1. Should I continue to relentlessly try to get rid of all 404s on my site, and if so, do I have to be careful not to be lazy and just send most of them to the homepage?
2. Are there any tools or really effective ways to remove duplicate 301 redirect records from my .htaccess file (because at this point its size could very well be slowing down my site)?
Any help would be appreciated. Thanks!
-
Thanks for the question.
Just to add to what the other guys have said, here is a helpful article from Google which explains a bit more about their stance on 404s.
In general, you should 301 redirect 404 errors to other pages when it makes sense for the user. For example, if a 404 page was an article about Belgian beers, you could redirect it to another article about Belgian beers so that the user lands on a relevant page. You should definitely avoid mass-redirecting links to your homepage, because Google can treat these as soft 404s - there is a video here on the topic:
http://www.davidsottimano.com/internal-301-homepage-treated-404-google/
If a page 404s and can't be redirected to a relevant page, then it's best to leave it as a 404 and let Google keep crawling it.
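To make the two cases concrete, here is what that looks like in an .htaccess file (a sketch assuming Apache with mod_alias; the paths are hypothetical):

```apache
# Removed article with a relevant replacement: 301 it to the
# closest matching live page, not the homepage.
Redirect 301 /articles/belgian-beers-2011 /articles/belgian-beers

# A removed page with no relevant replacement gets no rule at all -
# it simply keeps returning 404, which is the correct signal to Google.
```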
I also agree with the other guys on creating a custom 404 page which will provide a good user experience if they happen to land on it.
-
As far as Google Webmaster Tools is concerned, it usually takes a while to update, especially when it comes to 404 pages, so relying on that one tool alone would be dangerous. I prefer using Screaming Frog to see the current situation of the website, since its audit is based on the site's current condition.
I do agree with Jesse that 404s have a place and can exist, but the problem starts when a 404 page carries a good amount of link juice or has traffic landing on it. To preserve that link juice and potential traffic, you have to send visitors to a page that actually exists and can serve them well, which is why we recommend a 301 to a page that fulfils the user's expectation.
I would also suggest creating a custom 404 page that helps people move to other areas of the website.
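For what it's worth, wiring up a custom 404 page in .htaccess is one directive (assuming Apache; the page path is hypothetical):

```apache
# Serve a branded error page while still returning the 404 status code.
# Use a local path, not a full URL - a full URL turns the error into a
# redirect and the status code stops being 404.
ErrorDocument 404 /custom-404.html
```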
Hope this helps
-
The most problematic case is 404 pages that you link to internally and/or that appear in your sitemap. Google reporting that an external site links to a 404 page isn't as damaging. If it truly is an error, then a 404 is okay to have, though not preferable. If the site linking to you made a mistake, it would be preferable to seek out its owner and ask them to fix the link. However, if you can't reach the owner and the referring website is a valuable traffic source and/or that link juice is important, then yes, add a 301 redirect.
-
It's true that 404s have a place and can exist just fine so long as people aren't landing on them. Think of it from a usability standpoint; if the link exists somewhere on a foreign domain and is pointing to your site, you should 301 it. That way if somebody clicks it they don't get dropped into 404-hell.
Otherwise don't worry about it. Google will eventually stop looking at pages that 404. They're only notifying you in case it was a mistake and you want to capture a potential audience that is coming through that link.
Best practice is to 301 broken links if they receive traffic or have some link juice you aren't interested in losing. Also be sure to create a custom 404 page for any that you miss, so that at least the person arrives on a page branded by you, with easily accessible links to find what they might be looking for.
Not sure about multiple 301s in an .htaccess file (I'm more accustomed to IIS servers), but my guess is that it shouldn't matter. They aren't pointing to different places and crossing back to each other, so you're in no danger of a redirect loop. I would imagine the spiders will follow one of them and ignore the others.
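On the duplicate-records question: you don't need a special tool to strip exact duplicate rules from an .htaccess file; a one-line awk filter keeps only the first occurrence of each line. A minimal sketch (the sample rules are hypothetical, and you'd back up the real file before overwriting it):

```shell
# Sample .htaccess with duplicated redirect rules (hypothetical paths).
cat > htaccess.sample <<'EOF'
Redirect 301 /old-page /new-page
Redirect 301 /old-page /new-page
Redirect 301 /belgian-beers-2011 /belgian-beers
EOF

# awk prints a line only the first time it is seen, preserving order.
awk '!seen[$0]++' htaccess.sample > htaccess.deduped
cat htaccess.deduped
```

On a real site you would run the filter against a backup copy and review the result before replacing the live file.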
Hope this helps and good luck!