404 Best Practices
-
Hello All,
So about two months ago, there was a massive spike in the number of crawl errors on my site according to Google Webmaster Tools.
I handled this by sending my webmaster a list of the broken pages, along with working pages that he should 301 redirect them to.
Admittedly, when I looked back a couple of weeks later, the number had only gone down slightly, so I sent him another list (I didn't realize that you could 'Mark as fixed' in Webmaster Tools).
So when I sent him more, he 301 redirected them again (with many duplicates), doing as he was told without really digging any deeper.
Today, when I talked about more redirects, he suggested that 404s do have a place - that if they are pages that genuinely don't exist anymore, then a ton of 301 redirects may not be the answer.
So my two questions are:
1. Should I continue to relentlessly try to get rid of all 404s on my site? And if so, do I have to be careful not to be lazy and just send most of them to the homepage?
2. Are there any tools or really effective ways to remove duplicate 301 redirect records from my .htaccess file? (Its size at this point could very well be slowing down my site.)
Any help would be appreciated, thanks.
-
Thanks for the question.
Just to add to what the other guys have said, here is a helpful article from Google which explains a bit more about their stance on 404s.
In general, you should 301 redirect 404 errors to other pages when it makes sense for the user. For example, if a 404 page is an article about Belgian beers, you could redirect it to another article about Belgian beers so that the user lands on a relevant page. You should definitely avoid mass redirecting links to your homepage, because Google can treat these as 404s - there is a video on the topic here:
http://www.davidsottimano.com/internal-301-homepage-treated-404-google/
If a page 404s and can't be redirected to a relevant page, then it's best to leave it as a 404 and let Google keep crawling it.
I also agree with the other guys on creating a custom 404 page which will provide a good user experience if they happen to land on it.
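To make the two suggestions above concrete, here is a minimal .htaccess sketch (Apache; the paths and filenames are purely illustrative, not from the question):

```apache
# Point an old article at a topically relevant replacement,
# not the homepage.
Redirect 301 /articles/belgian-beers-2011 /articles/belgian-beers-guide

# Serve a branded 404 page for anything that can't sensibly be redirected.
ErrorDocument 404 /custom-404.html
```

Everything else is left to return a genuine 404, which is exactly what you want Google to see for pages that no longer exist.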
-
As far as Google Webmaster Tools is concerned, it usually takes a while to update, especially when it comes to 404 pages, so relying on that one tool alone can be misleading. I prefer using Screaming Frog to see the current state of the website, since its audit reflects the site's current condition.
I do agree with Jesse that 404s have a place and can exist, but problems start when a 404 page carries a good amount of link juice or has traffic landing on it. To preserve that link juice and potential traffic, you need to send those visitors to a page that actually exists and fulfills their expectations, which is why we recommend a 301 to a page that can serve them well.
I would also suggest creating a custom 404 page that helps people move to different areas of the website.
Hope this helps
-
The most problematic errors are 404 pages that you link to internally and/or that appear in your sitemap. Google reporting that an external site links to a 404 page isn't as damaging. If it truly is an error, then a 404 is okay to have, though not preferable. If the page linking to you made an error, it would be preferable to seek out the owner of that page and ask them to fix it. However, if you can't reach the owner, and the referring website is a valuable traffic source and/or that link juice is important, then yes, add a 301 redirect.
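If you want to audit the sitemap side of this yourself rather than wait for Webmaster Tools to refresh, a rough sketch (assuming a standard `sitemap.xml` in the current directory and `curl` installed; the filename is illustrative):

```shell
# Pull every <loc> URL out of sitemap.xml and flag the ones
# that currently return a 404.
for url in $(grep -o '<loc>[^<]*' sitemap.xml | sed 's/<loc>//'); do
  code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
  if [ "$code" = "404" ]; then
    echo "404: $url"
  fi
done
```

This is a blunt instrument compared to Screaming Frog, but it gives you an on-demand check straight from the live site.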
-
It's true that 404s have a place and can exist just fine so long as people aren't landing on them. Think of it from a usability standpoint; if the link exists somewhere on a foreign domain and is pointing to your site, you should 301 it. That way if somebody clicks it they don't get dropped into 404-hell.
Otherwise don't worry about it. Google will eventually stop looking at pages that 404. They're only notifying you in case it was a mistake and you want to capture a potential audience that is coming through that link.
Best practice is to 301 broken links if they receive traffic or have some link juice you aren't interested in losing. Also be sure to create a custom 404 page for any you might miss, so that at least the visitor arrives on a page branded by you, with easily accessible links to find what they were looking for.
Not sure about multiple 301s in an .htaccess file (I'm more accustomed to an IIS server), but my guess is that it shouldn't matter. They aren't pointing to different places and crossing back to each other, so you're in no danger of a redirect loop. I would imagine the spiders will follow one of them and ignore the others.
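That said, if the file has grown bloated with duplicates, one way to prune them is a one-liner like the following (back up first; this assumes rules of the form `Redirect 301 /old-path http://example.com/new` and keeps only the first rule seen for each source path, which matches how Apache applies them):

```shell
# Keep the first occurrence of each redirect source path ($3) and
# drop later duplicates; non-Redirect lines pass through untouched.
cp .htaccess .htaccess.bak
awk '!/^Redirect/ { print; next } !seen[$3]++' .htaccess.bak > .htaccess
```

Sanity-check the resulting file in a staging environment before relying on it, since any hand-written variations in the rules (tabs, `RedirectMatch`, comments on the same line) would need the pattern adjusted.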
Hope this helps and good luck!