What are the best ways to fix 404 errors?
-
I recently changed the URL of my main blog and now have about 100 404 errors. I did a redirect from the old URL to the new one, however I still have errors.
1. Should I do a 301 redirect from each old blog post URL to the new blog post URL?
2. Should I just delete the old blog post (URL) and rewrite the blog post?
I'm not concerned about links to the old posts, as a lot of them do not have many links.
-
Thanks Andy. I made this change: domain.com/blue-blog to domain.com/blog using a RewriteRule. It seemed to work.
-
I did change the structure from domain.com/blue-blog to domain.com/blog, so I added a rewrite rule in the .htaccess file. That fixed a lot of things, however there are still 100 or so 404s. They are old blog posts and not really that important.
-
As Tom says, 404 errors are not the end of the world.
If you are concerned, then as long as the relative URLs have remained the same and the root directory is all that has changed, a bulk 301 should work - though if you've changed categories or something, it may not work so well as a single rule, and 100 individual redirects would be the way to go.
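For the change described here (domain.com/blue-blog to domain.com/blog), a bulk rule would look something like this - a sketch only, assuming Apache and that only the directory name changed while the post slugs stayed the same:
# Send every old /blue-blog/ URL to the same path under /blog/
RedirectMatch 301 ^/blue-blog/(.*)$ /blog/$1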
Something that you should do, if you've not already, is make sure you tell Google within Webmaster Tools that you've changed your URL (Configuration > Change of Address) - it also has a mini guide on the steps you should be taking, including registering your new domain in Webmaster Tools.
But again, as Tom says, if it's not destroying the user experience and isn't a huge annoyance for visitors, don't worry too much about it.
--
Just for your reference, a full URL redirect (aka changing, say, abc.com to abc.net - moving all directories and URLs in one go) would look like:
# The $1 back-reference keeps the requested path on the new domain
RedirectMatch 301 ^(.*)$ http://www.abc.net$1
-
Are we talking about a structural change (e.g. domain.com/blog to domain.com/myblog) or a domain change (domain.com to domain2.com)? If you kept the same blog structure otherwise, I would write a .htaccess file to make sure you just blanket redirect all URLs. It's easy to do that way, but not everyone has access to that.
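For the domain-change case specifically, the blanket redirect is usually done with a host check, so only requests arriving on the old domain get rewritten. A rough sketch, assuming Apache mod_rewrite and placeholder domains:
RewriteEngine On
# Only redirect requests that still come in on the old host
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
# Keep the requested path intact on the new domain
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]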
I recommend 301s just because they avoid the sloppiness problem. I mean, you wrote the content for people to find, right? If they hit a 404 it just frustrates them. It doesn't matter whether or not you need the SEO; I like it when a 301 takes me where I really need to go. It shows someone cared enough to make sure I could get to what they had done. It's a pride of authorship thing.
-
Hi Nathan
If you're not concerned about passing the links/link equity of the old posts to a new page, or if you don't think there are any users visiting the URL directly, then I would simply leave the page as a 404 error.
404s are par for the course and Google recognises this - check out this webmaster blog post. 100 404s isn't an awful lot, so I wouldn't worry about them unless they're interrupting a user journey (which you'll be able to check in analytics).
If you really want to get rid of them, then a 301 would be the way to go in my opinion. 100 301s will not slow down your .htaccess file by any noticeable margin. But overall, I'd let the 404s be 404s.
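If you do decide to go the one-redirect-per-post route instead, each mapping is just a single line in .htaccess, along these lines (the post slugs below are made up for illustration):
# One explicit 301 per old post URL
Redirect 301 /blue-blog/example-old-post /blog/example-old-post
Redirect 301 /blue-blog/another-old-post /blog/another-old-post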
Related Questions
-
E-commerce System without error page
Intermediate & Advanced SEO | SeoMartin1
I'd love to know your thoughts about this particular issue: Vtex is a top-3 e-commerce system in Brazil (so the issue is huge). The system does not use 4xx response codes. If there is an error page, it just redirects to a search page with a 200 code, so in the Google index we can find a lot of "empty" pages (indexed error pages). We can't use noindex for them. Example:
http://www.taniabulhoes.com.br/this-is-a-test
or
http://www.taniabulhoes.com.br/thisisatest
Any suggestions?
-
Page A Best for Users, but B Ranks
Intermediate & Advanced SEO | khi5
This is real estate MLS listings related. I have a page "B" with lots of unique content (MLS thumbnails mixed with guide overview writing, pictures etc.) which outranks "A", which is a page simply showing MLS thumbnails with a map feature included. I am linking from "B" to "A" with the anchor "KEYWORD for sale" to indicate to search engines that "A" is the page I want to rank, even though "B" has more unique content. It hasn't worked so far.
Questions:
1. Should I avoid linking from "B" to "A", as that could impact how well "B" ranks?
2. Should I leave this setup and over time hope search engines will give "A" a chance to rank?
3. Should I include some unique content on "A", mostly not viewable without clicking a "Read more" link? I don't foresee many users will click "Read more", as they are really just looking for the properties for sale and rarely care about written material when searching for "KEYWORD for sale".
4. Should I "noindex, follow" A, as there is little to no unique content, and this could enhance the chance of ranking better for B?
5. When I write blog posts that include "KEYWORD for sale", should I link to "A" (best for users) or link to "B", since that page has more potential to rank really well and is still fairly good for users?
Ranking for "B" is not creating a large bounce rate, just that "A" is even better. Thank you,
Kristian
-
Best practice to avoid cannibalization of internal pages
Intermediate & Advanced SEO | laboiteac
Hi everyone, I need help from the best SEO guys regarding a common issue: the cannibalization of internal pages between each other. Here is the case: let's say I run the website CasualGames.com. This website provides free games, as well as articles and a general presentation of given categories of casual games. For instance, for the category "Sudoku Games", the structure will be:
Home page of the game: http://www.casualgames.com/sudoku/
Free sudoku game listings (around 100 games listed): http://www.casualgames.com/sudoku/free/
A particular sudoku game: http://www.casualgames.com/sudoku/free/game-1/
A news item regarding sudoku games: http://www.casualgames.com/sudoku/news/title
The problem is that these pages seem to "cannibalize" each other. Explanation: in the SERPs, for the keyword "Casual Games", the home page doesn't appear well ranked, and some specific sudoku game pages (one of the 100 games) are better ranked although they are "sub-pages" of the category. Same for the news pages: a few are better ranked than the category page. I am kind of lost. Any idea what would be the best practice in this situation? THANKS a LOT.
Guillaume
-
Duplicate Content Error because of passed-through variables
Intermediate & Advanced SEO | CTSupp
Hi everyone... When getting our weekly crawl of our site from SEOmoz, we are getting errors for duplicate content. We generate pages dynamically based on variables we carry through the URLs, like:
http://www.example123.com/fun/life/1084.php
http://www.example123.com/fun/life/1084.php?top=true
i.e., ?top=true is the variable being passed through. We are a large site (approx. 7000 pages), so obviously we are getting many of these duplicate content errors in the SEOmoz report. Question: are the search engines also penalizing for duplicate content based on variables being passed through? Thanks!
-
Best way to SEO crowdsourcing site
Intermediate & Advanced SEO | StreetwiseReports
What is the best way to SEO a crowdsourcing site? The website's content is entirely propagated by its users.
-
Squarespace Errors
Intermediate & Advanced SEO | RainforestCruises
We have a website hosted by Squarespace. We are happy with SS, but have done some crawl diagnostics and noticed several errors. These are primarily: Duplicate Page Title, Duplicate Page Content, and Client Error (4xx). We don't really understand why these errors are taking place, and wonder if someone in the SEOmoz forum has a firm understanding of SS who is able to assist us with this? rainforestcruises.com. Thanks.
-
Best way to migrate to a new URL structure
Intermediate & Advanced SEO | anthematic
Hello everyone, we're changing our URL structure from something like this:
example.com/index.php?language=English
to something like this:
example.com/english/index.php
The change is implemented with mod_rewrite so all the old URLs can still work. We have hundreds of thousands of pages that are currently indexed with the old URL structure. What's the best way to get Google to rapidly update its index and to maintain as much ranking as possible?
1. 301 redirect all the old URLs to the new equivalent format?
2. If we detect that the URL is in an old format, render the page with a canonical tag pointing to the new equivalent format, as well as adding a noindex, nofollow tag?
3. Something else?
Thanks for your input!
-
Do 404 Errors hurt SEO?
Intermediate & Advanced SEO | RagingBull
I recently did a crawl test and it listed over 10,000 pages, but around 282 of them generated 404 errors (bad links). I'm wondering how much this hurts overall SEO and if it's something I should focus on fixing ASAP? Thanks.