What are the best ways to fix 404 errors?
-
I recently changed the URL of my main blog and now have about 100 404 errors. I set up a redirect from the old URL to the new one; however, I still have errors.
1. Should I do a 301 redirect from each old blog post URL to the new blog post URL?
2. Should I just delete the old blog post (URL) and rewrite the blog post?
I'm not concerned about links to the old posts, as a lot of them do not have many links.
-
Thanks Andy. I made this change: domain.com/blue-blog to domain.com/blog, using a RewriteRule. It seemed to work.
-
I did change the structure from domain.com/blue-blog to domain.com/blog, so I added a rewrite rule in the .htaccess file. That fixed a lot of things; however, there are still 100 or so 404s. They are old blog posts and not really that important.
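The directory change described above could be handled by a rule along these lines in .htaccess (a sketch only - it assumes the post slugs under /blue-blog/ carry over unchanged to /blog/):

```apache
RewriteEngine On
# Send any request under /blue-blog/ to the same path under /blog/ with a 301
RewriteRule ^blue-blog/(.*)$ /blog/$1 [R=301,L]
```

If the slugs changed as well as the directory, a single pattern like this won't cover them and individual redirects would be needed.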
-
As Tom says, 404 errors are not the end of the world.
If you are concerned, then as long as the relative URLs have remained the same and the root directory is all that has changed, a bulk 301 should work. If you've changed categories or something, though, a single blanket rule may not work so well, and 100 individual redirects would be the way to go.
Something you should do, if you've not already, is tell Google within Webmaster Tools that you've changed your URL (Configuration > Change Address) - it also has a mini guide on the steps you should be taking, including registering your new domain in Webmaster Tools.
But again, as Tom says, if it's not destroying the user experience and isn't a huge annoyance for visitors, don't worry too much about it.
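If individual redirects turn out to be necessary, they could be sketched like this in .htaccess (the post slugs here are made-up examples, not your actual URLs):

```apache
# One Redirect 301 line per old post, mapping each old slug to its new home
Redirect 301 /blue-blog/my-first-post /blog/my-first-post
Redirect 301 /blue-blog/old-post-name /blog/new-post-name
```

A hundred such lines is tedious to write but has no meaningful performance cost.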
--
Just for your reference, a full URL redirect (e.g. changing abc.com to abc.net - moving all directories and URLs in one go) would look like:
RedirectMatch 301 ^(.*)$ http://www.abc.net$1
(The $1 backreference keeps the request path, so each old URL lands on its equivalent on the new domain rather than the homepage.)
-
Are we talking about a structural change (i.e. domain.com/blog to domain.com/myblog) or a domain change (domain.com to domain2.com)? If you kept the same blog structure otherwise, I would write an .htaccess rule to blanket-redirect all URLs. It's easy to do that way, but not everyone has access to .htaccess.
I recommend 301s just because they avoid the sloppiness problem. I mean, you wrote the content for people to find, right? If they hit a 404, it just frustrates them. Whether or not you need the SEO benefit, I like it when a 301 takes me where I really need to go. It shows someone cared enough to make sure I could get to what they had done. It's a pride-of-authorship thing.
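A blanket .htaccess redirect for a full domain move (the domain.com and domain2.com names above are placeholders) might look roughly like this:

```apache
RewriteEngine On
# Match requests for the old host (with or without www) and 301 them
# to the same path on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain2.com/$1 [R=301,L]
```

For a purely structural change on the same domain, you'd drop the RewriteCond and match on the old path instead.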
-
Hi Nathan
If you're not concerned about passing the links/link equity of the old posts to a new page, and you don't think any users are visiting the URLs directly, then I would simply leave the pages as 404 errors.
404s are par for the course and Google recognises this - check out this Webmaster blog post. 100 404s isn't an awful lot, so I wouldn't worry about them unless they're interrupting a user journey (which you'll be able to check in analytics).
If you really want to get rid of them, then a 301 would be the way to go in my opinion. 100 301s will not slow down your .htaccess file by any noticeable margin. But overall, I'd let the 404s be 404s.