Status Code: 404 Errors. How to fix them.
-
Hi,
I have a question about the "4xx Status Code" errors appearing in the Analysis Tool provided by SEOmoz. They are flagged as the worst kind of error for your site and must be fixed. I get this message from the good people at SEOmoz:
"4xx status codes are shown when the client requests a page that cannot be accessed. This is usually the result of a bad or broken link."
Ok, my question is the following: how do I fix them? Those pages are shown as "404" pages on my site...isn't that enough? How can I fix the "4xx status code" errors indicated by SEOmoz?
Thank you very much for your help.
- Sal
-
Why not 301 the 404s to similar pages? Fix your problem AND transfer some of the link juice.
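If you go the redirect route, a 301 can be set up in an Apache `.htaccess` file. A minimal sketch, assuming an Apache server with mod_rewrite enabled (the paths here are hypothetical examples, not URLs from the question):

```apache
# Redirect a single removed page to its closest replacement
Redirect 301 /old-page.html /new-page.html

# Or, with mod_rewrite, redirect a whole retired section
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

Only redirect a 404 to a genuinely similar page; redirecting everything to the homepage tends to be treated as a soft 404.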
-
As Ben Fox stated, you can use the report to find the linking errors.
I'd also run a scan of your site using Xenu Link Sleuth (it's 100% free) if you're a PC user. Some people prefer Screaming Frog; both work well, and Screaming Frog has a free and a paid version to my knowledge.
I use Xenu personally, been using it for years with much success. You'd be surprised what kind of stuff it digs up.
-
Hi Sal,
If you look for the referrer column in the report you can see which pages are linking to the broken URLs.
Fix these broken links and you won't be generating so many 4xx pages.
That's the theory anyway. It can be a pretty arduous task but if you stick to it you should be able to get that number down.
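The process above (find pages linking to broken URLs, then fix the links) can also be scripted. A minimal sketch in Python using only the standard library: it extracts the links from a page's HTML so each one can then be checked for a 4xx response. The page URL and HTML are placeholders, not anything from this thread.

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags so they can be status-checked."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    """Return every href found in the given HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


# Checking each extracted link's status code (needs network access, so
# shown as a comment):
#
# import urllib.request, urllib.error
# for link in extract_links(page_html):
#     try:
#         status = urllib.request.urlopen(link).status
#     except urllib.error.HTTPError as e:
#         status = e.code
#     if 400 <= status < 500:
#         print(f"Broken link: {link} ({status})")
```

A dedicated crawler like Xenu or Screaming Frog does the same thing at scale, following links site-wide and reporting the referring page for every 404.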