4XX (Client Error)
-
How much will five of these errors hurt the search engine ranking of the site itself (i.e., the domain) if five of its pages return this error?
-
Not sure if this is any help to anyone, but I have almost the same issue, except mine is a 500 error and the description says:

Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/__init__.py", line 391, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 500 Internal Server Error

Talking to my hosting provider, they said that when the SEOmoz bot crawled my site it pushed my CPU usage over 25%, causing the errors. We will see if this happens again on Sunday.
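If crawler load really is the culprit, one option is to ask well-behaved bots to slow down via robots.txt. Whether a given crawler honors the Crawl-delay directive varies, and the user-agent name below is an assumption; check Moz's documentation for its bot's current user-agent and supported directives before relying on this:

```text
# robots.txt (site root) - throttle Moz's crawler; value is seconds between requests
User-agent: rogerbot
Crawl-delay: 10
```

If the bot ignores Crawl-delay, rate limiting at the server or firewall level is the fallback.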
-
One of my crawls has just completed, and I see that I have five "404 : Error" messages displaying the same error as quoted above. I may be being a little pedantic, since five seems petty compared to the numbers quoted by other members, but I would like to know whether there is anything I can do to eliminate them. Can you advise whether this is something caused by the Moz crawl itself, or whether it may have an external cause that I can influence?
I greatly appreciate your time.
-
Thank you!
-
We do know about the 406 errors, and we uploaded a fix for that on Tuesday. Your next crawl should not show these errors again.
Keri
-
I am experiencing exactly the same thing as alsvik: I went from zero 406 errors to 681 in one week, having changed nothing on my site. As in his case, it is PDFs and .jpgs that are generating this error, and I get exactly the same error message.
Clearly, SEOmoz has changed a Python script such that their bot no longer accepts these MIME types. Please correct this ASAP.
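For context on why a MIME-type change would produce exactly this error: a server returns 406 Not Acceptable when it cannot satisfy the Accept header the client sent. The sketch below is a naive model of that content negotiation, not SEOmoz's actual crawler code; it just shows how a crawler that stops advertising PDF/image types in its Accept header would suddenly trigger 406s on those files:

```python
def is_acceptable(accept_header, content_type):
    """Naive check of whether a response Content-Type satisfies an Accept header.

    Ignores q-weights; real negotiation (RFC 7231 section 5.3.2) is more involved.
    """
    for item in accept_header.split(","):
        media = item.split(";")[0].strip()  # drop any ;q= weight
        if media == "*/*":
            return True
        type_, _, subtype = media.partition("/")
        ctype, _, csub = content_type.partition("/")
        if type_ == ctype and (subtype == "*" or subtype == csub):
            return True
    return False

# A crawler that only advertises HTML would be refused a PDF with a 406:
status = 200 if is_acceptable("text/html", "application/pdf") else 406
```

A crawler sending "Accept: */*" (or simply listing image/pdf types) would never see these 406s, which is consistent with GWT showing none.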
-
I just got spammed with 406 errors. SEOmoz suddenly found 390 of these on my site (all PNG, JPG, and PDF files).
I have changed nothing on my site, and GWT shows none of these, so I'm thinking the SEOmoz crawler may be doing something wrong.
It all boils down to trust, and I trust GWT (even if it may be slow).
-
It is on a PDF with a link on it. The error message says:

Title: 406 : Error
Meta Description:
Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/__init__.py", line 378, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 406 Not Acceptable
Meta Robots: Not present/empty
Meta Refresh: Not present/empty
-
It's hard to quantify the impact of the 404 pages without knowing the relative size of your site.
Overall, 404s aren't good for your SEO. You should work toward fixing the pages that return the error, or 301 redirecting the bad URLs.
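If the bad URLs have obvious replacements, the 301s can live in server configuration. A sketch for Apache (the paths are hypothetical; syntax differs on nginx and other servers):

```apache
# .htaccess or vhost config: permanently redirect a removed page
Redirect 301 /old-page/ /new-page/

# Or map a whole retired section with mod_rewrite
RewriteEngine On
RewriteRule ^discontinued-category/(.*)$ /products/$1 [R=301,L]
```

Either way, re-crawl afterwards to confirm the old URLs now answer 301 rather than 404.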
Related Questions
-
Weird 404 errors in Webmaster Tools
Hi, in a regular check with Webmaster Tools, I have noticed some weird 404 errors. For example, my domain URL is something like http://domainname.com/, and the 404 errors point to weird URLs like http://domainname.com/james-bond&page=2/ and http://domainname.com/juegos-de&page=3/. At first I tried to block them with robots.txt, but now I am getting a lot of these 404 errors, and I don't think blocking them all is a perfect solution. Can anyone help me out with this issue? Thank you in advance. Cheers.
Technical SEO | nishthaj
-
404 errors
Hi, I am getting these showing up in WMT crawl errors; any help would be very much appreciated.

| # | URL | Status | Date |
| 1 | ?ecaped_fragment=Meditation-find-peace-within/csso/55991bd90cf2efdf74ec3f60 | 404 | 12/5/15 |
| 2 | mobile/?escaped_fragment= | 404 | 10/26/15 |
| 3 | ?escaped_fragment=Tips-for-a-balanced-lifestyle/csso/1 | 404 | 12/1/15 |
| 4 | ?escaped_fragment=My-favorite-yoga-spot/csso/5598e2130cf2585ebcde3b9a | 404 | 12/1/15 |
| 5 | ?escaped_fragment=blog/c19s6 | 404 | 11/29/15 |
| 6 | ?escaped_fragment=blog/c19s6/Tag/yoga | 404 | 11/30/15 |
| 7 | ?escaped_fragment=Inhale-exhale-and-once-again/csso/2 | 404 | 11/27/15 |
| 8 | ?escaped_fragment=classes/covl | 404 | 10/29/15 |
| 9 | m/?escaped_fragment= | 404 | 10/26/15 |
| 10 | ?escaped_fragment=blog/c19s6/Page/1 | 404 | 11/30/15 |

Technical SEO | ReSEOlve
-
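For background on the WMT 404 list above: "_escaped_fragment_" URLs are a signature of Google's old (now deprecated) AJAX crawling scheme, in which crawlers rewrote hash-bang URLs like example.com/#!blog/c19s6 into example.com/?_escaped_fragment_=blog/c19s6 and expected an HTML snapshot there. If the site stops serving those snapshots, the rewritten URLs 404. A minimal sketch of the scheme's rewrite and its inverse (the URLs are hypothetical):

```python
import urllib.parse

def to_escaped_fragment(url):
    """Rewrite a #! (hash-bang) URL into its _escaped_fragment_ form."""
    base, sep, frag = url.partition("#!")
    if not sep:
        return url  # no hash-bang, nothing to rewrite
    joiner = "&" if "?" in base else "?"
    return base + joiner + "_escaped_fragment_=" + urllib.parse.quote(frag, safe="/")

def from_escaped_fragment(url):
    """Invert the rewrite back to the original hash-bang URL."""
    base, sep, frag = url.partition("?_escaped_fragment_=")
    if not sep:
        return url
    return base + "#!" + urllib.parse.unquote(frag)
```

Mapping each reported URL back through from_escaped_fragment shows which client-side page the crawler was actually trying to snapshot.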
Duplicate Page Title error when passing a PHP variable
Hi, I've searched and read about this, and I can't get my head around it, so I could really do with some help. I have a lot of contact buttons that all lead to the same enquiry form, and depending on where the visitor came from, the enquiry field on the contact form is pre-filled. For example, if you are on the airport transfers page, the link carries the value so it's pre-filled in (.php?prt=Airport Transfers). The problem is that the page is coming up as a duplicate, even though it's just the one page. I have this problem with quite a few sites and really need to combat this issue. Any help would be very much appreciated. airport-transfers.php
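One common way to resolve this kind of duplicate, independent of the CMS, is to point every parameterized variant of the form at a single canonical URL. The domain below is hypothetical; the filename and parameter come from the question:

```html
<!-- In the <head> of contact.php, emitted regardless of the ?prt=... query string -->
<link rel="canonical" href="https://www.example.com/contact.php" />
```

With that in place, contact.php?prt=Airport Transfers and its siblings are all attributed to the one canonical page rather than counted as duplicates.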
Technical SEO | i7Creative
-
GWT crawl errors: How big a ranking issue?
For family reasons (a child to look after) I can't keep a close eye on my SEO and SERPs. But from top-10 rankings in January for a dozen keywords, I'm now not in the top 80 results, save for one keyword, for which I'm around 18-20.
It's not a sitewide penalty: some of my internal pages are still ranking top 3 or so. In late March, GWT warned me of a rise in server errors:
17 server errors / 575 soft 404s / 17 not founds / 1 access denied / 4 others
I've also got two very old sitemaps (from two different ex-SEO firms), and I'm guessing about 75% of the links on them no longer exist. Q: Could all this be behind my calamitous SERP drop? Or should I be devoting my limited time to improving my links?
Technical SEO | Jeepster
-
How do crawl errors from the SEOmoz toolset affect rankings?
Hello. The other day I presented the crawl diagnostic report to a client. We identified duplicate page title errors, missing meta description errors, and duplicate content errors. After reviewing the report, we presented it to the client's web company, which operates a closed-source CMS. Their response was that these errors are not worth fixing and in fact are not hurting the site. We are having trouble getting the errors fixed, and I would like your opinion on the matter. My question is: how bad are these errors? Should they be fixed or not? Will fixing them have an impact on the site's rankings? Personally, I think the question is silly; the errors were found using the SEOmoz toolkit, so they have to be affecting SEO... right? The attached image (c9Q33.png) shows the result of the Crawl Diagnostics run, which crawled 1,400 pages. NOTE: most of the errors come from pages like blog/archive/2011-07/page-2, /blog/category/xxxxx-xxxxxx-xxxxxxx/page-2, and testimonials/147/xxxxx--xxxxx (xxxx represents information unique to the client). Thanks for your insight!
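For what it's worth, a duplicate-title check is easy to reproduce independently of any toolkit, which can help persuade a skeptical web company that the report isn't an artifact of one vendor's crawler. A minimal sketch (the crawl data below is hypothetical, modeled on the paginated archive URLs in the question):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages: iterable of (url, title) pairs from a crawl.

    Returns {normalized_title: [urls]} for every title used by more than one URL.
    """
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

crawl = [
    ("/blog/archive/2011-07/page-2", "Blog Archive"),
    ("/blog/archive/2011-08/page-2", "Blog Archive"),
    ("/about", "About Us"),
]
dupes = find_duplicate_titles(crawl)
```

If two independent checks agree, the argument shifts from "whose tool is right" to "how do we template unique titles for paginated pages".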
Technical SEO | Gabe
-
How to avoid 404 errors when taking a page down?
So... we are running a blog that was supposed to have great content. Working on SEO for a while, I discovered there is too much keyword stuffing, along with some spammy WordPress SEO tricks that were supposed to make it rank better. In fact, that worked, but I don't want to risk getting slapped by the Google puppy-panda, so we decided to restart our blog from zero and make a better attempt. Every page was already ranking in Google. SEOmoz hasn't run the crawl yet, but I'm pretty sure the crawlers would report a lot of 404 errors. My question is: can I avoid these errors with some tool in Google Webmaster Tools, such as sitemaps, or should I set up rel=canonical tags or 301 redirects? Does Google penalize me for this? It seems kind of obvious to me that the answer is YES. Please help 😉
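If the old posts are being retired rather than merged, a redirect map is the usual way to keep their URLs answering 301 (pointing at the closest replacement) instead of 404. A minimal, framework-agnostic sketch in Python; the paths are hypothetical, and in production this logic would live in your server config or CMS rather than application code:

```python
# Map of retired blog URLs to their closest replacements (hypothetical paths).
REDIRECTS = {
    "/old-keyword-stuffed-post": "/blog/new-clean-post",
    "/old-category/": "/blog/",
}

def resolve(path):
    """Model the server's response: (status_code, redirect_target_or_None)."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None  # genuinely gone, no replacement chosen
```

URLs deliberately left out of the map will 404 (or better, 410), which is fine for content you never want revisited; the map is for URLs whose link equity and visitors you want to keep.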
Technical SEO | ivan.precisodisso
-
4XX Broken Links
I am attempting to fix the issues SEOmoz found when crawling my site. I have a list of 4XX errors I am trying to fix. I know one option is to redirect them to another page, but I would like the option to remove the links completely. The only problem is I cannot find where the links are located. Does SEOmoz report where on my site these broken links are, or does it only provide the URL that is linked to?
Technical SEO | ClaytonKendall
404 errors on a 301'd page
I currently have a site where, when run through a sitemap tool (Screaming Frog or Xenu), a number of pages return a 404 error. The pages are indexed in Google, and when visited they 301 to the correct page. Why would the sitemap tool give me a different result? Is it not reading the pages correctly?
Technical SEO | EAOM
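One common source of this kind of discrepancy is whether the client follows redirects: a tool reporting the first response sees the 301 itself (and may mislabel or mishandle it), while a browser-like client follows through to the final 200. The sketch below stands up a tiny local server with a 301 to show both views; it uses only the Python standard library and hypothetical paths, and is an illustration of the general mechanism, not of Screaming Frog's or Xenu's internals:

```python
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            self.send_response(301)
            self.send_header("Location", "/new")
            self.end_headers()
        elif self.path == "/new":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # refuse to follow redirects, like a first-response-only crawler

try:
    urllib.request.build_opener(NoRedirect).open(base + "/old")
    raw_status = 200  # not reached: the unfollowed 301 raises HTTPError
except urllib.error.HTTPError as e:
    raw_status = e.code  # the 301 itself, as a crawler reporting first responses sees it

followed_status = urllib.request.urlopen(base + "/old").status  # follows to /new

server.shutdown()
```

If your tool has a "follow redirects" setting, toggling it should reconcile its report with what you see in the browser; checking one of the flagged URLs with a raw client like the one above shows which status is actually being served first.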