4XX (Client Error)
-
How much will five of these errors hurt the search engine ranking of the site itself (i.e. the domain), if these five pages return this error?
-
Not sure if this is any help to anyone, but I have almost the same issue. Mine is a 500 error, and the description says:
Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/init.py", line 391, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 500 Internal Server Error
Talking to my hosting provider, they said that when the SEOmoz bot crawled my site it pushed my CPU usage over 25%, causing the errors... We will see if this happens again on Sunday.
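If crawler load really is the cause, one mitigation is to ask the crawler to slow down via robots.txt. This is only a sketch: it assumes the Moz/SEOmoz crawler's user-agent token is rogerbot, and the delay value is an arbitrary choice (Crawl-delay is honored by some crawlers but not all):

```
# Hypothetical robots.txt lines to throttle one crawler;
# the 10-second value is an assumption, tune to your server.
User-agent: rogerbot
Crawl-delay: 10
```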
-
One of my crawls has just completed, and I see that I have five "404 : Error" messages displaying the same error as quoted. I feel I am being a little pedantic, as five seems petty compared to the numbers quoted by other members, but I would like to know if there is something I can do to eliminate them. Can you advise whether this is something produced by the Moz crawl itself, or whether it may have an external cause that I can influence?
I greatly appreciate your time.
-
Thank you!
-
We do know about the 406 errors, and we uploaded a fix for that on Tuesday. Your next crawl should not show these errors again.
Keri
-
I am experiencing exactly the same thing as alsvik: I went from zero 406 errors to 681 in one week, having changed nothing on my site. As in his case, it is PDFs and .jpgs that are generating this error, and I get exactly the same error message.
Clearly, SEOmoz has changed a Python script such that their bot no longer accepts these MIME types. Please correct this ASAP.
-
I just got spammed with 406 errors. SEOmoz suddenly found 390 of these on my site (all .png, .jpg and .pdf).
I have changed nothing on my site, and GWT shows none of these. So I'm thinking the SEOmoz crawler may be doing something wrong...
It all boils down to trust, and I trust GWT (it may be slow, though).
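For anyone wondering how a static PDF or image can produce a 406 at all: the status comes from content negotiation, where the server compares the client's Accept header against the resource's MIME type. The sketch below is a simplified illustration of that rule; the matching logic and the example headers are assumptions, not what either crawler actually sends:

```python
def negotiate(accept_header, resource_mime):
    """Return 200 if resource_mime satisfies the Accept header, else 406."""
    # Strip quality parameters like ";q=0.8" and whitespace
    accepted = [part.split(";")[0].strip() for part in accept_header.split(",")]
    for mime in accepted:
        if mime in ("*/*", resource_mime):
            return 200
        major, _, minor = mime.partition("/")
        # Wildcard subtype, e.g. "image/*" matches "image/jpeg"
        if minor == "*" and resource_mime.startswith(major + "/"):
            return 200
    return 406

# A browser's broad Accept header matches anything:
print(negotiate("text/html,application/xhtml+xml,*/*;q=0.8", "application/pdf"))  # 200
# A client that only asks for HTML gets 406 on a PDF:
print(negotiate("text/html", "application/pdf"))  # 406
```

If the crawler narrowed its Accept header, that alone would explain a sudden wave of 406s on PDFs and images with no change on the site, and would match the pattern several people here are reporting.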
-
It is on a PDF with a link on it. The error message says:
Title: 406 : Error
Meta Description:
Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/init.py", line 378, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 406 Not Acceptable
Meta Robots: Not present/empty
Meta Refresh: Not present/empty
-
It's hard to quantify the impact of the 404 pages without knowing the relative size of your site.
Overall, the 404s aren't good for your SEO. You should work toward fixing the pages that are returning the error, or 301-redirecting the bad URLs.
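On the 301 option: for a handful of known bad URLs this is usually a couple of lines of server config. A hypothetical Apache .htaccess sketch; the paths are placeholders, not the asker's actual URLs:

```
# Hypothetical paths; substitute the URLs from your crawl report.
Redirect 301 /old-page/ /new-page/
# Or move a whole section while preserving the rest of the path:
RedirectMatch 301 ^/old-section/(.*)$ /new-section/$1
```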
Related Questions
-
Subdomain 403 error
Hi Everyone, A crawler from our SEO tool detects a 403 error on a link from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawl bot, or something else? I would love to hear your thoughts.
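One way to narrow a 403 like this down is to check whether the subdomain answers differently to crawler and browser user-agents, since firewalls often block only bot-like UAs while browsers get through fine. Below is a minimal, self-contained sketch of that pattern using a local test server; the blocking rule and the rogerbot UA string are assumptions for illustration:

```python
import http.server
import threading
import urllib.error
import urllib.request

class UAFilterHandler(http.server.BaseHTTPRequestHandler):
    """Toy server that 403s any client whose User-Agent looks bot-like."""

    def do_GET(self):
        if "bot" in self.headers.get("User-Agent", "").lower():
            self.send_error(403)  # what a UA-based firewall rule would do
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence request logging

def fetch_status(url, user_agent):
    """Fetch url with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

server = http.server.HTTPServer(("127.0.0.1", 0), UAFilterHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

browser_status = fetch_status(url, "Mozilla/5.0")   # 200
crawler_status = fetch_status(url, "rogerbot/1.0")  # 403
server.shutdown()
```

Sending the same two requests to the real subdomain (e.g. with curl's -A flag) would show whether the 403 is UA-dependent; if it is, the cause is the server or its firewall rather than the crawler.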
Technical SEO | WeAreDigital_BE | Jens
-
On March 10 a client's newsroom disappeared out of the SERPs. Any idea why?
For years the newsroom, which is on the subdomain news.davidlerner.com, has ranked #2 for their brand-name search. On March 10 it fell out of the SERPs; it is completely gone. What happened? How can I fix this?
Technical SEO | MeritusMedia
-
Are duplicate content and 302 redirect errors negatively affecting ranking on my client's OS Commerce site?
I am working on an OS Commerce site and struggling to get it to rank even for the domain name. Moz is showing a huge number of 302 redirects and duplicate content issues, but the web developer claims they cannot fix those because "that is how the software in which your website is created works". Have you any experience of OS Commerce? Are the 302 redirects and duplicate content errors negatively affecting the ranking?
Technical SEO | Web-Incite
-
HTTP 500 Internal Server Error, Need help
Hi, For a few days now, Google crawlers have been getting 500 errors from our dedicated server whenever they try to crawl the site. Using the "Fetch as Google" tool under Health in Webmaster Tools, I get "Unreachable page" every time I fetch the homepage. Here is exactly what the Google crawler is getting:
<code>HTTP/1.1 500 Internal Server Error
Date: Fri, 21 Jun 2013 19:52:27 GMT
Server: Apache/2.2.15 (CentOS)
X-Powered-By: PHP/5.3.3
X-Pingback: http://www.communityadvocate.com/xmlrpc.php
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=UTF-8</code>
My URL is http://www.communityadvocate.com/ and here's the screenshot from Google Webmaster Tools: http://screencast.com/t/FoWvqRRtmoEQ How can I fix that? Thank you
Technical SEO | Vmezoz
-
406 errors
Just started seeing 406 errors on our last crawl (all .jpg related). SEOmoz found 670 of these on my site, when there were 0 before. I have checked the MIME types and everything seems to be in the right order. So could it be that the SEOmoz crawler is showing errors that aren't really errors?
Technical SEO | smines
-
WP Blog Errors
My WP blog is appending my email address to URLs during the crawl, and I am getting 200+ errors for URLs similar to the following: http://www.cisaz.com/blog/2010/10-reasons-why-microsofts-internet-explorer-dominance-is-ending/tony@cisaz.net. "tony@cisaz.net" is added to every post. Any ideas how I fix it? I am using the Yoast plugin. Thanks guys!
Technical SEO | smstv
-
Google causing Magento Errors
I have an online shop run using Magento. I have recently upgraded to version 1.4 and installed an extension called Lightspeed, a caching module which makes tremendous improvements to Magento's performance. Unfortunately, a configuration problem meant that I had to disable the module, because it was generating session-related errors if you entered the site from any page other than the home page. The site is now working as expected.

I have Magento's error notification set to email. I've not received emails for errors generated by visitors; however, over a 72-hour period I received a deluge of error emails which were being caused by Googlebot. It was generating an error in a file called lightspeed.php. Here is an example:

URL: http://www.jacksgardenstore.com/tahiti-vulcano-hammock
IP Address: 66.249.66.186
Time: 2011-06-11 17:02:26 GMT
Error: Cannot send headers; headers already sent in /home/jack/jacksgardenstore.com/user/jack_1.4/htdocs/lightspeed.php, line 444

So several things of note: I deleted lightspeed.php from the server before any of these error messages began to arrive. lightspeed.php was never exposed in the URL at any time; it was referred to in a mod_rewrite rule in .htaccess, which I also commented out. If you clicked on the URL in the error message, it loaded in the browser as expected, with no error messages.

It appears that Google has cached a version of the page which briefly existed whilst Lightspeed was enabled. But I thought that Google cached generated HTML. Since when does Google cache a server-side PHP file?

I've just used the Fetch as Googlebot facility in Webmaster Tools for the URL in the above error message, and it returns the page as expected, with no errors. I've had no errors at all in the last 48 hours, so I'm hoping it has just sorted itself out. However, I'm concerned about any Google-related implications. Any insights would be greatly appreciated. Thanks, Ben
Technical SEO | atticus7
-
Google shows the wrong domain for client's homepage
Whenever the homepage of my client's website appears in Google results, the search engine is not showing our URL as our domain, but instead a partner domain that links to us. (The correct title and meta description of our homepage are showing.) I believe this is caused by the partner website (with a much higher page rank) linking to our homepage from their footer, using a URL on their own domain that 302-redirects to our homepage. Example: Link: http://www.partnerwebsite.com/?ad2203 302 redirects to: http://www.clientwebsite.com/?moreadtracking The simple fix would be for the client to ask for removal of the 302 hijacking link, but they are uncomfortable with this request since they had requested it prior, and the relationship is not the best. Is there any other way to fix this?
Technical SEO | Conor_OShea_ETUS
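If the partner won't remove the link, a smaller ask is to have them change the redirect from 302 to 301, so Google attributes the destination page to the client's domain instead of theirs. A hypothetical Apache rewrite rule for the partner's server, built from the example URLs above (the rule and the query strings are assumptions, not their actual config):

```
# Turn the footer tracking link into a permanent redirect.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^ad2203$
RewriteRule ^$ http://www.clientwebsite.com/?moreadtracking [R=301,L]
```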