4XX (Client Error)
-
How much will 5 of these errors hurt the search engine ranking of the site itself (i.e., the domain) if 5 pages return this error?
-
Not sure if this is any help to anyone, but I have almost the same issue, except mine is a 500 error and the description says:
Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/__init__.py", line 391, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 500 Internal Server Error
Talking to my hosting provider, they said that when the SEOmoz bot crawled my site it pushed my CPU usage over 25%, causing the errors... We will see if this happens again on Sunday.
-
One of my crawls has just completed and I see that I have five "404 : Error" messages that display the same error as quoted. I may be being a little pedantic, as five seems petty compared to the numbers quoted by the other members, but I would just like to know if there is something I can do to eliminate these. Can you advise whether this is something caused by the Moz crawl itself, or whether it may have an external cause that I can influence?
I greatly appreciate your time.
-
Thank you!
-
We do know about the 406 errors, and we uploaded a fix for that on Tuesday. Your next crawl should not show these errors again.
Keri
-
I am experiencing exactly the same thing as alsvik: I went from 0 406 errors to 681 in one week, having changed nothing on my site. As in his case, it is PDFs and .jpgs that are generating this error, and I get exactly the same error message.
Clearly, SEOmoz has changed a python script such that their bot no longer accepts these MIME types. Please correct this ASAP.
-
I just got spammed with 406 errors. SEOmoz suddenly found 390 of these on my site (all .png, .jpg, and .pdf files).
I have changed nothing on my site, and GWT shows none of these, so I'm thinking the SEOmoz crawler may be doing something wrong...
It all boils down to trust, and I trust GWT (even if it can be slow).
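One way to check whether these 406s come from your server rather than from the crawler is to fetch an affected file yourself with different Accept headers, since 406 Not Acceptable means the server rejected the request's content negotiation. A minimal sketch using Python's standard library (the URL is a placeholder, not one of the sites in this thread):

```python
import urllib.request
import urllib.error

def status_for(url, accept):
    """Fetch url with a given Accept header and return the HTTP status code."""
    req = urllib.request.Request(
        url, headers={"Accept": accept, "User-Agent": "status-check/1.0"}
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses arrive as HTTPError; the status code is what we want.
        return e.code

if __name__ == "__main__":
    # A permissive Accept should succeed; a narrow one can trigger 406
    # on servers or security modules with strict content negotiation.
    for accept in ("*/*", "text/html"):
        print(accept, status_for("https://www.example.com/some-file.pdf", accept))
```

If a narrow Accept header reproduces the 406 but `*/*` does not, the server (or a security module in front of it) is the cause rather than the crawler's reporting.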
-
It is on a PDF with a link on it. The error message says:
<dt>Title</dt>
<dd>406 : Error</dd>
<dt>Meta Description</dt>
<dd>Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/__init__.py", line 378, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 406 Not Acceptable</dd>
<dt>Meta Robots</dt>
<dd>Not present/empty</dd>
<dt>Meta Refresh</dt>
<dd>Not present/empty</dd>
-
It's hard to quantify the impact of the 404 pages without knowing the relative size of your site.
Overall, the 404s aren't good for your SEO. You should work toward fixing the pages that are giving the error, or 301-redirecting the bad URLs.
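For the redirect option, a single known bad URL can be permanently redirected with one line in an Apache .htaccess file, the same mechanism used elsewhere in this thread (the paths here are placeholders):

```apache
# Permanently redirect a removed page to its closest replacement.
Redirect 301 /old-page.html https://www.example.com/new-page.html
```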
Related Questions
-
Homepage not ranking main keywords - structure error?
Dear Moz community, we have an issue. We have a classified advertisement website, built like this:
- Homepage (optimized for the main keyword; shows the latest listings from all categories)
- Category 1 (we did not want to use a variation of the keyword we want the homepage to rank for, as we thought this would "compete" with the homepage)
- Category 2
- Category 3
- Category 4
The listing URLs look like this: www.example.com/categoryname/listingname. Now the issue is that the homepage is not ranking at all for the main keywords. When we used a URL structure like example.com/main-keyword-listing-id (on other sites), the homepage ranked. On the new site we followed best practice and used URLs as described above (/categoryname/listingid), and this caused our homepage not to rank at all for the main keywords. What did we do wrong? We want our homepage to rank for the main keyword and the categories for theirs. Should we:
1. Change the category 1 name to the main keyword (maybe a long-tail variation) so we have the main keyword in the URLs? That way at least one of the main categories would have the main keyword in its listing URLs.
2. Change the category listing URLs all back to /main-keyword-listing-id? We thought that was a bit spammy, which is why we used categories; it also means all listings share the same URL name, which is not best for ranking the categories.
3. Just link back to the homepage internally with the main keyword and let Google pick that up? Currently the menu link to the homepage says HOME, but we could put our main keyword there, e.g. "Latest car advertisements".
I would be happy for any feedback.
Technical SEO | advertisingtech
-
Can an increase in crawl errors in GWT be caused by input fields and jQuery?
Dear Mozzers, we took over www.urgiganten.dk not long ago, and last week we opened it up for indexation, after having taken the old website down for a couple of months. One week after opening for indexation we saw a huge increase in crawl errors. Google is discovering some weird links, e.g. http://www.urgiganten.dk/30-garmin-urremme/, which returns a 404. In GWT we are told that we are linking to this URL from http://www.urgiganten.dk/garmin-urremme, but nowhere on http://www.urgiganten.dk/garmin-urremme will you find this link. However, you will find a script in the source code, which is the only part of the code that contains "/30-garmin-urremme/". Can it be true that Google takes the id from the script and appends it to our TLD to form a URL? We have seen quite a lot of these errors, not only on urgiganten.dk but also on some of our other websites!
Technical SEO | urgiganten
-
Database-driven content producing false duplicate content errors
How do I stop the Moz crawler from creating false duplicate content errors? I have yet to submit my website to the Google crawler because I am waiting to fix all my site optimization issues. Example: contactus.aspx?propid=200 and contactus.aspx?propid=201 are the same page, but with some old URL parameters stuck on them. How do I get Moz and Google not to consider these duplicates? I have looked at http://moz.com/learn/seo/duplicate-content with respect to rel="canonical" and I think I am just confused. Nick
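Since the duplicates differ only by a query parameter, the standard fix is a rel="canonical" tag in the page's head pointing at the version you want indexed; a minimal sketch (the domain is a placeholder, the page path is taken from the question):

```html
<!-- In the <head> of contactus.aspx; emit this regardless of ?propid=... -->
<link rel="canonical" href="https://www.example.com/contactus.aspx" />
```

With this in place, crawlers that honor the tag fold the parameterized variants into the one canonical URL.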
Technical SEO | nickcargill
-
I noticed all my SEOed sites are getting attacked constantly by viruses. I do WordPress sites. Does anyone have a good recommendation to protect my clients' sites? Thanks
We have tried all different kinds of security plugins, but none seem to work long term.
Technical SEO | Carla_Dawson
-
Is anyone able to check this 301 redirect for errors please?
Hi, I had a developer write a 301 wildcard for redirecting an old hosted site to a new domain. Old URLs looked like /b/2039566/1/akai.html, with varying letters and numbers. I have 26,000 crawl errors in GWT and I can only imagine it's because this is looping?
Can anyone advise if this would be causing grief? Thanks,
Paul

RewriteCond %{HTTP_HOST} ^vacuumdirect.com.au$ [OR]
RewriteCond %{HTTP_HOST} ^www.vacuumdirect.com.au$
RewriteRule ^/?$ "http://www.vacuumbag.net.au/vacuum-cleaners.html" [R=301,L]

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^p/([0-9]+)/(.*) default/$2 [R=301,L]
</IfModule>

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^c/([0-9]+)/1/(.*) default/vacuum-bags/vacuum-cleaner-bags-$2 [R=301,L]
</IfModule>

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^p/([0-9]+)/(.*) $2 [R=301,L]
</IfModule>

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^c/([0-9]+)/(.*) default/$2 [R=301,L]
</IfModule>

Technical SEO | Paul_MC
-
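One way to sanity-check rewrite rules like these offline is to simulate the pattern matching with Python's re module. This simplified sketch (it ignores Apache flag semantics and per-directory path details) also highlights that the sample old URL /b/2039566/1/akai.html begins with /b/, which none of the ^p/ or ^c/ patterns match, so those old URLs would fall through un-redirected:

```python
import re

# The RewriteRule patterns from the question, in order, as
# (pattern, replacement-with-backreferences) pairs.
rules = [
    (r"^p/([0-9]+)/(.*)", r"default/\2"),
    (r"^c/([0-9]+)/1/(.*)", r"default/vacuum-bags/vacuum-cleaner-bags-\2"),
    (r"^p/([0-9]+)/(.*)", r"\2"),  # never fires: identical to the first pattern
    (r"^c/([0-9]+)/(.*)", r"default/\2"),
]

def first_match(path):
    """Return the rewritten path from the first matching rule, or None."""
    for pattern, repl in rules:
        m = re.match(pattern, path)
        if m:
            return m.expand(repl)
    return None

print(first_match("p/123/some-product.html"))  # default/some-product.html
print(first_match("b/2039566/1/akai.html"))    # None: /b/ URLs match no rule
```

A URL that matches no rule 404s rather than loops, which is consistent with crawl errors piling up in GWT; this is only an offline approximation, so verify against the live server.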
I am getting an error message from Google Webmaster Tools and I don't know what to do to correct the problem
The message is:
"Dear site owner or webmaster of http://www.whitegyr.com/, We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team"
I have always tried to follow Google's guidelines and don't know what I am doing wrong. I have eight different websites all getting this warning, and I don't know what is wrong. Is there anyone you know who will look at my sites and advise me on what I need to do to correct the problem? Websites with this warning:
artistalaska.com
cosmeticshandbook.com
homewindpower.ws
montanalandsale.com
outdoorpizzaoven.net
shoes-place.com
silverstatepost.com
www.whitegyr.com
Technical SEO | whitegyr
-
Why is it that in the exported CSV there are no referring pages shown for 404 errors?
Within some of my campaigns I can see issues regarding 404 pages. Then, when I export the data to a CSV, sometimes the referring pages that lead to the 404 are not shown. Am I missing something here?
Technical SEO | 5MMedia
-
Unknown "/" added causing 404 error
I have four 404 redirect errors that I cannot sort out. It tells me the referring URL, www.homedestination.com/calculator-mortgage-resources.html, has a "/" on the end, and it cannot find www.homedestination.com/calculator-mortgage-resources.html. I cannot figure out where this referring URL is, as it is in the root file without a "/" on the end. Could it be on a page somewhere? All my Dreamweaver page link tests come back OK. I must be missing something simple and would value help from others who may spot it. Thanks!
Technical SEO | jessential