4XX (Client Error)
-
How much will 5 of these errors hurt the search engine ranking of the site itself (i.e. the domain) if these 5 pages have this error?
-
Not sure if this is any help to anyone, but I have almost the same issue, except it's a 500 error and the description says:
Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/__init__.py", line 391, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 500 Internal Server Error
Talking to my hosting provider, they said that when the SEOmoz bot crawled my site it pushed my CPU usage over 25%, causing the errors... We will see if this happens again on Sunday.
-
One of my crawls has just completed and I see that I have five "404 : Error" messages that display the same error as quoted. I feel I am being a little pedantic, as 5 seems petty compared with the numbers quoted by the other members, but I would just like to know if there is something I can do to eliminate these. Can you please advise whether this is something produced by the Moz crawl itself, or whether it may have an external cause that I can influence?
I greatly appreciate your time.
-
Thank you!
-
We do know about the 406 errors, and we uploaded a fix for that on Tuesday. Your next crawl should not show these errors again.
Keri
-
I am experiencing exactly the same thing as alsvik: I went from zero 406 errors to 681 in one week, having changed nothing on my site. Like him, it is PDFs and .jpgs that are generating this error, and I get EXACTLY the same error message.
Clearly, SEOmoz has changed a Python script such that their bot no longer accepts these MIME types. Please correct this ASAP.
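For anyone who wants to check whether the 406 is coming from the server's content negotiation or from the crawler itself, here is a minimal sketch; the URL and user agent are placeholders, and it assumes the requests library is installed:

```python
# A minimal sketch to reproduce a 406 by hand; the URL and User-Agent below
# are placeholders, and the requests library is assumed to be installed.
import requests

url = "https://www.example.com/files/brochure.pdf"  # hypothetical PDF URL
headers = {
    "User-Agent": "rogerbot/1.0",  # stand-in for the crawler's user agent
    # An Accept header that deliberately excludes application/pdf:
    "Accept": "text/html,application/xhtml+xml",
}

response = requests.get(url, headers=headers)
# A 406 here means the server refuses to serve the PDF for that Accept
# header; a 200 with the same header points back at the crawler.
print(response.status_code)
```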
-
I just got spammed with 406 errors. SEOmoz suddenly found 390 of these on my site (all PNG, JPG and PDF files).
I have changed nothing on my site, and GWT shows none of these, so I'm thinking the SEOmoz crawler may be doing something wrong...
It all boils down to trust. I trust GWT (though it may be slow).
-
It is on a PDF with a link on it. The error message says:
Title: 406 : Error
Meta Description:
Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/__init__.py", line 378, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 406 Not Acceptable
Meta Robots: Not present/empty
Meta Refresh: Not present/empty
-
It's hard to quantify the impact of the 404 pages without knowing the relative size of your site.
Overall, the 404s aren't good for your SEO. You should work towards fixing the pages that are returning the error, or 301 redirecting the bad URLs.
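If your site happens to run behind a small Python application, here is a minimal sketch of the 301 approach; Flask and the URL mappings below are my own assumptions, not anything from this thread:

```python
# A minimal sketch of 301-redirecting retired URLs, assuming the site is
# served by a small Flask app; the old/new paths below are hypothetical.
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical map of removed pages to their closest live equivalents.
REDIRECTS = {
    "/old-product": "/new-product",
    "/discontinued-category": "/",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        # A 301 tells crawlers the move is permanent and passes the old
        # page's link equity on to the target.
        return redirect(target, code=301)
    return "Not Found", 404
```

On Apache or nginx hosting, the equivalent mapping lives in the server configuration rather than application code, but the principle is the same.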
Related Questions
-
Errors In Search Console
Hi All, I am hoping someone might be able to help with this. Last week one of my sites dropped from the middle of page 1 to the bottom of page 1. We had not been link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page).

When I was going through everything I went to Search Console, and in crawl errors there are 2 errors that showed up as detected 3 days before the drop. These are: wp-admin/admin-ajax.php showing as response code 400, and xmlrpc.php showing as response code 405.

robots.txt is as follows:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Any help with what is wrong here and how to fix it would be greatly appreciated. Many Thanks
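As a quick first check, here is a minimal sketch that fetches both endpoints directly; the domain is a placeholder and the requests library is assumed:

```python
# A minimal sketch for checking how those two endpoints answer a plain GET;
# the domain below is a placeholder and the requests library is assumed.
import requests

for path in ("/wp-admin/admin-ajax.php", "/xmlrpc.php"):
    r = requests.get("https://www.example.com" + path)
    print(path, r.status_code)

# Note: WordPress commonly returns 400 for admin-ajax.php called without an
# "action" parameter, and 405 for a GET to xmlrpc.php (it accepts POST only),
# so these codes on their own are often harmless.
```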
Technical SEO | DaleZon
-
Bogus Crawl Errors in Webmaster Tools?
I am suddenly seeing a ton of crawl errors in Webmaster Tools. Almost all of them are URL links coming from scraper sites that I do not own. Do you see these in your Webmaster Tools account? Do you mark them as "fixed" if they are on a scraper site? There are waaaay too many of these to make redirects. Thanks!
Technical SEO | EGOL
-
Many Errors on E-commerce website mainly Duplicate Content - Advice needed please!
Hi Mozzers, I need some advice on how to tackle one of my client's websites. We have just started doing SEO for them, and after Moz crawled the e-commerce site it detected 36,329 errors, 37,496 warnings and 2,589 notices, all going up! Most of the errors are due to duplicate titles and page content, but I cannot identify where the duplicate pages come from. These are the links Moz detected for the duplicate pages (unfortunately I cannot share the website for confidentiality reasons):

• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00&products_per_2&products_per_2&products_per_2&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00=&products_per_00&products_per_2&products_per_2&products_per_2&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00=&products_per_00&products_per_2&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_2=&products_per_00&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00&products_per_00&products_per_00&products_per_00&page=2

With these URLs it is quite hard to identify which pages need to be canonicalized, and this is just one example out of thousands on this website. If anyone has any advice on how to fix this, and how to tackle 37,496 errors on a website like this, that would be great. Thank you for your time, Lyam
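To see which of those parameterized URLs collapse to the same page, here is a minimal sketch that normalizes them; it assumes the products_per_* parameters never change what the page shows, which the examples above suggest:

```python
# A minimal sketch that normalizes the parameterized URLs from the question,
# assuming the products_per_* parameters never change the page content.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

KEEP_PARAMS = {"dispatch", "category_id", "page"}  # assumed meaningful params

def canonical_url(url):
    parts = urlparse(url)
    # Keep only whitelisted query parameters, preserving their order.
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k in KEEP_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url(
    "http://www.thewebsite.com/index.php?dispatch=categories.view"
    "&category_id=233&products_per_00&products_per_2&page=2"
))
# -> http://www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&page=2
```

Every variant in the list above collapses to the same normalized URL, which is the address the rel=canonical tag on those pages would point to.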
Technical SEO | AlphaDigital
-
174 Duplicate Content Errors
How do I go about fixing these errors? They are all related to my tags. Thank you in advance for any help! Lisa
Technical SEO | lisarein
-
Seomoz pages error
Hi, I have a problem with SEOmoz: it is saying my website http://www.clearviewtraffic.com has page errors on 19,680 pages. Most of the errors are for duplicate page titles. The website itself doesn't even have 100 pages. Does anyone know how I can fix this? Thanks, Luke
Technical SEO | looktouchfeel
-
How to fix errors and warnings on a wordpress.com hosted site?
Hello Mozzers, I have 18 4xx errors, 812 duplicate page content errors and 412 duplicate page title errors, with about 605 "too many links" warnings and about 4,900 notices. My website is hosted on wordpress.com and I just do not understand how to fix these errors. To add to that, last week there were 150 fewer errors! How do I get these issues fixed? Please assist! Thanks, Vikash
Technical SEO | mysayindia
-
My sitemap in Google is coming back with an error
I submitted my XML sitemap to Google Webmaster Tools, and it is giving a "not found" 404 error. But I can't figure out why my sitemap is signaling a 404. Why? 😞
Technical SEO | cschwartzel
-
Massive Increase in 404 Errors in GWT
Last June, we transitioned our site to the Magento platform. When we did so, we naturally got an increase in 404 errors for URLs that were not redirected (for a variety of reasons: we hadn't carried the product for years, Google no longer got the same string when it did a "search" on the site, etc.). We knew these would be there and were completely fine with them.

We also got many 404s due to the way Magento had implemented their site map (putting in products that were not visible to customers, including all the different file paths to get to a product even though we use a flat structure, etc.). These were frustrating, but we did custom work on the site map and let Google resolve those many, many 404s on its own.

Sure enough, a few months went by and GWT started to clear out the 404s. All the poor, nonexistent links from the site map and missing links from the old site started disappearing from the crawl notices, and we slowly went from some 20k 404s to 4k 404s. Still a lot, but we were getting there.

Then, in the last 2 weeks, all of those links started showing up again in GWT and reporting as 404s. Now we have 38k 404s (way more than ever reported). I confirmed that these bad links are not showing up in our site map or anything, and I'm really not sure how Google found these again. I know, in general, these 404s don't hurt our site. But it just seems so odd. Is there any chance Google bots just randomly crawled a big ol' list of outdated links it hadn't tried for a while? And does anyone have any advice for clearing them out?
Technical SEO | Marketing.SCG