Error 406 with crawler test
-
Hi to all. I have a big problem with the Moz crawler on this website: www.edilflagiello.it.
In July, with the old version of the site, I had no problems and the crawler gave me a CSV report with all the URLs. But after we switched to the new Magento theme and restyled the old version, each time I use the crawler I receive a CSV file with this error:
"error 406"
Can you help me understand what the problem is? I have already disabled .htaccess and robots.txt, but nothing changed.
The website is working well, and I have also used Screaming Frog.
-
Thank you very much, Dirk. This Sunday I will try to fix all the errors, and then I will run the crawl again. Thanks for your assistance.
-
I noticed that you have a Vary: User-Agent header, so I tried visiting your site with JS disabled and the user agent switched to Rogerbot. Result: the site did not load (it spun endlessly), and the console showed quite a number of elements generating 404s. In the end, there was a timeout.
Try Screaming Frog: set the user agent to Custom and change the values to
Name: Rogerbot
Agent: Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)
It will be unable to crawl your site. Check your server configuration; there are issues in how it handles the Mozbot user agent.
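If you want to double-check outside Screaming Frog, a quick sketch along these lines (Python with the requests library; the browser user-agent string is just an arbitrary example for comparison) will show whether the server answers the Rogerbot string differently from a normal browser:

```python
# Rough sketch (not the actual Moz crawler): compare how the server answers
# an ordinary browser user agent versus the Rogerbot user-agent string quoted
# above. Requires the third-party "requests" package.
import requests

URL = "http://www.edilflagiello.it/"
ROGERBOT_UA = ("Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; "
               "http://www.seomoz.org/dp/rogerbot)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1; rv:40.0) Gecko/20100101 Firefox/40.0"  # arbitrary example

for label, user_agent in (("browser", BROWSER_UA), ("rogerbot", ROGERBOT_UA)):
    try:
        response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=30)
        # A healthy setup should return 200 for both; a 406 only on the
        # Rogerbot line points at user-agent based filtering on the server.
        print(label, response.status_code, response.headers.get("Vary"))
    except requests.RequestException as exc:
        print(label, "request failed:", exc)
```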
Check the attached images.
Dirk
-
Nothing. After I fixed all the 40x errors, the crawler report is still empty. Any other ideas?
-
Thanks, I'll wait another day.
-
I know the Crawl Test reports are cached for about 48 hours, so there is a chance that the CSV will look identical to the previous one for that reason.
With that in mind, I'd recommend waiting another day or two before requesting a new Crawl Test, or just waiting until your next weekly campaign update if that is sooner.
-
I have fixed all the errors, but the CSV is still empty and says:
http://www.edilflagiello.it,2015-10-21T13:52:42Z,406 : Received 406 (Not Acceptable) error response for page.,Error attempting to request page
Here is the test result: http://www.webpagetest.org/result/151020_QW_JMP/1/details/
Any ideas? Thanks for your help.
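For anyone debugging the same thing: a 406 is a content-negotiation rejection, so a small probe like the sketch below (Python with the requests library; the Accept values are only guesses at what a crawler might send) can show whether the server is rejecting a particular Accept header, the Rogerbot user agent, or both.

```python
# A 406 (Not Acceptable) is a content-negotiation rejection, so probe the
# homepage with the Rogerbot user agent and a few different Accept headers.
# The Accept values below are only guesses at what a crawler might send.
# Requires the third-party "requests" package.
import requests

URL = "http://www.edilflagiello.it/"
ROGERBOT_UA = ("Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; "
               "http://www.seomoz.org/dp/rogerbot)")
ACCEPT_VALUES = [
    None,                               # no Accept header at all
    "*/*",                              # accept anything
    "text/html",                        # plain HTML only
    "text/html,application/xhtml+xml",  # typical browser-style value
]

for accept in ACCEPT_VALUES:
    headers = {"User-Agent": ROGERBOT_UA}
    if accept is not None:
        headers["Accept"] = accept
    response = requests.get(URL, headers=headers, timeout=30)
    print(repr(accept), "->", response.status_code)
```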
-
Thanks a lot, guys! I'm going to check these errors before the next crawl.
-
Great answer, Dirk! Thanks for helping out!
Something else I noticed: the site comes back with quite a few errors when I run it through a third-party tool, the W3C Markup Validation Service, and the page is being validated as XHTML 1.0 Strict, which looks to be common in other cases of 406 errors I've seen.
-
If you check your page with external tools, you'll see that the general status of the page is 200; however, there are several elements which generate a 4xx error (your logo generates a 408 error, and the same goes for the shopping cart). For more details you can check this: http://www.webpagetest.org/result/151019_29_14E6/1/details/.
Remember that the Moz bot is quite sensitive to errors: while browsers, Googlebot, and Screaming Frog will accept errors on a page, the Moz bot stops in case of doubt.
You might want to check the 4xx errors and correct them; normally the Moz bot should be able to crawl your site once these errors are corrected. More info on 406 errors can be found here. If you have access to your log files, you can check in detail which elements are causing the problems when Mozbot is visiting your site.
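If you do go the log-file route, something along these lines (a rough Python sketch; the log path and the combined log format are assumptions, so adjust them to your server) would surface the status codes the Moz bot is actually receiving:

```python
# Rough sketch of the log-file check: pull every access-log line whose user
# agent mentions "rogerbot" and tally the status codes it received. The log
# path and the Apache/Nginx "combined" log format are assumptions; adjust
# them to match your server.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path

# Combined log format ends with: "REQUEST" STATUS SIZE "REFERER" "USER-AGENT"
LINE_RE = re.compile(
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

status_counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log_file:
    for line in log_file:
        match = LINE_RE.search(line)
        if match and "rogerbot" in match.group("agent").lower():
            status_counts[match.group("status")] += 1
            if match.group("status").startswith("4"):
                # These are the requests the Moz bot could not fetch cleanly.
                print(match.group("status"), match.group("request"))

print(status_counts)
```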
Dirk
Related Questions
-
Why isn't the Moz crawler getting all of my item pages?
I am stumped, and Moz is being terrible to work with. This site has about 40k pages; roughly 39,800 of them are item pages. Moz is only finding about 2,400 of my pages, so it is missing most, but not all, of my item pages. I do not know which item pages it is missing. The fact that it finds about 2k but not the rest leads me to believe the crawler is struggling with pagination. The site is built on Magento 2 and uses the Amasty Layered Navigation extension. Does anyone have any ideas?
Moz Bar | Tylerj0 -
Crawl tests stuck in queue
I have tried to run a number of crawl tests recently for our client's sites outside the US, and they have been stuck in the queue for over a week. Three of them completed, but five are stuck. Has anyone experienced this? I haven't seen anything about crawl tests having issues right now.
Moz Bar | rmcgrath810 -
Moz Crawler fails on the first page
Hi guys, can anybody shed some light? I'm running a crawl on a client's website, but it's failing on the homepage and not crawling any other pages. It appears to be throwing an 804: HTTPS (SSL) error and then terminating the crawl. Now, the page in question was serving up mixed content until about 4 days ago, but that has since been fixed. I read that we should wait at least 48 hours before initiating another crawl to avoid hitting a cached version, which I did, but it still appears to be having issues. Is there anything specific I can do to get around this issue? I'm on a trial account and this feature is one I'm keen to test, so there is a bit of a time constraint. Any help is greatly appreciated! Thanks in advance!
Moz Bar | philipdanielhayton0 -
500 errors showing up differently in Moz and Google WMT
Lately, I've been having the issue of a large increase in 500 errors. These errors seem to be intermittent; in other words, Google and Moz are showing that I have 500 server errors for many pages, but when I actually check the links, everything's fine. I've run tests to see if there is any virus on the server or if I have any corrupt files, and as far as I can tell, there are none. I'm left with the possibility that maybe one of my plugins is causing this issue (I'm built on top of WordPress). Moz is showing that I had nearly five hundred 500 server errors on the 12th or the 11th. On the other hand, Google shows that on the 13th I had 179 server errors and then an additional 200 on the 15th. I'm assuming Google is slow to find or report these things? I would like to know which is more reliable so that I can try to figure out which of these plugins may be causing the problem, if any; or, if I'm investigating this the wrong way, I'd love to have more suggestions. Thanks in advance! Sorry, the URL is http://www.heartspm.com if you'd like to take a look.
Moz Bar | GerryWeitz0 -
Signed up for Moz reports - have received a Moz error report - need someone capable of taking the report and performing cleanup edits within a Joomla site?
Looking for someone in the US. Please contact me at Mary@workingwebsolutions.com if you are available and interested in the task. Thanks, Mary
Moz Bar | PortlandWebDesign0 -
I'm getting a Crawl error 605 Page Banned by robots.txt, X-Robots-Tag HTTP Header, or Meta Robots Tag
The website is www.bigbluem.com and it is a WordPress site. I'm getting the following error: 605 Page Banned by robots.txt, X-Robots-Tag HTTP Header, or Meta Robots Tag. But what is weird is that the domain it lists below that is http://None/BigBlueM.com. Any advice?
Moz Bar | TumbleweedPDX1 -
Crawl Test
Hello, is the Crawl Test having some issues at the moment? It seems very slow. I submitted a website to the crawl test 3-4 days ago and it is still in progress. This usually takes only 24 hours max. Thanks.
Moz Bar | lueka0 -
Screaming Frog, Moz and other crawlers
Hi, ignorant question, but is it possible to use Screaming Frog, the Moz crawler, or any other reputable crawler on a site still in development, i.e. one that is yet to be indexed? If so, could someone provide some quick instructions on how this can be done? Thanks in advance for any support. Neil
Moz Bar | mccormackmorrison0