Error Code 612 with robots.txt 200
-
Hi! I am getting the message "Error Code 612: Error response for robots.txt", so the crawler does not check any page of the site. The status code for robots.txt is 200, and Googlebot does not seem to have any problem crawling the site, so I don't know what the matter is.
The site is http://www.musicopolix.com/
Thanks so much in advance for any help!
-
Perfect, sounds like "It is also possible you are blocking bots from accessing the page with the host or via .htaccess" was in the right direction!
Cheers,
Jake
-
Hi! Thanks for your answer! We found the problem with the help of the Moz support team. Roger was being blocked by our server (HTTP status code 403), but that is now solved and Roger can crawl our site without any problem.
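For anyone hitting the same symptom, the gap between "robots.txt returns 200 in my browser" and "the crawler reports an error" usually comes down to the server answering differently per User-Agent (here, a 403 for Rogerbot) and to how each crawler reacts to a non-200 robots.txt. A minimal sketch of the conservative behavior that produces a 612-style report (the function name and exact policy are illustrative, not Moz's actual code):

```python
# Hypothetical sketch: how a conservative crawler might gate a crawl
# on the robots.txt HTTP status. Names and policy are illustrative.
def can_start_crawl(status_code: int) -> bool:
    """Return True if crawling may proceed given the robots.txt status."""
    if status_code == 200:
        return True   # fetch succeeded: parse the file and obey its rules
    if 400 <= status_code < 500:
        # A lenient crawler may treat most 4xx as "no robots.txt exists"
        # and crawl anyway; a conservative one stops and reports an error.
        # We model the conservative behavior, matching the 612 report here.
        return False
    return False      # 5xx or network errors: assume we may not crawl

print(can_start_crawl(200))  # True
print(can_start_crawl(403))  # False: the blocked-Roger case in this thread
```

This also explains the reporter's confusion: a crawler that treats a 403 robots.txt as "no restrictions" will keep crawling, while one that treats it as an error will stop the whole crawl.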
-
I can confirm there is an issue here. For example, https://webmaster.yandex.com/robots.xml lets you test robots.txt on your site, and it is currently reporting that it cannot load the file.
Have you checked the server logs to look for any errors/timeouts when crawlers are trying to load the file?
It is also possible you are blocking bots from accessing the page at the host level or via .htaccess.
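For reference, a User-Agent block in .htaccess that would produce exactly this symptom often looks something like the following. This is an illustrative Apache mod_rewrite pattern, not taken from the site in question ("rogerbot" is Moz's crawler's User-Agent token):

```apache
# Illustrative .htaccess rule that returns 403 Forbidden to Moz's crawler.
# Removing the rule (or whitelisting the agent) clears the resulting
# robots.txt error report.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
RewriteRule .* - [F,L]
```

Grepping the site's .htaccess and server config for `HTTP_USER_AGENT`, `deny`, or bot names is a quick way to rule this out.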
Related Questions
-
Unsolved: Why is my website giving a 4xx error?
I was analyzing the website link, and my website is giving me a 4xx error. Google Search Console is not reporting any such error. Any suggestion will be helpful. The site is on WordPress.
-
Account Error
Hey, I have a free 30-day Moz trial account. Whenever I try to analyze a website with it, it shows an error. Please solve my problem.
-
Why doesn't Moz crawler follow robots.txt?
It is crawling the entire site, including pages we do not want it to crawl. Please advise.
-
803 Errors, how to deal with this?
Hello, during my last two Moz crawls, a couple of hundred 803 errors showed up. I thought this might stop, but no, they keep creeping up. I'm not sure what has caused this. My server provider is WP Engine, and they have already said that everything is fine at their end. Pretty much all of those errors are for photos on my blog; I'm a photographer. I have a web guy as well, but he is not sure what to do now or how to get this fixed. The website is a-fotografy.co.uk. Thank you, and I'd be grateful if someone could shed some light. I did research here already, but found nothing covering the photo side. Regards, Armands
-
"Sufficient Words in Content" error despite having more than 300 words
My client has just moved to a new website, and I receive the "Sufficient Words in Content" error on all website pages, although there are far more than 300 words on those pages. For example: https://www.assuta.co.il/category/assuta_sperm_bank/ https://www.assuta.co.il/category/international_bank_sperm_donor/ I also see warnings for "Exact Keyword Used in Document at Least Once", although the keywords are used on the pages. The question is: why can't the Moz crawler see the pages' content?
-
403 errors in Moz but not in Google Search Console
Hello, Moz is showing that one of the sites I manage has about ten 403 errors on main pages, including the home page. But when I go to Google Search Console, I'm not seeing any 403 errors. I don't know too much about this site (I handle the SEO for a few sites as a contractor for a digital marketing agency), but I can see that it's a WordPress site (I'm not sure if that's relevant). Can I assume this is a Moz-only issue? Thanks, Susannah Noel
-
804 error preventing website being crawled
Hi, for both subdomains https://us.sagepub.com and https://uk.sagepub.com, crawling is being prevented by an 804 error. I can't see any reason why this should be so, as all content is served over https. Thanks
-
How can I get a Moz crawl report of 404 errors on my site
I have a Moz subscription, and I see dead links on my website that point to external pages. Is there a Moz crawl report that will show me these 404 errors and which pages of my site those 404 links are on?