Error Code 612 with robots.txt 200
-
Hi! I am getting this message, Error Code 612: Error response for robots.txt, so the crawler does not check any pages of the site. The robots.txt returns a 200 status code, and Googlebot does not seem to have any problem crawling the site, so I don't know what the matter is.
The site is http://www.musicopolix.com/
Thanks so much in advance for any help!
-
Perfect! Sounds like "It is also possible you are blocking bots from accessing the page with the host or via htaccess" was in the right direction!
Cheers,
Jake
-
Hi! Thanks for your answer! We found the problem with the help of the Moz support team. Roger was being blocked by our server (HTTP status code 403), but that is now fixed and Roger can crawl our site without any problems.
-
I can confirm there is an issue here. For example, https://webmaster.yandex.com/robots.xml lets you test the robots.txt on your site, and it is currently reporting that it cannot load the file.
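Also worth noting: a 200 for robots.txt only means the file was served; the rules inside can still disallow one crawler by name while Googlebot sails through. Here is a quick local check with Python's standard-library robotparser. The robots.txt body below is hypothetical, not your actual file:

```python
# Hypothetical robots.txt rules that allow Googlebot but block rogerbot.
from urllib import robotparser

rules = """\
User-agent: rogerbot
Disallow: /

User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "http://www.example.com/"))  # True
print(rp.can_fetch("rogerbot", "http://www.example.com/"))   # False
```

If your own file parses clean for rogerbot like the Googlebot case above, the block is happening at the server level instead.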
Have you checked the server logs to look for any errors/timeouts when crawlers are trying to load the file?
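If you have shell access, something like this can surface denied requests in the raw logs. The log path and the sample lines below are made up for illustration; adjust for your server's actual log location and format:

```shell
# Build a tiny sample access log; on a real server you would point at
# something like /var/log/apache2/access.log instead (path varies by host).
cat > /tmp/sample_access.log <<'EOF'
66.249.66.1 - - [10/May/2024:10:00:00 +0000] "GET /robots.txt HTTP/1.1" 200 125 "-" "Googlebot/2.1"
216.244.66.9 - - [10/May/2024:10:05:00 +0000] "GET /robots.txt HTTP/1.1" 403 199 "-" "rogerbot/1.2"
EOF

# List the user agents that were denied (status 403) when requesting
# robots.txt; in this sample data it prints "rogerbot/1.2".
grep '/robots.txt' /tmp/sample_access.log | awk '$9 == 403 {print $NF}'
```

A 403 against a specific user agent in the logs, with browsers getting 200, is the classic signature of a bot filter.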
It is also possible you are blocking bots from accessing the page at the host level or via .htaccess.
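For the .htaccess angle, this is the kind of rule to look for. A hypothetical example that would produce exactly this symptom (the site loads fine in a browser, but Moz's crawler gets a 403):

```apache
# Hypothetical rules: any user agent containing "rogerbot" gets 403 Forbidden.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
  RewriteRule .* - [F,L]
</IfModule>
```

Security plugins and some hosts' default bot filters add rules like this automatically, so it is worth checking even if you never wrote one yourself.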
Related Questions
-
Unsolved: Help with my 4xx Errors
Site Crawler has found a range of 4xx errors on my website, but the URLs aren't ones I've created; instead they have the handle of my social channels attached to the end, and I've no idea how this has happened. Any tips or insights on how to fix this would be greatly appreciated! I've attached a screenshot below:
Link Explorer | Nathantimothy
[attachment: Screenshot 2024-04-29 at 14.06.32.png]
-
How to Fix Repeating 404 Error on Blog
I've been getting this same 404 error for a ton of pages on my blog (blog.twowayradiosfor.com) out of nowhere and I can't figure out how to fix it. About 500 pages are experiencing the same issue (as shown in the image I've attached/linked to). Each has the correct link, but the part that gets flagged as a 404 adds /TwoWayRadiosFor.com at the end, which is apparently the issue. Is there a reason these have just now appeared even though the blog posts are from years ago? Is there an easy way to fix them? Thanks, Sawyer
[attachment: dF6mUJQ]
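URLs like that usually come from a link written without a scheme (for example href="TwoWayRadiosFor.com/..."), which browsers and crawlers resolve relative to the current page. Finding and fixing the malformed link is the real cure; as a stopgap, a redirect along these lines (hypothetical, for an Apache .htaccess) could send the broken URLs back to the right pages:

```apache
# Hypothetical stopgap: strip a stray "TwoWayRadiosFor.com" segment appended
# to otherwise-valid URLs, 301-redirecting to the clean URL.
RedirectMatch 301 ^(.*)/TwoWayRadiosFor\.com/?$ $1
```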
Link Explorer | AllChargedUp
-
OSE error?
Hi, I just started using Moz Pro, but when I try to check OSE, I get this error: "There was an error getting your data." What's wrong?
Link Explorer | NielsPNO
-
Sufficient Words in Content error, despite having more than 300 words
My client has just moved to a new website, and I receive a "Sufficient Words in Content" error on all website pages, although there are far more than 300 words on those pages. For example:
https://www.assuta.co.il/category/assuta_sperm_bank/
https://www.assuta.co.il/category/international_bank_sperm_donor/
I also see warnings for "Exact Keyword Used in Document at Least Once", although the keywords are used on the pages. The question is: why can't the Moz crawler see the pages' content?
Link Explorer | michalos1221
-
403 errors in Moz but not in Google Search Console
Hello, Moz is showing that one of the sites I manage has about ten 403 errors on main pages, including the home page. But when I go to Google Search Console, I don't see any 403 errors. I don't know too much about this site (I handle the SEO for a few sites as a contractor for a digital marketing agency), but I can see that it's a WordPress site (I'm not sure if that's relevant). Can I assume this is a Moz-only issue? Thanks, Susannah Noel
Link Explorer | | SusannahK.Noel0 -
How do I fix 885 Duplicate Page Content Errors appearing in my Moz Report due to categories?
Hi there, I want to set up my Moz report to send directly to a client, but there are currently 885 duplicate page content errors displayed on the report. These are mostly caused by items listed in multiple categories, with each category creating a new page/URL. My questions are: 1. Does Google see these as duplicate page content, or does it understand that the categories are there for navigation purposes? 2. How do I clear these off my Moz report so that the client doesn't panic that there are major issues on the site? Thanks for your advice.
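For reference, the usual mitigation for category-page duplicates is a canonical tag on each variant pointing at one preferred URL. A generic sketch (the URL shown is a placeholder, not from the site in question):

```html
<!-- On every category page that lists the same item, point search engines
     at one preferred URL for that item (placeholder URL shown). -->
<link rel="canonical" href="https://www.example.com/products/item-name/" />
```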
Link Explorer | skehoe
-
Incorrect crawl errors
A crawl of my websites has indicated that there are some 5XX server errors on my website:
Error Code 608: Page not Decodable as Specified Content Encoding
Error Code 803: Incomplete HTTP Response Received
Error Code 803: Incomplete HTTP Response Received
Error Code 608: Page not Decodable as Specified Content Encoding
Error Code 902: Network Errors Prevented Crawler from Contacting Server
The five pages in question are all in fact perfectly working pages and are returning HTTP 200 codes. Is this a problem with the Moz crawler?
Link Explorer | LiamMcArthur
-
Remove Meta Description Errors
When Moz shows pages that are 404s, you are able to remove them after the issue is fixed. Is there a way to do this for missing meta descriptions? It shows that numerous pages are missing them, but when I go look, they've already been corrected, and it's still showing months later.
Link Explorer | seomozinator