Error Code 612 with robots.txt 200
-
Hi! I am getting the message "Error Code 612: Error response for robots.txt," so the crawler does not check any page of the site. The robots.txt file returns a 200 status code, and Googlebot does not seem to have any problem crawling the site, so I don't know what the issue is.
The site is http://www.musicopolix.com/
Thanks so much in advance for any help!
-
Perfect! Sounds like "It is also possible you are blocking bots from accessing the page at the host level or via .htaccess" was in the right direction!
Cheers,
Jake
-
Hi! Thanks for your answer! We found the problem with the help of the Moz support team: Roger was being blocked by our server (HTTP status code 403), but it is now fixed and Roger can crawl our site without any problem.
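For anyone else hitting this symptom, the diagnosis above (200 in a browser, 403 for the crawler) can be reproduced by requesting robots.txt with a crawler-style User-Agent and comparing it to a browser one. A minimal sketch, assuming a "rogerbot/1.2" user-agent string purely for illustration (Moz's crawler may identify itself differently):

```python
import urllib.request

def robots_request(site: str, user_agent: str) -> urllib.request.Request:
    """Build a request for a site's robots.txt with a given User-Agent.

    Servers sometimes answer 200 to browsers but 403 to crawler
    user-agents, which is exactly the symptom behind Moz error 612.
    """
    url = site.rstrip("/") + "/robots.txt"
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

# Compare how the server would answer a browser vs. a crawler.
# Uncomment the urlopen lines to actually send the requests:
for ua in ("Mozilla/5.0", "rogerbot/1.2"):
    req = robots_request("https://www.musicopolix.com", ua)
    print(req.full_url, "as", req.get_header("User-agent"))
    # with urllib.request.urlopen(req) as resp:
    #     print(ua, "->", resp.status)  # a 403 only for the crawler UA = blocked
```

If the two requests come back with different status codes, the block is at the server, firewall, or CDN level rather than in robots.txt itself.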
-
I can confirm there is an issue here. For example, https://webmaster.yandex.com/robots.xml lets you test robots.txt on your site, and it currently reports that it cannot load the file.
Have you checked the server logs for errors or timeouts when crawlers try to load the file?
It is also possible that you are blocking bots from accessing the page at the host level or via .htaccess.
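To illustrate the .htaccess possibility: a hypothetical rule like the one below would cause exactly this pattern, where robots.txt loads fine in a browser while any crawler matching the user-agent pattern gets a 403. This is a sketch of the kind of rule to look for, not something taken from the poster's site:

```apache
# Hypothetical example: deny any request whose User-Agent contains "rogerbot".
# A rule like this (or an equivalent firewall/CDN setting) returns 403 to the
# crawler while browsers still fetch robots.txt normally.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
RewriteRule .* - [F,L]
```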
Related Questions
-
Unsolved: Why is my website giving a 4xx error?
I was analyzing the website link, and my website is giving me a 4xx error. Google Search Console is not reporting any such error. Any suggestion will be helpful. The site is on WordPress.
Link Explorer | VS-Gary
-
How to Fix a Repeating 404 Error on a Blog
I've been getting this same 404 error for a ton of pages on my blog (blog.twowayradiosfor.com) out of nowhere, and I can't figure out how to fix it. About 500 pages are experiencing the same issue (as shown in the attached image). Each has the correct link, but the part flagged as 404 adds a /TwoWayRadiosFor.com at the end, which is apparently the problem. Is there a reason these have just now appeared even though the blog posts are from years ago? Is there an easy way to fix them? Thanks, Sawyer
Link Explorer | AllChargedUp
-
404 Errors - Please Help Us
When we checked our valuable top pages, we noticed two types of 404 pages listed for our domain. Example 1: www.test.com/www.test.com/ecommerce.html. Example 2: www.test.com/test/ecommerce.html. But we do not see any such 404 errors in Google Webmaster Tools; only the Moz Top Pages section shows these as errors. So please advise whether these are major errors or not. If they are, please help us fix this, as we do not have such URLs on our domain. Awaiting your urgent help.
Link Explorer | Intellect
-
612: Page banned by error response for robots.txt
Hi all,
I ran a crawl on my site https://www.drbillsukala.com.au and received the following error: "612: Page banned by error response for robots.txt." Before anyone mentions it, yes, I have been through all the other threads, but they did not help me resolve this issue. I am able to view my robots.txt file in a browser at https://www.drbillsukala.com.au/robots.txt.
The permissions are set to 644 on the robots.txt file, so it should be accessible.
My Google Search Console does not show any issues with my robots.txt file.
I am running my site through the StackPath CDN, but I'm not inclined to think that's the culprit.
One thing I did find odd: even though I entered my website with the https protocol (I double-checked), the Moz spreadsheet listed my site with the http protocol. I'd welcome any feedback you might have. Thanks in advance for your help.
Kind regards
Link Explorer | ME5OTU
-
How do I fix 885 Duplicate Page Content errors appearing in my Moz report due to categories?
Hi there, I want to set up my Moz report to send directly to a client, but there are currently 885 duplicate page content errors displayed on the report. These are mostly caused by items listed in multiple categories, where each category is a separate page/URL. My questions are: 1. Does Google see these as duplicate page content, or does it understand that the categories are there for navigation purposes? 2. How do I clear these off my Moz report so that the client doesn't panic that there are major issues on the site? Thanks for your advice.
Link Explorer | skehoe
-
Open Site Explorer: error message "There was an error getting your data"
I'm trying out Moz, wanting to make an informed decision (for myself and my blogging students) about whether this is a useful program to buy and use. I've had no luck so far with Open Site Explorer. Each time I try, I get the same error message: "There was an error getting your data". I used the URL http://writetodone.com. I asked about the problem in the chat, but my question was posted over an hour ago and is still marked 'unread'. I'm keen to continue studying the program, as I'm creating a video about Moz. How can I make Open Site Explorer work? I've tried the Analytics module, but it seems to take some days to get results. Mary
Link Explorer | MaryJaksch
-
Error message coming up for Open Site Explorer
When in Open Site Explorer, there seems to be an error getting the data for http://vagabondtoursofireland.ie/ or www.vagabondtoursofireland.ie. I have used this with other websites and have never had a problem. Thanks.
Link Explorer | Johnny_AppleSeed