Two days ago I found out through Search Console that my website's robots.txt had changed to:
User-agent: *
Disallow: /
When I check the robots.txt on the website itself, it looks fine; it only shows as blocked in Search Console (in the robots.txt Tester).
When I try Fetch as Google on the homepage, it also shows as blocked. Any ideas why robots.txt would block my website? It was fine until the weekend.
- Before that, over the last 3 months, I noticed blocked resources on the website and recovered pages using Fetch as Google.
Any ideas?
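
To help narrow this down, here is a minimal Python sketch I'm considering for checking whether the server returns different robots.txt content to Googlebot than to a regular browser (example.com is a placeholder for my domain; the user-agent strings are the commonly published ones):

    # Fetch robots.txt twice: once with a browser-like User-Agent and once with
    # Googlebot's, then compare, to see if crawlers are served different content.
    import urllib.request

    URL = "https://www.example.com/robots.txt"  # placeholder: replace with the real domain

    def fetch(user_agent: str) -> str:
        req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8", errors="replace")

    browser_view = fetch("Mozilla/5.0")
    googlebot_view = fetch(
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    )

    print("Same content for both user agents:", browser_view == googlebot_view)
    print("--- Googlebot sees ---")
    print(googlebot_view)

If the two responses differ, that would point to user-agent-based serving (e.g. a plugin, CDN rule, or hack) rather than the file itself being wrong.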