Back to the "We were unable to access your site due to a page timeout on your robots.txt" error.
Could it be that the sitemap.xml specified in the robots.txt is too slow to download?
Sitemap: https://www.kpmg.us/sitemap.xml
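One quick way to test that theory is to time the sitemap download directly. This is only a rough sketch using Python's standard library; the 30-second timeout is my own guess at what a crawler might tolerate, not a documented rogerbot limit.

```python
# Time how long the sitemap takes to download and how large it is.
# Assumption: a response approaching ~30 s is in timeout territory for a crawler.
import time
import urllib.request

SITEMAP_URL = "https://www.kpmg.us/sitemap.xml"

start = time.monotonic()
with urllib.request.urlopen(SITEMAP_URL, timeout=30) as resp:
    body = resp.read()
    status = resp.status
elapsed = time.monotonic() - start

print(f"HTTP {status}, {len(body)} bytes in {elapsed:.2f}s")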
OK. Got a different error: "Your site crawl timed out due to a slow server response." Passing this along to IT.
We fixed the issue with how the robots.txt file downloads (see: https://www.kpmg.us/robots.txt), but rogerbot still cannot crawl the site due to some "timeout" issue on the robots.txt.
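Since a one-off manual download looks fine, it might be worth checking whether the slowness is intermittent. Here's a rough sketch using Python's standard library; the 10-second timeout and 10 attempts are arbitrary values I picked, not anything rogerbot documents.

```python
# Fetch robots.txt several times in a row to see whether responses are
# only occasionally slow (one guess at why a single manual download looks
# fine while the crawler still times out).
import time
import urllib.request

URL = "https://www.kpmg.us/robots.txt"

for attempt in range(1, 11):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
            status = resp.status
        print(f"attempt {attempt}: HTTP {status} in {time.monotonic() - start:.2f}s")
    except Exception as exc:  # timeouts, connection resets, etc.
        print(f"attempt {attempt}: failed after {time.monotonic() - start:.2f}s ({exc})")
```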
Hmmm, it seems all of our robots.txt files download as text files, yet the others (e.g., advisory.kpmg.us/robots.txt) work with rogerbot. I've asked our IT folks to look at how we're serving .txt files.
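In the meantime, this is roughly how I'm comparing the two subdomains from the outside: status, Content-Type, and response time for each robots.txt. Just a sketch; the "rogerbot" User-Agent string is my approximation of Moz's crawler, not necessarily its exact value.

```python
# Compare how the working and non-working subdomains serve robots.txt.
import time
import urllib.request

URLS = [
    "https://www.kpmg.us/robots.txt",
    "https://advisory.kpmg.us/robots.txt",
]

for url in URLS:
    # Approximate crawler User-Agent (assumption, not Moz's exact string).
    req = urllib.request.Request(url, headers={"User-Agent": "rogerbot"})
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = resp.read()
        status = resp.status
        content_type = resp.headers.get("Content-Type")
    print(f"{url}: HTTP {status}, Content-Type={content_type}, "
          f"{len(body)} bytes in {time.monotonic() - start:.2f}s")
```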
Site: www.kpmg.us
Getting a robots.txt timeout failure since 02/29/20. We've checked our server logs and see no errors, and we've gone through all the steps of the "Troubleshooter".
Updated robots.txt to allow rogerbot full access:
User-agent: rogerbot
Disallow:
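For reference, here's a quick sanity check that these rules really do give rogerbot access, assuming Python's built-in urllib.robotparser reads the file roughly the way Moz's crawler would:

```python
# Parse the live robots.txt and ask whether rogerbot may fetch the homepage.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.kpmg.us/robots.txt")
rp.read()

# Should print True if the rules above give rogerbot full access.
print(rp.can_fetch("rogerbot", "https://www.kpmg.us/"))
```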
Any ideas on how to get rogerbot to crawl my site?