I created a custom map using the Google Maps creator and embedded it on our site. However, when I ran Fetch and Render in Search Console, it reported that the map was blocked by our robots.txt file. The Search Console Help section says: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot."
I did not set up our robots.txt file, and I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it): does Google automatically block its maps from its own Googlebot? Has anyone encountered this before?
Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
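For what it's worth, the rules above look like Google's own robots.txt (the one served for the map resource), not ours: it allows the Maps JavaScript API loader paths but ends with "Disallow: /", so any other maps URL (such as an embed URL) falls through to the blanket disallow. Here is a minimal sketch of that behavior using Python's standard-library robots.txt parser; the URLs are made-up examples, and note that Python applies rules in order of appearance, which happens to give the same result as Google's longest-match rule here:

```python
import urllib.robotparser

# A trimmed copy of the rules pasted above (example only)
robots_txt = """\
User-agent: *
Allow: /maps/api/js?
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# The Maps JavaScript API loader matches the Allow rule (hypothetical URL)
print(rp.can_fetch("Googlebot", "https://maps.google.com/maps/api/js?key=EXAMPLE"))

# Any other maps path falls through to "Disallow: /" (hypothetical URL)
print(rp.can_fetch("Googlebot", "https://www.google.com/maps/d/embed?mid=EXAMPLE"))
```

If that reading is right, the block is on Google's side and nothing in our own robots.txt needs to change.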
Any assistance would be greatly appreciated.
Thanks,
Ruben