Moz crawler is not able to crawl my website
-
Hello All,
I'm facing an issue with the Moz crawler. Every time it crawls my website, I get an error message saying: "**Moz was unable to crawl your site on Sep 13, 2017.** Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster."
We changed the robots.txt file and verified it, but the issue is still not resolved.
URL : https://www.khadination.shop/robots.txt
Do let me know what went wrong and what needs to be done.
Any suggestion is appreciated.
Thank you.
-
Hi Harini,
Jo from the Moz help team here.
I've had a look at your site, and it looks like something server-side is blocking our bot.
When I try to cURL your site from our internal tool, I get a 302 redirect to http://127.0.0.1
https://screencast.com/t/J3hhDTCM
I'm also seeing this message in a third-party tool:
"The robots.txt file does not exist on this domain (302 redirect to http://127.0.0.1)"
All of this points to something server-side that is initiating a 302 redirect for our bot. While your site looks fine in a browser, our bot simply can't get through.
I would recommend reaching out to your host or web developer to see if they can check how your server is treating rogerbot/1.2. You can also ask them to check the server logs to see how your server is responding to that user agent.
You'll also want to make sure you are not blocking AWS (Amazon Web Services).
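As a quick way for your host or developer to reproduce what our bot sees, they can fetch the robots.txt with a rogerbot-style User-Agent and inspect the response without following redirects. This is a sketch; the exact User-Agent string rogerbot sends may be longer than the short form used here.

```shell
# Fetch robots.txt as "rogerbot/1.2" and report the response code plus
# any redirect target. A healthy server shows "status: 200" and an empty
# redirect; the 302 to http://127.0.0.1 described above would surface here.
curl -sS -o /dev/null \
  -A "rogerbot/1.2" \
  -w "status: %{http_code}\nredirect: %{redirect_url}\n" \
  "https://www.khadination.shop/robots.txt"
```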
Best of luck!
Jo
-
Thank you, Andy. But the problem is that the Moz crawler was unable to crawl the website even though the line "Allow: /" was present in the robots.txt:
User-agent: *
Allow: /
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /?color=
Disallow: /?manufacturer=
Disallow: /?filter_material-fabric=
Disallow: /?filter_color=
Disallow: /?query_type_color=
Disallow: /?filter_size=
Disallow: /?taxonomy=
Disallow: /?view_mode=
Disallow: /?query_type_material-fabric=
Disallow: /?orderby=
Disallow: /?source_id=
Disallow: /?source_tax=
Disallow: /?shop-2__trashed?
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.khadination.shop/sitemap.xml
This is the previous version of the robots.txt that was being used.
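For what it's worth, the file as quoted does permit crawling of ordinary pages, which points back at a server-side block rather than the robots.txt itself. A minimal sketch with Python's standard-library parser, trimmed to the first few rules, confirms this:

```python
# Parse a trimmed copy of the robots.txt quoted above and check whether
# a generic bot may fetch the homepage. Python's parser applies rules in
# file order, so the leading "Allow: /" is matched first.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The homepage is crawlable under these rules.
print(parser.can_fetch("rogerbot", "https://www.khadination.shop/"))  # True
```

If this prints True while the live fetch still fails, the block is happening at the server level, not in the robots.txt rules.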
-
Hi - As Andy has said, you're not allowing Moz to crawl the site.
Read up on Rogerbot here: https://moz.com/help/guides/moz-procedures/what-is-rogerbot
-
Hi there,
You forgot the most important thing. You're disallowing a lot of things but not allowing access in the first place.
Allow: /
Add this on line 2 of your robots.txt file.
Good luck