Crawl rate
-
How often does Moz crawl my website?
(I have a number of issues I believe I have fixed, and wondered if there is a way to manually request a re-crawl?)
Thanks.
Austin.
-
Many thanks
-
Perfect, thanks.
-
Thank you, Auston. I was unsure of the number of pages; thanks for clearing that up.
-
Hi Austin
All campaign updates occur once a week, on the same day each week, based on when the campaign was first created. The Crawl Test Thomas suggested is helpful as well for instant crawls. Our Crawl Test in Research Tools is limited to 3,000 pages.
Cheers!
-
You asked "how often Moz crawls a site" I'm assuming you are asking about a campaign that would be once a week per a campaign & depending on the size/ number of pages of the site
See FAQ
https://moz.com/help/guides/moz-pro-overview/crawl-diagnostics
There are a couple of options. You could create another campaign, but my advice would be to simply use Moz's Crawl Test; it will let you crawl your site for issues immediately.
https://moz.com/researchtools/crawl-test
That should take care of it.
If for whatever reason it does not, some tools are free: if your site has fewer than 500 pages you can use https://www.screamingfrog.co.uk/seo-spider/ at no cost, although to get everything you want you will most likely have to spend money.
It sounds like Crawl Test will do what you want; if it does not, the site will still need to be crawled with another tool.
Third-party tools like Screaming Frog SEO Spider Pro and deepcrawl.com are instant and very helpful additions to the fantastic tools Moz already offers.
I hope this helps,
Tom
-
Related Questions
-
I’d like to set up a Moz campaign that crawls just the primary website, not subdomains
Is this something you could help with? (It either bombs or crawls everything, so I assume I'm missing something in the campaign settings, or it's just not possible.)
Getting Started | | blueprintatl0 -
Moz unable to crawl my Zenfolio website
Hey guys, I am attempting to optimize a website for my wife's business but Moz is unable to crawl it. Zenfolio is the web hosting service (she is a photographer). The error message is: "Moz was unable to crawl your site on Apr 1, 2019. Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster. Read our troubleshooting guide." I did read the troubleshooting guide but nothing worked. My robots.txt file disallows a few bots, but not rogerbot. Anyone have any idea what is going on? Or do I need to request server logs from Zenfolio? Thanks
Getting Started | | bpenn111 -
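For the robots.txt error above, here is a rough Python sketch (purely illustrative, with a placeholder domain you would swap for your own) of how to check whether the file is reachable and whether rogerbot, Moz's crawler, is allowed to fetch pages:

# Hypothetical check, not from the original thread: is robots.txt reachable,
# and does it allow rogerbot to crawl the homepage?
import urllib.request
import urllib.robotparser

SITE = "https://www.example.com"  # placeholder, replace with your site

# 1) Can we fetch robots.txt at all, and what status code comes back?
req = urllib.request.Request(SITE + "/robots.txt")
with urllib.request.urlopen(req, timeout=10) as resp:
    print("robots.txt status:", resp.status)

# 2) Does robots.txt allow rogerbot to crawl the homepage?
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()
print("rogerbot allowed:", rp.can_fetch("rogerbot", SITE + "/"))

If the first request fails or times out, the problem is at the server or hosting level rather than in the robots.txt rules themselves.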
Crawling issue
Hi, I have to set up a campaign for a webshop. This webshop is a subdomain itself. First question: the two subfolders I need to track are /nl_BE and /fr_BE. What is the best way to handle this? Shall I set up two different campaigns, one for each subfolder, or shall I just make one campaign and add tags to keywords? Second question: it seems like Moz can't crawl enough pages. There are no disallows in the robots.txt. Should I try putting the following at the top of my robots.txt?
User-agent: rogerbot
Disallow:
Or is it because I want to crawl only a subdomain that it doesn't work? Thanks
Getting Started | | Mat_C0 -
When I crawl my site on Moz it says it can't access the robots.txt file, but the crawl is fine on SEMrush - anyone know any reason for this?
Hi guys, When I try to run a site crawl on Moz it returns an error saying that it has failed due to an error with the robots.txt file. However, my site can be crawled by SEMrush with no mention of problems with the robots.txt file. My developer has looked into it and insists there is no problem with my robots.txt, and I've tried the Moz crawl at least 6 times over an 8-week period. Has anyone ever seen such a large discrepancy between Moz and SEMrush, or have any ideas why Moz has this issue with my site? TIA everyone
Getting Started | | Webreviewadmin0 -
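One thing worth ruling out for the question above (a guess on my part, not something confirmed in that thread): the server or a firewall may respond differently depending on the crawler's user agent, which would explain why SEMrush succeeds where Moz fails. A short Python sketch, again with a placeholder domain, that requests robots.txt with a few different User-Agent strings and compares the results:

# Illustrative only: does the server treat different crawler user agents differently?
import urllib.error
import urllib.request

SITE = "https://www.example.com"  # placeholder, replace with your site

for ua in ("rogerbot", "SemrushBot", "Mozilla/5.0"):
    req = urllib.request.Request(SITE + "/robots.txt", headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(ua, "->", resp.status)
    except urllib.error.HTTPError as e:
        print(ua, "-> HTTP error", e.code)
    except urllib.error.URLError as e:
        print(ua, "-> connection error:", e.reason)

If rogerbot gets a 403 or a timeout while the browser user agent gets a 200, the block is on the server side rather than in Moz.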
Scheduled update - re-crawl
Can I not perform a manual update? I set up a campaign without GA as I did not have access. I then got access and added the GA account to the campaign, but no data is showing. I think I require an update but have to wait 7 days? Is that right? Thanks
Getting Started | | SJMDT0 -
Moz could not crawl my HTTPS website
Hi, we have a website with HTTPS. Moz could not crawl it and we get "902: Network errors prevented crawler from contacting server for page", while in the logs we see the Moz robot access the site but fail after some seconds. What could be the problem? Moz can access the site fine when it is served without HTTPS.
Getting Started | | Hamedkhorasani10 -
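For the 902 question above, one possible cause (an assumption on my part, not confirmed in that thread) is a TLS configuration problem that only shows up when the crawler connects over HTTPS. A hypothetical Python check, with a placeholder hostname, that confirms whether the TLS handshake and certificate validation succeed:

# Hypothetical TLS check: if HTTP works but HTTPS returns 902-style network
# errors, it is worth confirming the handshake and certificate are OK.
import socket
import ssl

HOST = "www.example.com"  # placeholder, replace with your hostname

context = ssl.create_default_context()
with socket.create_connection((HOST, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("TLS version:", tls.version())
        print("Cipher:", tls.cipher())
        print("Certificate subject:", tls.getpeercert().get("subject"))

If this raises an ssl.SSLError (for example, because of an incomplete certificate chain), fixing the certificate setup is the place to start.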
Can Moz crawl our website twice in a week?
I want to generate the Moz crawl errors report twice a week. Is it possible to do that?
Getting Started | | chandman0 -
What are the solutions for Crawl Diagnostics?
Hi Mozers, I am pretty new to SEO and wanted to know what the solutions are for the various errors reported in Crawl Diagnostics. If this question has been asked before, please point me in the right direction. The following are queries specific to my site; I just need help with these 2:
1. Error 404 (about 60 errors): these are all PA 1 links and are no longer on the server. What do I do with these?
2. Duplicate Page Content and Title (about 5,000): most of these are automatic URLs that are generated when someone fills in any info on our website. What do I do with these URLs? They are, for example, www.abc.fr/signup.php?id=001 and then www.abc.fr/signup.php?id=002 and so on. What do I need to do and how? Any help would be highly appreciated. I have read a lot on the forums about duplicate content but don't know how to implement it in my case, please advise. Thanks in advance. CY
Getting Started | | Abhi81870
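For question 2 in the item above, the common approach is to point every parameterized variant at a single canonical version of the page (for example with a rel=canonical tag) rather than trying to fix each URL individually. A rough Python sketch, using made-up URLs modelled on the examples in the question, that groups crawled URLs by their path with the query string stripped so you can see which variants should share one canonical:

# Illustrative only: which parameterized URLs are really the same page?
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

crawled = [
    "https://www.abc.fr/signup.php?id=001",  # made-up examples based on the question
    "https://www.abc.fr/signup.php?id=002",
    "https://www.abc.fr/contact.php",
]

groups = defaultdict(list)
for url in crawled:
    parts = urlsplit(url)
    # Drop the query string and fragment to get the canonical candidate.
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    groups[canonical].append(url)

for canonical, urls in groups.items():
    print(canonical, "<-", len(urls), "crawled variant(s)")

Each group here would then get one canonical URL that all of its variants point to.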