Crawl rate
-
How often does Moz crawl my website?
(I have a number of issues I believe I have fixed, and wondered if there is a way to manually request a re-crawl?)
Thanks.
Austin.
-
Many thanks
-
Perfect, thanks.
-
Thank you, Auston. I was unsure of the number of pages; thanks for clearing that up.
-
Hi Austin
All campaign updates occur once a week on the same day, based on when the campaign was first created. The crawl test Thomas suggested is also helpful for instant crawls. Our Crawl Test in Research Tools is limited to 3,000 pages.
Cheers!
-
You asked how often Moz crawls a site. I'm assuming you are asking about a campaign; that would be once a week per campaign, depending on the size (number of pages) of the site.
See the FAQ:
https://moz.com/help/guides/moz-pro-overview/crawl-diagnostics
You could create another campaign, but my advice would be to simply use Moz's Crawl Test; it will let you crawl your site for issues immediately:
https://moz.com/researchtools/crawl-test
That should take care of it.
If for whatever reason that does not help, some tools are free: if your site has fewer than 500 pages you can use https://www.screamingfrog.co.uk/seo-spider/
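As a rough way to check whether you fall under that 500-page free limit, you could count the URLs listed in your sitemap. This is just a sketch using the Python standard library, not a feature of Moz or Screaming Frog, and the sitemap URL shown in the comment is a placeholder:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text):
    """Count the <url> entries in a standard sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{SITEMAP_NS}url"))

# Hypothetical usage against a live site:
# with urllib.request.urlopen("https://example.com/sitemap.xml") as resp:
#     print(count_sitemap_urls(resp.read()))
```

If the count comes back under 500, the free Screaming Frog tier should cover the whole site.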
Although to get everything you want, you will most likely have to spend money. It sounds like Crawl Test will do what you want; if it does not, the site still needs to be crawled by another tool. Third-party tools like Screaming Frog SEO Spider Pro and DeepCrawl are instant and are very helpful additions to the fantastic tools Moz already offers.
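All of these crawlers do the same basic thing: fetch pages, follow internal links on the same host, and flag issues such as missing titles. A minimal, hypothetical sketch of that idea in Python, run here against an in-memory site rather than the live web (the 3,000-page cap mirrors the Crawl Test limit mentioned above):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkAndTitleParser(HTMLParser):
    """Collects <a href> links and checks whether the page has a <title>."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.has_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)
        elif tag == "title":
            self.has_title = True

def crawl(pages, start_url, limit=3000):
    """Breadth-first crawl over a site given as a url -> HTML mapping.
    Stays on the start URL's host; returns (visited urls, urls missing a title)."""
    host = urlparse(start_url).netloc
    queue, seen, missing_title = [start_url], set(), []
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen or urlparse(url).netloc != host:
            continue  # skip duplicates and off-site links
        seen.add(url)
        html = pages.get(url)
        if html is None:
            continue
        parser = LinkAndTitleParser()
        parser.feed(html)
        if not parser.has_title:
            missing_title.append(url)
        # Resolve relative links against the current page.
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen, missing_title
```

A real crawler would fetch each URL over HTTP and respect robots.txt, but the queue-and-visit loop is the core of what these tools report on.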
I hope this helps,
Tom