Set crawl frequency
-
The current crawl frequency is weekly. Is it possible for us to set this frequency ourselves?
-
This post is five years old, and I haven't worked as a Moz employee in over two years. If you email help@moz.com, they should be able to help you.
-
Can you please provide the right URL? That URL is outdated and there is no redirect... isn't that bad SEO?
-
If you go to the URL in my first response, the instructions there will explain how it works. Have you tried visiting that URL?
-
Can you please tell me the steps for that?
Thank you.
-
You can run a crawl on demand for up to 3,000 URLs at a time. That should help you out between regular crawls.
-
I've now made some corrections based on the crawl report, but I have to wait a week for the next crawl to confirm the errors are gone.
Is there a way for me to verify the corrections or regenerate the error report?
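One way to spot-check the fixes without waiting for the next weekly crawl is to request the corrected URLs directly and look at the status codes they return now. A minimal sketch in Python using the requests library; the URL list is a placeholder for the pages you actually fixed:

```python
# Spot-check corrected URLs without waiting for the next scheduled crawl.
# Requires: pip install requests
import requests

# Placeholder URLs -- substitute the pages you fixed.
fixed_urls = [
    "http://www.example.com/page-that-returned-404",
    "http://www.example.com/page-with-a-broken-redirect",
]

for url in fixed_urls:
    # allow_redirects=False shows the raw status code rather than
    # the status of whatever page a redirect chain ends on.
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(response.status_code, url)
```

This confirms the server-side fix immediately; the error report itself will still only refresh on the next crawl.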
-
The crawl frequency is weekly and cannot be changed. What you can do is run a crawl test from http://pro.seomoz.org/tools/crawl-test if you need a crawl on demand.
Related Questions
-
Why did Moz crawl our development site?
In our Moz Pro account we have one campaign set up to track our main domain. This week Moz threw up around 400 new crawl errors, 99% of which were meta noindex issues. What happened was that somehow Moz found the development/staging site and decided to crawl it. I have no idea how it was able to do this: the robots.txt is set to disallow all and there is password protection on the site. It looks like Moz ignored the robots.txt, but I still don't have any idea how it was able to crawl at all - it should have received a 401 Unauthorized and not gone any further. How do I a) clean this up without manually ignoring each issue, and b) stop this from happening again? Thanks!
Moz Pro | MultiTimeMachine
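For context on what a properly locked-down staging host looks like from the outside: an anonymous request should come back 401 Unauthorized before any page content is served, and robots.txt should carry a blanket disallow as a second layer. A quick check along those lines, sketched in Python (the staging hostname is hypothetical):

```python
# Verify a staging site rejects anonymous requests and serves a
# disallow-all robots.txt. Requires: pip install requests
import requests

STAGING = "http://staging.example.com"  # hypothetical hostname

# With HTTP auth in place, an anonymous request should get 401,
# which stops a well-behaved crawler before it sees any page.
homepage = requests.get(STAGING, timeout=10)
print("homepage status:", homepage.status_code)  # expect 401

# robots.txt is only advisory, but the blanket disallow should still be there.
robots = requests.get(STAGING + "/robots.txt", timeout=10)
print("robots.txt disallows all:", "Disallow: /" in robots.text)
```

If both checks pass and a crawler still got through, the usual suspect is an unprotected alternate hostname or port serving the same staging content.
-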
Duplicate content in crawl despite canonical
Hi! I've had a bunch of duplicate content issues come up in a crawl, but a lot of them seem to have canonical tags implemented correctly. For example:
http://www.alwayshobbies.com/brands/aztec-imports/-catg=Fireplaces
http://www.alwayshobbies.com/brands/aztec-imports/-catg=Nursery
http://www.alwayshobbies.com/brands/aztec-imports/-catg=Turntables
http://www.alwayshobbies.com/brands/aztec-imports/-catg=Turntables?page=0
http://www.alwayshobbies.com/brands/aztec-imports/-catg=Turntables?page=1
Any ideas on what's happening here?
Moz Pro | neooptic
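When a crawler keeps flagging duplicates on pages that appear to be canonicalized, the first thing to rule out is whether the served HTML really contains the tag, and which target it points at, on every variant. A small sketch using requests and BeautifulSoup, with the paginated URLs from the question:

```python
# Print the canonical URL each reported page actually serves.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

urls = [
    "http://www.alwayshobbies.com/brands/aztec-imports/-catg=Turntables",
    "http://www.alwayshobbies.com/brands/aztec-imports/-catg=Turntables?page=0",
    "http://www.alwayshobbies.com/brands/aztec-imports/-catg=Turntables?page=1",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    print(url, "->", tag["href"] if tag else "NO CANONICAL TAG")
```

If all the variants report the same canonical target, the tags themselves are fine and the question becomes how the crawler treats them.
-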
How to make a Crawl Report readable?
Hi! I am trying to find out how to make my CSV report neat so I can interpret it. Right now I have a CSV report with numbers and text all in one column. I tried the Text to Columns button, but that doesn't work: when I apply it to column A it overwrites column B, which has the same problem! Thanks
Moz Pro | HetCommunicatielokaal
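When a CSV export lands entirely in column A, the spreadsheet usually failed to detect the delimiter. Rather than fighting Text to Columns, the file can be split programmatically. A minimal sketch with Python's built-in csv module; the filename is a placeholder, and it assumes the report is comma-delimited:

```python
# Read a crawl-report CSV and print each row as separate fields.
import csv

# Placeholder filename -- point this at the downloaded report.
with open("crawl_report.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)  # pass delimiter=";" here if the file uses semicolons
    header = next(reader)
    print(header)
    for row in reader:
        print(row[:3])  # first few fields of each row
```
-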
My Campaign only crawled 3 pages on my site
On the first crawl of a new campaign, the software only crawled 3 pages of XXXaceXXXscholarships.org. Any ideas?
Moz Pro | Santaur
-
How do we use SEOmoz to track Local Searches? Is there a way to set the location from which the campaign tracker is "searching"?
It seems that there is no way to set a parameter for location. In Places, I'm able to define my targeted region. How does SEOmoz mimic that localization? Thanks for any help! -- Chris
Moz Pro | ChrisPalle
-
A suggestion to help with linkscape crawling and data processing
Since you guys are understandably struggling with crawling and processing the sheer number of URLs and links, I came up with this idea: in a similar way to how SETI@Home works (is that still a thing? Google says yes: http://setiathome.ssl.berkeley.edu/), could SEOmoz use distributed computing amongst SEOmoz users to help with the data processing? Would people be happy to offer up their idle processor time and (optionally) internet connections to get more accurate, broader data? Are there enough users of the data to make distributed computing worthwhile? Perhaps those who crunched the most data each month could receive Moz points or a free month of Pro. I have submitted this as a suggestion here:
http://seomoz.zendesk.com/entries/20458998-crowd-source-linkscape-data-processing-and-crawling-in-a-similar-way-to-seti-home
Moz Pro | seanmccauley
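As a rough illustration of the SETI@Home-style idea, the coordinator would hand out small batches of URLs and volunteer machines would return the fetched results for aggregation. A toy sketch of the client-side work loop in Python; the coordinator endpoints are entirely hypothetical, and nothing like this exists in any SEOmoz API:

```python
# Toy volunteer-client loop for distributed crawling, SETI@Home style.
# The coordinator endpoints are hypothetical. Requires: pip install requests
import requests

COORDINATOR = "https://coordinator.example.com"  # hypothetical service

while True:
    # Ask the coordinator for a batch of URLs to process.
    batch = requests.get(COORDINATOR + "/work-unit", timeout=30).json()
    if not batch["urls"]:
        break  # no work left

    results = []
    for url in batch["urls"]:
        page = requests.get(url, timeout=10)
        results.append({"url": url,
                        "status": page.status_code,
                        "links": page.text.count("<a ")})  # crude link count

    # Return the processed batch for aggregation.
    requests.post(COORDINATOR + "/results",
                  json={"id": batch["id"], "results": results}, timeout=30)
```
-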
How to crawl the whole domain?
Hi, I have an e-commerce website with more than 4,600 products. I expected SEOmoz to check all URLs, and I don't know why this doesn't happen. The campaign name is Artigos para festa and it should scan the whole domain festaexpress.com, but it crawls only 100 pages. I even tried to create a new campaign named Festa Express - Root Domain to check if it would scan, but I had the same problem: it crawled only 199 pages. Hope to have a solution. Thanks,
Eduardo
Moz Pro | EduardoCoen
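One way to check whether all of the product pages are actually reachable by following links, rather than only listed in a sitemap, is a small breadth-first crawl of internal URLs. A rough sketch, capped so it doesn't hammer the site; it assumes internal links stay on the same hostname:

```python
# Count pages discoverable by following internal links, breadth-first.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "http://festaexpress.com/"
DOMAIN = urlparse(START).netloc
LIMIT = 500  # safety cap on fetched pages -- raise with care

seen, queue = {START}, [START]
fetched = 0
while queue and fetched < LIMIT:
    url = queue.pop(0)
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue  # skip pages that fail to load
    fetched += 1
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == DOMAIN and link not in seen:
            seen.add(link)
            queue.append(link)

print("Discovered", len(seen), "internal URLs after fetching", fetched, "pages")
```

If this discovers far fewer URLs than expected, the missing products may only be reachable through search forms or JavaScript, which would also explain a shallow tool crawl.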