Moz crawling
-
Hi Everyone!
I'm new to SEOMoz and wanted to find out if there is a way to decrease the waiting time for the campaign crawl. I have made a lot of changes based on the first crawl and would like to see how they are reflected in the reports, but can't until the next crawl is performed.
Any help would be greatly appreciated.
-
Thanks Keri - much appreciated.
-
The crawl test is separate from the campaigns (you can run it on any subdomain you wish). You can crawl two subdomains in any 24-hour period. Hope this helps!
Keri
-
Also - I forgot to ask - is the report generated by the Crawl Test going to be displayed with the relevant campaign, or is it separate from the campaigns?
-
Hi Keri,
Thanks for the prompt reply.
How do those credits work? I can see on that page that there were 2 crawl credits (I've just used one). Are these credits reset every day, so that I can perform 2 crawls a day?
-
You might try a custom crawl, which will crawl up to 3,000 pages and should take anywhere from a few hours up to 24 hours. It's available at http://pro.seomoz.org/tools/crawl-test.
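In the meantime, if you just want to sanity-check the changes you made before the next scheduled crawl picks them up, a quick DIY spot-check works fine. This is only an illustrative sketch (not a Moz feature), assuming Python with the requests and beautifulsoup4 packages installed; the example.com URLs are placeholders you'd swap for the pages you actually edited:

```python
# Spot-check edited pages while waiting for the next campaign crawl.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Placeholder URLs - replace with the pages you changed.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in URLS:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else "(no meta description)"
    print(url)
    print(f"  status: {response.status_code}")
    print(f"  title: {title}")
    print(f"  meta description: {description}")
```

It won't replace the full crawl report, but it confirms that titles, descriptions and response codes are what you expect before the next crawl runs.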
Related Questions
-
Why are my website's internal links on Moz showing 0?
My website: https://www.basictrailers.com.au/ I have been using Moz for almost 3 months, and from the start until now the internal links have always been 0. However, if I check the links in my Google Search Console, it is over 1,500. So what is the problem? If Moz is trying to make its algorithm similar to Google's, why can't it even find my internal links? That is ridiculous.
Moz Pro | Lilycharlie2
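One common reason a crawler reports 0 internal links while Search Console shows over a thousand is that the links only exist after JavaScript runs, so they never appear in the raw HTML a non-rendering crawler fetches. A minimal sketch to check this yourself (illustration only, assuming Python with requests and beautifulsoup4 installed):

```python
# Count internal links visible in the raw HTML (what a non-rendering crawler sees).
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

PAGE = "https://www.basictrailers.com.au/"  # the page in question
host = urlparse(PAGE).netloc

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

internal_links = set()
for anchor in soup.find_all("a", href=True):
    target = urljoin(PAGE, anchor["href"])
    if urlparse(target).netloc == host:
        internal_links.add(target.split("#")[0])  # ignore fragment-only variants

print(f"{len(internal_links)} unique internal links in the raw HTML of {PAGE}")
```

If that number is 0 even though the rendered page clearly has navigation and product links, the links are being injected client-side, which would explain the gap between Moz and Search Console.
-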
Is there an easy way to see what pages are crawled?
Hello! Like the question says... is there an easy way to see which pages have been crawled? I don't mean the ones that have issues, just the ones that have been crawled. Regards,
Moz Pro | MattDG0
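If you export the crawl results to CSV from the campaign, you can list every crawled URL yourself rather than only the ones with issues. A small sketch, assuming Python with pandas installed; the "URL" column name is a guess, so match it to the header row of your own export:

```python
# List every URL in an exported crawl CSV, not just the pages flagged with issues.
# Assumes: pip install pandas. The "URL" column name is an assumption -
# check the header row of your export and adjust if needed.
import pandas as pd

crawl = pd.read_csv("crawl_export.csv")
urls = sorted(crawl["URL"].dropna().unique())
for url in urls:
    print(url)
print(f"\n{len(urls)} crawled pages in the export")
```
-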
Initiate crawl
Any way to start the crawl of a site immediately after changes have been made? Or must you wait for the next scheduled crawl? Thanks.
Moz Pro | dave_whatsthebigidea.com0
-
Third crawl of my sites back to 250 pages
Hi all, I've been waiting for some days for the third crawl of my sites, but SEOmoz only crawled 277 pages. The following phrase appeared on my crawl report: Pages Crawled: 277 | Limit: 250. My last 2 crawls had a limit of about 10K. Any idea? Kind regards, Simon.
Moz Pro | Aureka0
-
Question About Moz Reports
Hey guys, is it possible to change the date/time that SEOmoz runs its ranking reports? I put together ranking reports each Monday which I need to feed back to the business, and it would be good if I could schedule these for late Sunday or early Monday morning, before I come into the office.
Moz Pro | EwanFisher0
-
Only one page has been crawled
I have been running a campaign for three weeks now; the first two crawls were OK, but the last one shows only one page crawled. The subdomain I am tracking is: www.cubaenmiami.com. I have everything set up correctly on my site. Regards, Alex
Moz Pro | esencia0
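When a crawl suddenly drops to a single page, the usual suspects are a robots.txt rule blocking the crawler or the home page redirecting to a subdomain the campaign isn't tracking (www vs non-www, http vs https). A quick illustrative check, assuming Python with requests installed (the http:// scheme is an assumption):

```python
# Check robots.txt and the home-page redirect chain for the tracked subdomain.
# Assumes: pip install requests. The http:// scheme is an assumption.
import requests

SITE = "http://www.cubaenmiami.com"

robots = requests.get(f"{SITE}/robots.txt", timeout=10)
print(f"robots.txt -> HTTP {robots.status_code}")
print(robots.text[:500])  # look for "Disallow: /" or user-agent-specific blocks

home = requests.get(SITE, timeout=10, allow_redirects=True)
for hop in home.history:
    print(f"redirect: {hop.url} ({hop.status_code}) -> {hop.headers.get('Location')}")
print(f"final URL: {home.url} (HTTP {home.status_code})")
```

If the final URL ends up on a different subdomain from the one the campaign tracks, that alone can explain a one-page crawl.
-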
A suggestion to help with Linkscape crawling and data processing
Since you guys are understandably struggling with crawling and processing the sheer number of URLs and links, I came up with this idea: in a similar way to how SETI@home works (is that still a thing? Google says yes: http://setiathome.ssl.berkeley.edu/), could SEOmoz use distributed computing amongst SEOmoz users to help with the data processing? Would people be happy to offer up their idle processor time and (optionally) internet connections to get more accurate, broader data? Are there enough users of the data to make distributed computing worthwhile? Perhaps those who crunched the most data each month could receive Moz points or a free month of Pro. I have submitted this as a suggestion here: http://seomoz.zendesk.com/entries/20458998-crowd-source-linkscape-data-processing-and-crawling-in-a-similar-way-to-seti-home
Moz Pro | seanmccauley
-
Is there any way to view crawl errors historically?
One of the websites we monitor has been getting a high number of duplicate page titles. As we work through the pages, we see changes and the number of duplicate page titles is decreasing. However, lately it went up again and the duplicate page titles have increased. I wanted to ask if there's any way to view the new errors and the old errors separately, or sorted in a way that can help me identify why we are getting new page crawl errors. Any advice would be great. Thanks!
Moz Pro | TheNorthernOffice790
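If you save the crawl issues CSV after each crawl, you can diff two exports yourself and see exactly which duplicate-title errors are new. A sketch assuming Python with pandas installed; the "URL" and "Issue Type" column names and the file names are placeholders to adjust to your own exports:

```python
# Compare two exported crawl-issue CSVs to separate new errors from old ones.
# Assumes: pip install pandas. Column names "URL" and "Issue Type" are guesses;
# adjust them to match the header row of your own exports.
import pandas as pd

old = pd.read_csv("crawl_issues_last_week.csv")
new = pd.read_csv("crawl_issues_this_week.csv")

def duplicate_title_urls(df: pd.DataFrame) -> set:
    """Return the set of URLs flagged with a duplicate page title issue."""
    mask = df["Issue Type"].str.contains("Duplicate Page Title", case=False, na=False)
    return set(df.loc[mask, "URL"])

old_urls = duplicate_title_urls(old)
new_urls = duplicate_title_urls(new)

print(f"{len(new_urls - old_urls)} newly flagged URLs:")
for url in sorted(new_urls - old_urls):
    print("  +", url)

print(f"{len(old_urls - new_urls)} URLs no longer flagged:")
for url in sorted(old_urls - new_urls):
    print("  -", url)
```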