Rate limit
-
Hi,
We'd like to ask for an increased rate limit of 1 request every 5 seconds for the Mozscape API.
Thanks
-
You can also submit and receive feedback on feature requests in the API Feature Requests forum -- and see what other great ideas community members like you have for the API, too!
-
This might not be the best place to request it, but you've accomplished what you set out to do. Try this page instead of the Q&A forums: http://www.seomoz.org/about/contact
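Until Moz grants a higher allowance, the usual way to stay under a per-interval limit is a client-side throttle. A minimal Python sketch, using the 5-second interval requested above (function and variable names are illustrative, not part of any Moz SDK):

```python
import time
from urllib.request import urlopen

MIN_INTERVAL = 5.0  # seconds between requests -- the limit requested above

_last_call = 0.0  # monotonic timestamp of the previous request


def throttled_get(url, fetch=None):
    """Block until at least MIN_INTERVAL has elapsed since the last
    request, then fetch `url`. `fetch` can be injected for testing."""
    global _last_call
    wait = MIN_INTERVAL - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)
    _last_call = time.monotonic()
    if fetch is None:
        fetch = lambda u: urlopen(u).read()
    return fetch(url)
```

Wrapping every Mozscape call in `throttled_get` keeps the client honest regardless of what limit the account actually has.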
Related Questions
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set them to include www or not for what ranks, refining my keywords, etc.

I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit. Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I'll get a complete picture of this air-focused sub-folder ... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like there's so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true -- I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites -- and maybe those competitors count towards the 50,000-page crawl limit and would get me up to it? How do the competitors you choose figure into the crawl limit?)

Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site -- or is there some other way to go here that I'm not thinking of?
Moz Pro | | scienceisrad0 -
What are the restrictions/limitations to running SEO/Adwords in these countries?
What are the limitations or restrictions to running SEO/Adwords campaigns in countries such as China, South Korea, Japan, Brazil, Portugal, Spain, and Mexico?
Moz Pro | | ThomasCenterInc0 -
Exit Rates and where people go
Hello Mozzers,

Where can I analyze, in GA, where people go after they exit a website? Are there any other free or paid tools that will help me see which site people go to when they bounce off a website? I specifically need help knowing where visitors go "outside" of my site without clicking any outbound links that are on the website.

Thank you for helping. Grateful to the community as usual,
Vijay
Moz Pro | | vijayvasu0 -
Are Moz ratings reliable / correlated with the SERPs?
So, three weeks into Penguin 2.0 and after an initial drop in rankings (4-5 places on top keywords from rank 1), I have seen a small climb of a couple of places -- but I have also seen a huge jump in my DA, PA, and Domain trust/rank that FAR surpasses any other magician's website. However, I am ranking below some really, really poorly rated sites. I am getting a PA of 51, which is really high for the industry (maybe the highest). If this is the case, why am I not number one for every keyword in my demographic?
Moz Pro | | TomLondon
I know this is asked a lot, but how reliable are Moz stats? I know no one can really know how Google ranks sites in the SERPs, but if Moz is the best, shouldn't it be kinda close? Any thoughts on how accurate Moz rank is, and whether it can be used as some indication of the SEO power behind a website?0 -
Slowing down SEOmoz Crawl Rate
Is there a way to slow down the SEOmoz crawl rate? My site is pretty huge, and I'm getting 10k pages crawled every week, which is great. However, I sometimes get multiple page requests in one second, which slows down my site a bit. If this feature exists I couldn't find it; if it doesn't, it would be a great one to have, similar to how Googlebot does it. Thanks.
Moz Pro | | corwin0 -
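One option worth trying, assuming Moz's crawler (rogerbot) honors the `Crawl-delay` directive -- Moz's help pages have indicated it does, within limits -- is to set a per-request delay in robots.txt (the 2-second value here is illustrative):

```
User-agent: rogerbot
Crawl-delay: 2
```

This asks the crawler to wait at least that many seconds between requests, which prevents the multiple-requests-per-second bursts described above without reducing the total pages crawled.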
Mozscape API batching URLs limit
Guys, there's an example of batching URLs using PHP: http://apiwiki.seomoz.org/php What is the maximum number of URLs I can add to that batch?
Moz Pro | | Srvwiz0 -
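The per-batch cap depends on your plan, so check the current API docs for the exact number; the batching mechanics themselves are independent of it. A hedged Python sketch of the same pattern as the PHP example -- POSTing a JSON array of URLs to the url-metrics endpoint with Mozscape's signed-auth parameters (the credentials and the `Cols` value are placeholders; verify endpoint details against the current docs):

```python
import base64
import hashlib
import hmac
import json
import time
from urllib.parse import urlencode
from urllib.request import Request, urlopen

ACCESS_ID = "member-xxxxxxxxxx"   # placeholder credentials
SECRET_KEY = "your-secret-key"


def signed_params(access_id, secret_key, ttl=300):
    """Build the Expires/Signature auth params the Mozscape API expects:
    Signature = base64(HMAC-SHA1(secret, access_id + "\n" + expires))."""
    expires = int(time.time()) + ttl
    to_sign = f"{access_id}\n{expires}"
    sig = base64.b64encode(
        hmac.new(secret_key.encode(), to_sign.encode(), hashlib.sha1).digest()
    ).decode()
    return {"AccessID": access_id, "Expires": expires, "Signature": sig}


def batch_url_metrics(urls, cols=103079215108):
    """POST a JSON array of URLs in one call. `cols` is a bit-flag mask
    selecting which metrics to return (value here is illustrative; see
    the Cols documentation)."""
    params = signed_params(ACCESS_ID, SECRET_KEY)
    params["Cols"] = cols
    endpoint = "http://lsapi.seomoz.com/linkscape/url-metrics/?" + urlencode(params)
    req = Request(endpoint, data=json.dumps(urls).encode(),
                  headers={"Content-Type": "application/json"})
    return json.loads(urlopen(req).read())
```

The response is a JSON array in the same order as the submitted URLs, so splitting a long list into cap-sized chunks and concatenating the results is straightforward.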
Number of available links limited?
OK, I've been making use of the free Linkscape API (on behalf of a client of mine), trying to get links (and info on those links) to a specific domain/page/etc.

NOTE: I've been using it without any issue in the past; however, we are currently facing some weird issues. Let's take this simple query as an example:

http://lsapi.seomoz.com/linkscape/links/wikipedia.org?SourceCols=4&TargetCols=4&Sort=page_authority&Scope=page_to_domain

What this one supposedly does is get links to "wikipedia.org", right? I'm reading:

"The Page_to_* scopes will by default return 25 links per source domain if no limit is specified, so you can see domain diversity. Due to space limitations in our API, a general link query for a given page will return at most 25 pages for every unique domain linking to that page."

And I'm saying OK, that's fine. The thing is that (instead of the 1000 links I had been getting before), I'm now getting just 25 links -- NOT per "source domain", but obviously per "target domain" (= wikipedia.org) -- or am I missing something? (Well, probably wikipedia suddenly has just about 25 links pointed to it... makes sense! 🙂 )

Please let me know what's going on with the above, simply because getting just 25 links is close to worthless... Thanks a lot, in advance!
Moz Pro | | drkameleon0 -
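If the links endpoint honors the `Limit` and `Offset` query parameters described in the Linkscape docs, the usual workaround for a capped first batch is to page through results. The paging logic below is API-agnostic; `fetch` stands in for the actual signed HTTP call, and all names are illustrative:

```python
def fetch_all_links(target, page_size=50, max_pages=20, fetch=None):
    """Page through a links query with Limit/Offset so results aren't
    capped at the first batch. `fetch(offset=..., limit=...)` should
    return one page of link records for `target`; it's injected here so
    the paging logic is testable without network access."""
    results = []
    for page in range(max_pages):
        batch = fetch(offset=page * page_size, limit=page_size)
        results.extend(batch)
        if len(batch) < page_size:
            break  # short page means we've reached the end
    return results
```

A real `fetch` would build the query string from the example URL above plus `Limit`/`Offset` and the signed-auth parameters; `max_pages` caps total API calls so a paging loop can't burn through a rate limit unattended.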
Crawl Rate for Lower Page Authority Websites
Hi,

At thumbtack.com we get tons of links from low (or no) page authority websites, and I'm wondering what the crawl rate of those links looks like. I know Google pulls in the web at an astonishing rate, but I'd imagine they aren't re-crawling lower-PA pages very frequently. Are they discovering these links a week after they're posted? A month? More?

I spent a while looking around for histograms of actual crawl rates and found surprisingly little. I'd love to see average crawl rate by Domain or Page Authority if that exists anywhere.
Moz Pro | | Thumbtack
Thanks!
-Michael

P.S. Here are some random examples of the types of pages with inbound links I'm talking about. Normally we wouldn't spend too much time thinking about these, but there's just so many of them we can't ignore it!

- http://www.majestic-cleaners.webs.com/
- http://domchieraphotography.blogspot.com/
- http://charlottepiano.musicteachershelper.com/
- http://pin-upgirlphotography.vpweb.com/default.html
- http://jfaithful.weebly.com/