How do batched URL metrics work in terms of rows and rate limit?
-
I am using the free API plan to get URL metrics and batching my calls like this:
https://github.com/seomoz/SEOmozAPISamples/blob/master/php/batching_urls_sample.php
How does this work in terms of rows and limits? If I do a batch of 10 URLs, does it count as 1 row or 10? Do I have to wait 10 seconds before calling the next batch?
-
Hi there - Kristina from Moz's Help Team here.
A batch of 10 URLs will count as 10 rows, and you would need to wait 10 seconds before making the next query. If you make your next query too quickly, you will be hit with a throttle that lasts 30 seconds. If you make another query during that period, you could push that 30-second throttle out even further.
I hope this helps! In the future, feel free to email our team anytime at help@moz.com and someone will be happy to help out!
Have a great day.
-Kristina
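Kristina's pacing rules can be sketched in code. Below is a minimal, hypothetical JavaScript sketch: `fetchBatch` is a placeholder for the actual signed url-metrics request (as in the linked PHP sample), and only the pacing rules come from the answer above.

```javascript
// Hypothetical sketch of pacing batched url-metrics calls on the free plan.
// fetchBatch is a placeholder for the real signed API request; only the
// pacing rules come from the answer above: 10 URLs per request = 10 rows,
// and 10 seconds between requests.
const BATCH_SIZE = 10;
const WAIT_MS = 10 * 1000;

// Split a list of URLs into batches of at most `size`.
function chunk(list, size) {
  const batches = [];
  for (let i = 0; i < list.length; i += size) {
    batches.push(list.slice(i, i + size));
  }
  return batches;
}

async function fetchAllMetrics(urls, fetchBatch) {
  const results = [];
  const batches = chunk(urls, BATCH_SIZE);
  for (let i = 0; i < batches.length; i++) {
    if (i > 0) {
      // Waiting up front avoids the 30-second throttle, which gets pushed
      // out even further if you keep querying while throttled.
      await new Promise(resolve => setTimeout(resolve, WAIT_MS));
    }
    results.push(...(await fetchBatch(batches[i])));
  }
  return results;
}
```

So 35 URLs would go out as four requests (10 + 10 + 10 + 5 rows) with roughly 30 seconds of waiting in between.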
Related Questions
-
Moz not working properly
I'm using the MozBar extension. It's not working properly on https://brokescholar.com. Can anyone tell me why it is/isn't showing da...
API | sathoiue8
What does "ffspl" mean in a url-metrics result set?
So I've got this server-side JScript object I've declared, which works nicely for us:

var mozObj = new MozInteraction()
    .setMethod("GET")
    .setHost("http://lsapi.seomoz.com")
    .setPath("linkscape/url-metrics")
    .setCols(parseInt("11111111111111111111111111111111111111111111111111111", 2))
    .setLines(10)
    .setAccessId(accessId)
    .setSecret(secret)
    .setExpires((new Date()).addDays(1).valueOf())
    .generateSignature();

etc., etc. The setCols call sets the Cols value to the maximum possible: 9007199254740991. The weird thing is that in the JSON response I get a whole pile of column names that I can't find descriptions for in the documentation at https://moz.com/help/links-api/making-calls/response-fields, e.g. ffspl1, ffspl2, ffspl3, ... fuspl0, fuspl1, fuspl2, fuspl3, ... pfspl0, pfspl1, pfspl2, pfspl3, ... puspl0, puspl1, puspl2, puspl3, ... etc.
API | StevePoul
MozBar not working and Help is asking for to login
I cannot use the MozBar; I receive a pop-up about API credentials, and when I go to Moz Help I am asked to log in, but my credentials do not work. moz-login-issue-2016.09.28.png
API | Stacious
Mozcheck.com not working with API, anyone else having this problem?
We have been using MozCheck.com with our API for 3 years; today it stopped working. Our account is in good standing and nothing has been changed.
API | troytlb
What happens if I go over my Mozscape api free limits?
Hello,
I just started using the free version of Mozscape, and I fully understand there are limits and charges under this category. However, to avoid any costly surprises, I'd like to know:
What happens when I get near my usage limit?
What happens when I just hit the limit?
What happens when I go past the limit?
Also, is there any alert system, such as an email, to let me know when I get near said limit?
API | FPK
Suggestion - How to improve OSE metrics for DA & PA
I am sure everyone at Moz is aware that although the Moz link metrics (primarily I am talking about DA and PA) are good, there is a lot of room for improvement, and there are a lot of areas where the metric values given to some types of site are well out of whack with what their "real" values should be. Some examples:

www.somuch.com (Link Directory) - DA 72
www.articlesbase.com (Article Directory) - DA 89
www.ezinearticles.com (Article Directory) - DA 91

I'm sure everyone would agree that links from these domains are not as powerful (if of any value at all) as their DA would suggest, and therefore, by definition of how Moz metrics work, the sites that have links from such sites are also inflated - thus they throw the whole link graph out of whack. I have 2 suggestions which could be used singularly or in conjunction (and obviously with other factors that Moz uses to calculate DA and PA) to help move these values to what they should more realistically be.

1/. Incorporate rank values.
This effectively uses rank values to reverse-engineer the "value" Google (or other engines) puts on a website. This could be achieved (if Moz were not to build the data-gathering system itself) by integrating with a company that already provides this data, e.g. Searchmetrics, SEMrush, etc. As an example, you would take a domain and pull in some rank values, e.g. http://www.semrush.com/info/somuch.com?db=us - where you could use traffic, traffic price, and traffic history as part of the overall Moz scoring algorithm. As you can see from my example, according to SEMrush the amount of traffic and the traffic price are extremely low for what you would expect of a website with a DA of 72. Likewise you will find this for the other two sites, and similarly for pretty much any other site you test. This is essentially because you're tapping into Google's own ranking factors, and thereby getting more in line with what the real values (according to Google) are with respect to the quality of a website. Therefore, if you were to incorporate these values, I believe you could improve the Moz metrics.

2/. Social Sharing Value
Another strong indicator of quality is the amount of social sharing of a document or a website as a whole, and again you will find, as with my examples, that pages on these sites have low social metrics in comparison to what you would normally associate with sites of these DA values. Obviously to do this you would need to pull social metrics for all the pages in your link DB. Or, if this were too tech-intense to achieve, again work with a partner such as Searchmetrics, which provides "Total Social Interactions" on a domain-level basis. Divide this value by the number of Moz-crawled pages and you would have a crude value of the overall average social shareability of a webpage on a given site.

Obviously both of the above have their flaws if looked at in complete isolation; however, in combination they could provide a robust metric, and combined with the current values used in the algorithm I believe Moz could make big strides in improving its metrics overall.
API | James77
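The two suggestions above could be combined into a blended score along these lines. This is purely illustrative: the weights, log scaling, and caps are my own assumptions, not Moz's algorithm or an exact formula from the post.

```javascript
// Rough sketch of the proposed blend: dampen raw DA with a rank/traffic
// signal and a per-page social signal. Weights, log scaling, and caps are
// illustrative assumptions only.
function adjustedDA(rawDA, trafficPrice, socialInteractions, crawledPages) {
  // The poster's "crude value": total social interactions per crawled page.
  const socialPerPage = socialInteractions / Math.max(crawledPages, 1);
  // Log-scale both signals so outliers don't dominate, capped at 1.
  const trafficSignal = Math.min(Math.log10(1 + trafficPrice) / 6, 1);
  const socialSignal = Math.min(Math.log10(1 + socialPerPage) / 3, 1);
  // Blend: half the raw link metric, a quarter for each corroborating signal.
  return rawDA * (0.5 + 0.25 * trafficSignal + 0.25 * socialSignal);
}

// A DA-72 directory with zero traffic value and zero social sharing drops
// to 36, while the same DA backed by strong signals keeps most of its score.
```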