Mozscape API: batching URLs limit
-
Guys, there's an example of batching URLs using PHP (sketched below).
What is the maximum number of URLs I can add to that batch?
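For anyone landing here, a minimal sketch of the kind of PHP batching being discussed, assuming the signed-authentication scheme the old API wiki described; $accessId, $secretKey, and the URL list are placeholders, not the official client:

```php
<?php
// Sketch of a batched Mozscape url-metrics call in PHP.
// Placeholder credentials -- substitute your own.
$accessId  = 'YOUR_ACCESS_ID';
$secretKey = 'YOUR_SECRET_KEY';

// Signature: base64-encoded HMAC-SHA1 of "AccessID\nExpires".
$expires   = time() + 300;
$signature = urlencode(base64_encode(
    hash_hmac('sha1', $accessId . "\n" . $expires, $secretKey, true)
));

$urls = array(
    'http://www.example.com/',
    'http://www.example.org/',
    // ... more URLs here
);

$endpoint = 'http://lsapi.seomoz.com/linkscape/url-metrics/'
          . '?AccessID=' . $accessId
          . '&Expires=' . $expires
          . '&Signature=' . $signature;

// Batch mode: POST the URLs as a JSON array in the request body.
$ch = curl_init($endpoint);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($urls));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

// One result object per submitted URL, in the same order.
$metrics = json_decode($response, true);
```

As I understand it, batch mode is just the same url-metrics endpoint with the URLs POSTed as a JSON array, instead of a single URL in the request path.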
-
Yes, it's weird. I currently have the Pro plan. I'm doing queries with 200 URLs at a time with no issues ;). The only limitation is time: after making a query I have to wait 10 seconds before performing another, or I get:
"This request exceeds the limit allowed by your current plan."
Thank you Zach, have a good day!
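In case it's useful, here's roughly how I pace the calls, as a sketch only: fetchBatch() is a hypothetical stand-in for whatever function performs the actual signed request (like the example above), and the batch size is just what my plan happens to tolerate.

```php
<?php
// Sketch: pace batch queries so the 10-second limit isn't tripped.
// fetchBatch() is hypothetical -- it would perform the signed POST
// from the example above and return the decoded JSON response.
function fetchBatch(array $urls) {
    // ... signed POST of json_encode($urls), decode and return ...
    return array();
}

$allUrls   = array(/* the full list of URLs to look up */);
$batchSize = 200; // what my plan tolerates today; adjust to yours
$results   = array();

foreach (array_chunk($allUrls, $batchSize) as $i => $batch) {
    if ($i > 0) {
        sleep(10); // one query every 10 seconds, per the limit above
    }
    $results = array_merge($results, fetchBatch($batch));
}
```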
-
As far as I know, the limit is 10. The article on the API wiki says the same. I do know, as a premium subscriber, that the number of batch requests per second is 200, however.
Quote from the API Wiki:
"You can submit up to 10 URLs for every batch request. Larger batch requests will return an HTTP 400 response."
I'd just be careful, because if you're not getting a 400 response, they may end up throttling you.
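To stay on the safe side, you could keep batches at the documented size and treat a 400 as a signal to stop and re-check. A rough sketch, reusing the $endpoint and $allUrls from the earlier examples (which are assumptions, not the official client):

```php
<?php
// Sketch: stick to the documented 10-URL batches and watch for HTTP 400.
// $endpoint is the signed url-metrics URL built in the first example;
// note you'd need to re-sign it if the Expires window runs out.
$results = array();

foreach (array_chunk($allUrls, 10) as $batch) {
    $ch = curl_init($endpoint);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($batch));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($status == 400) {
        // Batch rejected -- exactly what the wiki quote above warns about.
        break;
    }

    $results = array_merge($results, json_decode($response, true));
    sleep(10); // better to throttle yourself than be throttled by them
}
```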
Hope this helps
Zach
-
Thanks Zachary. I ran a test adding a lot of URLs. SEOmoz says the limit is 200 URLs or fewer at a time. So which should I use: the 10-URL limit or the 200?
Currently I'm able to get data for 200 URLs at a time, which is great for me!
-
According to their API wiki (http://apiwiki.seomoz.org/url-metrics), SEOmoz recommends batch requests of 10 URLs; it states that any batch request larger than this will return a 400 error from the server.
Hope that helps!
Zach
Related Questions
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.

I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set it to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what all the options are to get good data in light of the 50,000-page crawl limit.

Here is an example of what I mean: To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I am getting a complete picture of this air-focused sub-folder ... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true - I'd also be tracking other sites as competitors - e.g. non-profits that advocate on air quality, industry air quality sites - and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site vs. is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
-
Thoughts on scraping SERPs and APIs
I'm perplexed about a couple of things. Google says that scraping keyword rankings is against their policy, from what I've read. Bummer. We compile a lot of reports, and manual finding and entry was a pain. Enter Moz! We still manually check and compare, but it's nice having that tool. I'm confused now, though, about practices and getting SERPs in an automated way. Here are my questions:

1. Is it against policy to get SERPs from an automated method?
2. If that is the case, isn't Moz breaking this policy with its awesome keyword tracker?
3. If it's not, and we wanted to grab that kind of data, how would we do it? Right now, Moz's API doesn't offer this data. I thought Raven Tools at one point offered this, but they don't now, from what I've read. Are there any APIs out there from which we can grab this data and do what we want with it (let's say, build our own dashboard)?

Thanks for any clarification and input!
Moz Pro | Boogily
-
Magento: Moz finding URL and URL?p=1 as duplicate. Solution?
Good day Mozzers! Moz bot is finding URLs in the catalogue pages with the format www.example.com/something and www.example.com/something?p=1 as duplicates (since they are the same page). What's the best solution to implement here? Canonical? Any other? Cheers! MozAddict
Moz Pro | MozAddict
-
Where is the API ID and key? I tried generating it several times but I get nothing.
Moz Pro | breezego
-
Why does it keep displaying br tags and claiming 404 errors on like 4 of my URLs for all my WordPress sites?
Is anyone else having the same issue? These errors don't actually exist, and I think it has something to do with WordPress. How can I fix this?
Moz Pro | MillerPR
-
Can I calculate the "Keyword Difficulty" metric using Mozscape API data?
We already have a web application that pulls certain metrics about websites using the Mozscape API, but we want to extend its usefulness by enabling users of the app to pull "Keyword Difficulty" metrics in bulk, instead of one at a time (or 5 at a time). I wouldn't mind the 5-at-a-time limitation if we could just automate the API calls and let the tool pull data for 50 or so keywords without user interaction. I know that it's a "formula", but I don't know what SEOmoz uses for its formula. Has anyone figured out a way to calculate this based on the Mozscape API data? Has anyone ever tried to reverse-engineer this metric?
Moz Pro | brchap
-
Crawl report shows URLs with duplicate content, but it's not the case
Hi guys!
Some hours ago I received my crawl report. I noticed several records with URLs flagged for duplicate content, so I went to open those URLs one by one.
None of those URLs really had duplicate content, but I have a concern: the website is a product showcase, and many articles are just images with an href behind them. Many of those articles use the same images, so maybe that's why the SEOmoz crawler's duplicate-content flag is raised. I wonder if Google has a problem with that too. See for yourself how it looks: http://by.vg/NJ97y and http://by.vg/BQypE. Those two URLs are flagged as duplicates... please mind the language (Greek) and try to focus on the URLs and content. PS: my example is simplified just for the purpose of my question. The report column in question is "URLs with Duplicate Page Content (up to 5)".
Moz Pro | MakMour
-
Limit of 3 competitors per campaign
Hello! Can I just check why the number of competitors per campaign is limited to 3? I.e., we have up to 20 URLs we want to monitor for the same keywords as us. I understand I can add another campaign to track another 4 URLs, but this will not show those in relation to our main URL/campaign. We would prefer to have space for 12 competitors rather than 12 campaigns. If we upgrade to Elite, is the number increased? I might be missing something here, as we are new to SEOmoz, so I'd be grateful for any advice. SEOmoz is really great; we have learned so much already. Rich
Moz Pro | STL