Mozscape API Batching URLs Limit
-
Guys, here's an example of batching URLs using PHP:
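(A minimal sketch of what I mean, assuming the lsapi.seomoz.com url-metrics endpoint and the signed-authentication scheme from the API wiki; the access ID, secret key, and Cols bit-flag value here are placeholders.)

<?php
// Minimal sketch of a batched Mozscape url-metrics request.
// The access ID, secret key, and Cols value are placeholders.
$accessID  = 'member-xxxxxxxx';
$secretKey = 'your-secret-key';

// Signed authentication: an expiry timestamp plus an HMAC-SHA1 signature.
$expires   = time() + 300; // signature valid for 5 minutes
$signature = urlencode(base64_encode(
    hash_hmac('sha1', $accessID . "\n" . $expires, $secretKey, true)
));

// A batch is just a JSON array of URLs sent as the POST body.
$batch = json_encode(array('www.seomoz.org', 'www.google.com', 'www.bing.com'));

$url = 'http://lsapi.seomoz.com/linkscape/url-metrics/'
     . '?AccessID=' . urlencode($accessID)
     . '&Expires=' . $expires
     . '&Signature=' . $signature
     . '&Cols=36'; // placeholder bit flags for the metrics you want

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $batch);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

print_r(json_decode($response, true));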
What is the maximum number of URLs I can add to that batch?
-
Yes, it's weird. I currently have the Pro plan. I'm doing queries with 200 URLs at a time with no issues ;). The only limitation is time: I have to make a query and then wait 10 seconds before performing another, or I get this error:
"This request exceeds the limit allowed by your current plan."
Thank you Zach, have a good day!
-
As far as I know, the limit is 10; that article on the API wiki says the same. I do know, as a premium subscriber, that the number of batch requests per second is 200, however.
Quote from the API Wiki:
"You can submit up to 10 URLs for every batch request. Larger batch requests will return an HTTP 400 response."
I'd just be careful, because even if you're not getting a 400 response, they may end up throttling you.
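If you have more than 10 URLs to check, the usual approach is to split the list into batches of 10 and pause between requests. A rough sketch (the fetch_url_metrics() helper is hypothetical, standing in for the batched POST request; adjust the delay to whatever your plan allows):

<?php
// Split a long URL list into wiki-compliant batches of 10.
$allUrls = array(/* ...your full list of URLs... */);

foreach (array_chunk($allUrls, 10) as $batch) {
    $metrics = fetch_url_metrics($batch); // hypothetical wrapper around the batched POST
    // ...merge $metrics into your results...
    sleep(10); // free-tier pacing: one request every 10 seconds
}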
Hope this helps
Zach
-
Thanks Zachary. I ran a test adding a lot of URLs. SEOmoz says the limit is 200 URLs or fewer at a time. So which should I follow... the 10-URL limit or the 200?
Currently I'm able to get data for 200 URLs at the same time, which is great for me!
-
SEOmoz recommends batch requests of 10 URLs, according to their API wiki: http://apiwiki.seomoz.org/url-metrics. It states that any batch request larger than this will return a 400 error from the server.
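If you want to catch oversized batches programmatically, you can check the HTTP status on the response after curl_exec(). A small sketch:

$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
if ($status == 400) {
    // Batch too large (more than 10 URLs): split it up and retry.
} elseif ($status != 200) {
    // Some other problem, e.g. rate limiting: back off before retrying.
}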
Hope that helps!
Zach
Related Questions
-
Difference between URLs and referring URLs?
Sorry, a bit new to this side of SEO. We recently discovered we have over 200 critical crawler issues on our site (mainly 4xx). We exported the CSV and it shows both a URL and a referring URL. Both lead to a 'page not found', so I have two questions: What is the difference between a URL and a referring URL? And what is the best practice for fixing this issue? Is it one for our web developer? Appreciate the help.
Moz Pro | | ayrutd1 -
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder? I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO and the campaigns I slapped together a few months ago need to be set up better, such as all on the same day, making sure I've set it to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what all the options are for getting good data in light of the 50,000-page crawl limit. Here is an example of what I mean: To see how the EPA is doing in searches related to air quality, ideally I'd track all of the EPA's web presence. www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I am getting a complete picture of this air-focused sub-folder ... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?) Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site vs. is there some other way to go here that I'm not thinking of?
Moz Pro | | scienceisrad0 -
Help with URL parameters in the SEOmoz crawl diagnostics Error report
The crawl diagnostics error report is showing tons of duplicate page titles for my pages that have filtering parameters. These parameters are blocked inside Google and Bing webmaster tools. How do I block them within the SEOmoz crawl diagnostics report?
Moz Pro | | SunshineNYC0 -
API Request Rate Problem
Hi there. A Java app of mine worked perfectly until recently; now it gets "rate exceeds your current plan" errors from the API, although it makes only one request every 10 seconds. Any idea what's wrong? Cheers, Chris
Moz Pro | | Diderino0 -
Open Site Explorer details in the API
Is there a way to access Open Site Explorer result details via SEOmoz's API? I'm most interested in the social media results.
Moz Pro | | fuzzlepop0 -
Garbled URLs in Private Messages.
Every time I try to put a URL in a private message, it gets garbled with extra characters and then won't go to the right place. http://www.facebook.com/pages/Mariah-Carle-Photography becomes: http://www.facebook.com/pages/Mariah58973jhsdfui-Carle%8594743Photography OK, after that test I deliberately garbled a URL and it STILL works in the open forums....
Moz Pro | | Mcarle0 -
If you only had a limited budget for tools...
With a limited budget for SEO tools, what would you suggest having that would cover everything? Let's say a £200-a-month budget. We only look after ourselves, which is around 5 accounts in terms of PPC, so SEO follows similar suit. Is SEOmoz effective enough to cover all of our bases? Are there bits it is missing that are done better elsewhere? What are your thoughts, basically, and what are the benefits of each tool? Thanks in advance.
Moz Pro | | esendex1 -
We were unable to grade that page. We received a response code of 301. URL content not parseable
I am using the SEOmoz web app tool for SEO on my site and have run into this issue. Please see the attached file, as it has a screen scrape of the error. I am running an on-page scan from SEOmoz for the following URL: http://www.racquetsource.com/squash-racquets-s/95.htm When I run the scan I receive the following error: We were unable to grade that page. We received a response code of 301. URL content not parseable. This page had worked previously. I have tried to verify my 301 redirects and am unable to resolve this error. I can perform other on-page scans and they work fine. Is this a known problem with this tool? I have verified that I don't have it defined. Any help would be appreciated.
Moz Pro | | GeoffBatterham0