Can anyone recommend a reliable proxy service (paid or otherwise) for tunnelling scraping requests through?
I've been using free proxies, which are sometimes slow, time out, or just refuse connections.
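For context, this is roughly how requests get routed through whichever service ends up recommended. A minimal sketch using only the Python standard library; the host, port, and credentials below are placeholders for a hypothetical paid proxy, not a real service.

```python
import urllib.request

# Placeholder proxy endpoint -- substitute the real service's details.
PROXY_URL = "http://user:pass@proxy.example.com:8080"

# Route both http and https traffic through the same proxy.
proxy = urllib.request.ProxyHandler({
    "http": PROXY_URL,
    "https": PROXY_URL,
})
opener = urllib.request.build_opener(proxy)
# opener.open("http://example.com/") would now tunnel via the proxy.
```

Paid services usually also support rotating endpoints, which helps with the timeouts and refused connections mentioned above.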
Hi,
Sorry, I think my question was misunderstood.
I'm trying to automate collecting mozTrust and other metrics for a list of URLs for prospecting. mozRank and Domain Authority are fine; it's just mozTrust that's a bit tricky to pull in, short of scraping Open Site Explorer.
Cheers
I've got a list of URLs for which I need both URL- and domain-level mozTrust.
Short of systematically scraping opensiteexplorer.org with my PRO log-in details, is there any (free) way to do this?
- The SEOmoz API only seems to provide mozTrust through its paid tier (I'm not looking for that level of detail in the paid API yet).
Cheers
That's brilliant, Casey - that's exactly what I was looking for!
Basically, Cols is a bitflag.
So to dump all data for anchor-text, use Cols=2042.
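To make the bitflag idea concrete, here is a short sketch of how a Cols value decomposes into its power-of-two components; the decomposition of 2042 below follows from its binary representation.

```python
def decompose_cols(cols):
    """Break a Cols value into the individual power-of-two flags it sets."""
    return [1 << bit for bit in range(cols.bit_length()) if cols & (1 << bit)]

# Cols=3 selects the Cols=1 and Cols=2 fields together (3 = 1 | 2).
assert decompose_cols(3) == [1, 2]
assert decompose_cols(7) == [1, 2, 4]

# The "dump everything" value from the answer above:
assert decompose_cols(2042) == [2, 8, 16, 32, 64, 128, 256, 512, 1024]
```

So 2042 is simply the bitwise OR of nine individual column flags, which is why one number can request many columns at once.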
Thanks!
Hi,
I'm testing out the SEOmoz API - however, I'm struggling to understand the use of the Cols parameter within the "anchor-text" method.
I've looped through increasing values of Cols for a standard query, and there just seems to be no logical pattern.
**Could someone please enlighten me as to how this works?**
Example results for the same query with Cols=1 through Cols=10:
Cols=1:
Array
(
[0] => Array
(
[aturid] => 86128451138
)
[1] => Array
(
[aturid] => 86128451144
)
[2] => Array
(
[aturid] => 86128451131
)
)
Cols=2:
Array
(
[0] => Array
(
[atut] => seomoz
)
[1] => Array
(
[atut] => seomoz.org
)
[2] => Array
(
[atut] => seo
)
)
Cols=3:
Array
(
[0] => Array
(
[aturid] => 86128451138
[atut] => seomoz
)
[1] => Array
(
[aturid] => 86128451144
[atut] => seomoz.org
)
[2] => Array
(
[aturid] => 86128451131
[atut] => seo
)
)
Cols=4:
Array
(
[0] => Array
(
[atui] => 38845159274
)
[1] => Array
(
[atui] => 38845159274
)
[2] => Array
(
[atui] => 38845159274
)
)
Cols=5:
Array
(
[0] => Array
(
[atui] => 38845159274
[aturid] => 86128451138
)
[1] => Array
(
[atui] => 38845159274
[aturid] => 86128451144
)
[2] => Array
(
[atui] => 38845159274
[aturid] => 86128451131
)
)
Cols=6:
Array
(
[0] => Array
(
[atui] => 38845159274
[atut] => seomoz
)
[1] => Array
(
[atui] => 38845159274
[atut] => seomoz.org
)
[2] => Array
(
[atui] => 38845159274
[atut] => seo
)
)
Cols=7:
Array
(
[0] => Array
(
[atui] => 38845159274
[aturid] => 86128451138
[atut] => seomoz
)
[1] => Array
(
[atui] => 38845159274
[aturid] => 86128451144
[atut] => seomoz.org
)
[2] => Array
(
[atui] => 38845159274
[aturid] => 86128451131
[atut] => seo
)
)
Cols=8:
Array
(
[0] => Array
(
[atuiu] => 1
)
[1] => Array
(
[atuiu] => 1
)
[2] => Array
(
[atuiu] => 0
)
)
Cols=9:
Array
(
[0] => Array
(
[atuiu] => 1
[aturid] => 86128451138
)
[1] => Array
(
[atuiu] => 1
[aturid] => 86128451144
)
[2] => Array
(
[atuiu] => 0
[aturid] => 86128451131
)
)
Cols=10:
Array
(
[0] => Array
(
[atuiu] => 1
[atut] => seomoz
)
[1] => Array
(
[atuiu] => 1
[atut] => seomoz.org
)
[2] => Array
(
[atuiu] => 0
[atut] => seo
)
)
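For what it's worth, the dumps above are consistent with Cols being a bitmask: Cols=3 returns the Cols=1 and Cols=2 fields together (3 = 1 | 2), Cols=5 combines 1 | 4, and so on. A small sketch of that reading, where the flag-to-field mapping is inferred purely from the response keys above (the field meanings in the comments are guesses, not documented):

```python
# Flag -> response field, as observed in the dumps above. The meanings
# in the comments are inferred from the values, not from any docs.
COLS_FLAGS = {
    1: "aturid",  # per-row record id
    2: "atut",    # anchor text
    4: "atui",    # constant across rows; possibly a target id
    8: "atuiu",   # a 0/1 boolean of some kind
}

def fields_for_cols(cols):
    """Return the response fields a given Cols value appears to select."""
    return [field for flag, field in COLS_FLAGS.items() if cols & flag]

# Combining flags with bitwise OR selects multiple fields at once:
assert fields_for_cols(1 | 2) == ["aturid", "atut"]    # Cols=3
assert fields_for_cols(1 | 4) == ["aturid", "atui"]    # Cols=5
assert fields_for_cols(1 | 2 | 4) == ["aturid", "atut", "atui"]  # Cols=7
```

Under this reading there is a logical pattern after all: each power of two toggles one column, and every other Cols value is just a sum of those.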
Links API:
Similar confusion here for:
"TargetCols"
"SourceCols"
"LinkCols"
The description at http://apiwiki.seomoz.org/w/page/13991141/Links API is a bit vague.
It appears that the Links API spits out everything anyway, so that one's less of an issue.
So... could anyone explain how the anchor-text API's Cols parameter works?
Cheers!
Short of coding up your own web crawler - does anyone know of, or have experience with, good software for running through all the pages on a single domain?
(And potentially on linked domains 1 hop away...)
This could be either server or desktop based.
Useful capabilities would include:
Perhaps an opportunity for an additional SEOmoz tool here, since they do it already!
Cheers!
Note:
I've had a look at:
Any experience or preferences with these or others?
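In case it helps frame what the tools above need to do: the core of a single-domain crawl is just a breadth-first walk that stays on one host. A minimal standard-library sketch (a real tool would add polite delays, robots.txt handling, and retries); the `fetch` callable is a placeholder for whatever network layer is used.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return absolute URLs for every link on a page."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

def crawl_domain(start_url, fetch, max_pages=500):
    """Breadth-first crawl restricted to start_url's domain.

    `fetch` is a callable url -> html string, so the network layer
    (urllib, requests, etc.) can be swapped in or mocked out.
    """
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) <= max_pages:
        url = queue.pop(0)
        for link in extract_links(fetch(url), url):
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen
```

Extending to "linked domains 1 hop away" would mean collecting, rather than discarding, the off-domain links found during the walk.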