Why does a Linkscape API request hang while extracting data?
-
Hi,
I am using the Linkscape API to get follow and nofollow links.
I use a cron job to fetch data for each URL in sitemap.xml.
However, while the cron job is running, the extraction hangs on some pages, which I later have to delete manually to restart the run.
Does anyone have any idea why this is happening? How can I skip such pages?
-
Hi,
Thank you so much for your reply.
I tried implementing the suggested changes, but they didn't work.
I recently learned about updates to the SEOmoz API rate limits:
- SEOmoz Free API: 1 request every 10 seconds
- Free Extended and PRO Members: 1 request every 5 seconds
- SEOmoz Site Intelligence API: 1 request every 2 seconds
I think this will help me a lot. Thank you!
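Those per-plan limits map directly onto a simple throttle in the cron loop: record when the last request went out and sleep off the remainder of the interval before the next one. A minimal sketch (the script in this thread is PHP; this is a Python sketch of the same idea, and the function names are mine):

```python
import time

def seconds_to_wait(last_request_time, now, min_interval):
    """How long to sleep before the next call to respect the per-plan rate limit.

    last_request_time is None before the first request.
    """
    if last_request_time is None:
        return 0.0
    return max(0.0, min_interval - (now - last_request_time))

def throttle(last_request_time, min_interval=10.0):
    """Sleep just long enough to keep at most one request per min_interval
    seconds (10.0 for the free tier), then return the new request timestamp."""
    now = time.monotonic()
    time.sleep(seconds_to_wait(last_request_time, now, min_interval))
    return time.monotonic()
```

Calling `throttle` once per sitemap URL, passing back the returned timestamp each time, keeps the loop under the limit regardless of how long each extraction takes.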
-
Hey Ravi,
Sorry for the delayed response - I wanted to follow up with the engineers to see if they had any suggestions for you.
They agreed that the Limit parameter set to 1,000 might be too large to process. Have you tried lowering it to 300 or even 500? Do you see better success with a smaller limit?
Our system times out at about 60 seconds, so I'm not sure whether the hanging is on our end. If dropping the limit size doesn't help, you might want to consider ending the request after about a minute. Sometimes requests that run too long will time out but work fine on a retry, since some data will be cached from the previous request.
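That retry-after-timeout advice can be sketched as a small wrapper: enforce a client-side deadline on each request and retry once or twice on failure. This is an illustration, not Moz-provided code (the thread's script is PHP; here it is in Python, with `fetch` standing in for the actual HTTP call, which should carry its own ~60-second timeout):

```python
import time

def fetch_with_retry(fetch, retries=2, backoff_seconds=5, sleep=time.sleep):
    """Call fetch(); if it raises (e.g. a timeout), wait and retry up to
    `retries` more times. A retry often succeeds quickly because the API
    caches some data from the aborted request."""
    attempt = 0
    while True:
        try:
            return fetch()
        except Exception:
            attempt += 1
            if attempt > retries:
                raise  # give up and let the caller skip this page
            sleep(backoff_seconds)
```

In the cron loop, a page that still fails after the last retry can simply be logged and skipped instead of blocking the whole run.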
I hope this information is helpful, but let me know if you're still experiencing issues!
Thanks,
Carin
-
Some of the hanging pages are pages that have been moved or are not found. I am trying to skip such pages by checking the HTTP response code first, but that check takes a very long time: almost 10 minutes for a single URL from sitemap.xml.
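One way to make that pre-check cheap is a HEAD request with a short timeout, so moved or missing pages are detected in seconds without downloading any body. A Python sketch of the idea (function names are mine; the same pattern works with cURL in PHP via `CURLOPT_NOBODY` and `CURLOPT_TIMEOUT`):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def head_status(url, timeout=5):
    """Fetch only the response headers with a short timeout; return the HTTP
    status code, or None if the request errors out or times out."""
    try:
        return urlopen(Request(url, method="HEAD"), timeout=timeout).status
    except HTTPError as e:
        return e.code  # 4xx/5xx still tells us the status
    except (URLError, OSError):
        return None

def should_skip(status):
    # Skip redirects (moved pages), client/server errors, and timeouts.
    return status is None or status >= 300
```

Running `should_skip(head_status(url))` before each API call lets the cron job drop dead pages up front instead of hanging on them.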
The request URL looks like this (built as a PHP string):
http://lsapi.seomoz.com/linkscape/links/URL HERE?SourceCols=4&TargetCols=4&Scope=page_to_page&Sort=page_authority&Limit=1000&Filter=follow&AccessID=" . $accessID . "&Expires=" . $expires . "&Signature=" . $urlSafeSignature;
Thanks.
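For reference, a call like the one above can be assembled in one helper, with the Limit lowered to 300 as suggested earlier in the thread. A hedged sketch (Python rather than the thread's PHP; it assumes the Linkscape signing scheme of a URL-encoded base64 HMAC-SHA1 over "accessID\nexpires" — verify against the current API docs before relying on it):

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def signed_links_url(target_url, access_id, secret_key, limit=300, expires_in=300):
    """Build a signed Linkscape links request for follow links, sorted by
    page authority. Assumption: Signature = urlencode(base64(HMAC-SHA1 of
    "accessID\nexpires" keyed with the secret))."""
    expires = int(time.time()) + expires_in
    to_sign = "%s\n%d" % (access_id, expires)
    digest = hmac.new(secret_key.encode(), to_sign.encode(), hashlib.sha1).digest()
    signature = quote(base64.b64encode(digest).decode(), safe="")
    return ("http://lsapi.seomoz.com/linkscape/links/%s"
            "?SourceCols=4&TargetCols=4&Scope=page_to_page&Sort=page_authority"
            "&Limit=%d&Filter=follow&AccessID=%s&Expires=%d&Signature=%s"
            % (quote(target_url, safe=""), limit, access_id, expires, signature))
```

Note that the target URL itself is percent-encoded before being embedded in the path, which the raw concatenated string in the post does not show.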
-
Hey! This is an issue I haven't heard of before - would you be able to provide any more information, like an example query to the API and some of the pages you are seeing hang?
Thanks!
Carin