Mozscape API Updates (Non-updates!) - becoming a joke!
-
This is the third month in succession that the Mozscape index has been delayed. My clients and I are losing patience with this, as I'm sure many others must be.
Just what do you suppose we tell clients waiting for that data? We have incomplete and sometimes skewed metrics to report on, and delays that keep getting delayed further, with nothing but the usual 'we are working on it' and 'bear with us'.
It's become obvious you fudged the index update back in January (see the discussion here, with some kind of explanation finally from Rand: https://moz.com/community/q/is-everybody-seeing-da-pa-drops-after-last-moz-api-update), and it seems you have been fumbling around ever since trying to fix it, with data all over the place, shifting DA scores, and missing links in campaign data.
Your developers should be working around the clock to fix this, because this is a big part of what you're selling in your service, and as SEOs and marketers we rely on that data for client retention and satisfaction. Will you refund us all if we lose clients over this?! I don't think so!
With reports already sent out at the beginning of the month with incomplete data, I told clients the index would refresh April 10th, as stated on the API updates page, only to see it fudged again on the day of release, with the index rolled back to the previous one. So again I have to tell clients there will be more delays, with no certainty it will even be refreshed when you say it will. It's becoming a joke, really!
-
Hey Matt - I can get into some of the nitty-gritty details on this.
Basically - we've been having trouble of all kinds with Mozscape, and while our team has indeed been working around the clock, the reality is that it's an old, clunky, hard-to-understand system that needs to be replaced entirely. That work is also going on, but, as you might imagine, it has a separate team on it, which means the Mozscape team's bandwidth is split.
Mozscape has crawling trouble. We've had issues with our own crawler design, specifically with spam that fooled our crawlers (it's designed to fool Google, obviously, but it has caught us, too) and biased our index. We also had an issue where code that helped us recrawl important pages was commented out; other issues (along with a couple of long-time engineering departures) kept that invisible to us for a good few months, and even with it fixed, it will take an index or two to get back to normal.

We've had other issues with hardware and bandwidth restrictions, with team changes, with unintentionally excluding important sites (and important pages on sites) due to erroneous changes on our end, and with robots.txt interpretation mistakes. You name it. It's been pretty frustrating because it's never a single issue coming up again and again, but rather new issues each time.

The team currently on the Mozscape project is relatively new -- we had almost complete turnover on that team in the last year (a combination of voluntary and not), so there's a lot of ramp-up: trying to understand what things do, fixing old problems, etc. I'm sure as an engineer you're familiar with those types of challenges, especially when the documentation isn't pristine.
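As an aside on the robots.txt interpretation mistakes mentioned above: those rules are easy to get subtly wrong in a crawler. A minimal sketch using Python's standard-library parser shows the kind of behavior a crawler must honor; the "mozbot" user agent and the rules here are invented for illustration, not Moz's actual crawler.

```python
# Sketch of robots.txt handling with Python's stdlib parser.
# The user agent "mozbot" and these rules are hypothetical examples.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Paths under /private/ are off-limits; everything else is allowed.
print(rp.can_fetch("mozbot", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("mozbot", "http://example.com/public/page.html"))   # True
```

A crawler that misreads even one directive like this can silently exclude important pages, which matches the kind of failure described above.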
IMO - those are crappy excuses. We should be better. We will be better. I don't provide them to pardon our shitty quality the last few months, but rather because you said you wanted detail, and I do love transparency.
I think we're going to have a tough slog until the new index system comes out (likely this Fall). I'm keeping my fingers crossed that we can repair each new problem and that few others arise, but the past 6 months have made me wary of overpromising and under-delivering.
BTW - it is true that the ML model means there's a lot of DA flux, since the goal is to track Google's changes as accurately as possible; if we see a site whose inputs match the patterns of sites that don't rank as well, that DA will drop. Given that Google's rankings fluctuate all the time, that our crawls fluctuate a lot (more than they should, as noted above), and that the link graph changes constantly, a lot of flux in DA is to be expected. That said, the new model will have DA refreshed daily rather than monthly, and will also have history, as well as a way to dig in and see which inputs weigh heavily in DA and how they've changed. I think all of that will help make these shifts vastly more transparent, even if they remain large (which they should, so long as Google's own flux is high).
One thing I am working on with the team - a different kind of score, called something like "domain visibility" or "rankings visibility" that tracks how visible a site's pages are in a large set of Google rankings. I think that score might be more what clients are seeking in terms of their overall performance in Google, vs. their performance in the link graph and how their links might be counted/correlated with higher/lower rankings.
-
Hi Lisa, we've also experienced many of the same frustrations Greg mentions, but I mainly wanted to respond to your comments about not being able to compare Domain Authority over time. Given that this metric is positioned as measuring a website's overall ability to rank, it shouldn't be unexpected that people want to see how their score evolves. Even in your own software, the change in Domain Authority since the last update appears as one of the most prominent items on the Dashboard, and you also show a graph charting how it changes over time.

My question is: since you clearly understand that customers want to compare how Domain Authority evolves over time on a consistent scale, why not at least attempt to normalize it? I am also a machine learning engineer, so the explanation "this is based upon machine learning so it will fluctuate unpredictably" makes no sense to me. You could normalize your inputs based upon certain characteristics of the population, or you could use a representative basket of websites to normalize the outputs. From what I've seen, even just normalizing based on the size of your index hugely improves the consistency. It wouldn't need to be perfect to be a huge improvement over the current situation.
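The "representative basket" idea above can be sketched in a few lines: freeze the basket's score distribution at a reference index, then rescale each new index so the basket's mean and spread match that baseline. This is only an illustration of the commenter's suggestion, not Moz's method, and all sites and numbers are made up.

```python
# Sketch of basket-based normalization (hypothetical, not Moz's method).
import statistics

def normalize_to_baseline(raw_scores, basket_now, basket_baseline):
    """Rescale raw scores so the basket's mean/stdev match a frozen baseline."""
    scale = statistics.stdev(basket_baseline) / statistics.stdev(basket_now)
    base_mean = statistics.mean(basket_baseline)
    now_mean = statistics.mean(basket_now)
    return {site: base_mean + (score - now_mean) * scale
            for site, score in raw_scores.items()}

baseline = [40.0, 50.0, 60.0]   # basket scores captured at a reference index
current = [44.0, 55.0, 66.0]    # same basket, new index (scores inflated ~10%)

# A site whose raw score moved only because the whole index inflated
# maps back to a stable normalized value.
print(normalize_to_baseline({"example.com": 55.0}, current, baseline))
```

The point is that the correction only needs the basket's scores in each index, so it could be applied without changing the underlying model at all.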
-
Thanks for your reply, Lisa.
Please understand, my point is not to get a discount or refund, or to cancel my subscription. It's to air my grievances (in a place where they can be understood in context, more than in a tweet) and make you aware of how this looks for us as marketers in the middle.
I would far prefer you and the teams there to be able to fix the issues you're having and be on time with the updates, than migrate away from Moz - that's the last thing I want.
I'm aware the other tools and services are functioning fine, but of course with DA and link data missing there's a huge gap in our reporting to clients, and the frequent delays don't do anything for confidence.
Thank you for the detailed reply - which, in reality, could have been a good way to calm our nerves earlier (as a product support blog post or an email update to customers) rather than just relying on the Mozscape API updates page saying 'we are having some problems with launching the new index'. That kind of transparency is, after all, part of the TAGFEE culture that has made Moz a success.
I sincerely hope you can iron out the problems there, so we can all be confident in the data we are reporting on (and in when index refreshes and everything else are scheduled to happen).
Greg
-
Hi Greg,
My name is Lisa and I'm a member of the support team here at Moz.
I want to thank you for getting in touch here and on Twitter to tell us about your concerns and the impact that these failures have on you as a customer and on your clients.
You're right - the last few updates have sucked. They've not been up to the standard that we hope to provide as a company and they haven't been what we want for you or your clients.
And you're right again when you say that our apologies and "we're working on it" messages haven't been enough to give you confidence that we really are doing the best we can to be better.
I apologise for the issues that you've had with your reports and their incomplete data. I wish I could give you some assurances, but no decisions have been made yet and I don't want to make you any false promises.
Behind the scenes, the engineering teams have been doing a lot of work to build a better and more reliable crawler, shore up the infrastructure used to store and return the index data and proactively work to spot problems and prevent them from reaching our customers.
Some of this work has caused problems of its own - changing the infrastructure, for instance, made the index fail to upload correctly several times at the end of February - and some of it has not yet been completed.
You mentioned that we 'fudged the index update back in January' and to some extent that is true. We collected a lot of data from sites that were junk and had no value. Since then, we've been reviewing which sites are included in the index and working to strike a balance so that valuable sites are crawled but spam sites and subdomains are blocked. A lot of this work must be done manually.
Another problem we have, with Domain Authority in particular, is that we know there's a mismatch between how it works and how our customers apply it.
Although it is a score between 0 and 100, Domain Authority is a relative metric. It depends on the ranking model data we have (about which types of sites are likely to rank) and the data we've collected in the index, not just for one site but for all of them. A change to the link data we've collected for your site, or for other sites, or a change to the ranking model can dramatically affect the Domain Authority score of any site. This data should not be considered in a vacuum but used comparatively.
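One way to see why a relative metric behaves like this: imagine (purely as an illustration, not Moz's actual formula) that a score is a percentile rank over every site in the index. Then a site's score moves whenever the rest of the index changes, even if its own links don't.

```python
# Illustrative only - not Moz's actual DA formula.
from bisect import bisect_right

def percentile_score(strength, all_strengths):
    """Score a site as its percentile rank within the whole index."""
    ranked = sorted(all_strengths)
    return round(100 * bisect_right(ranked, strength) / len(ranked))

index_v1 = [10, 20, 30, 40]           # hypothetical link-strength values
index_v2 = index_v1 + [50, 60, 70]    # next crawl picks up stronger sites

print(percentile_score(30, index_v1))  # 75
print(percentile_score(30, index_v2))  # 43 - same site, same links, lower score
```

The site with strength 30 did nothing wrong between the two indexes; the population it is ranked against simply changed.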
Domain Authority scores have always been and will always be expected to fluctuate. The best way to use a site's Domain Authority is to compare it to competitor sites and measure the distance between them. Using it as a standard score out of 100 is likely to cause anger and frustration when it drops, even though drops and rises are both part of the nature of Domain Authority.
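The comparative use described above amounts to tracking the gap to a competitor across index updates instead of the absolute number. A tiny sketch, with invented scores:

```python
# Hypothetical scores across two index updates.
history = {
    "January":  {"mysite.com": 42, "competitor.com": 48},
    "February": {"mysite.com": 40, "competitor.com": 45},
}

for month in ["January", "February"]:
    scores = history[month]
    gap = scores["competitor.com"] - scores["mysite.com"]
    print(f"{month}: gap to competitor = {gap}")
# Both absolute scores fell with the index change, but the gap
# narrowed from 6 to 5 - a relative improvement.
```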
Everyone here is invested in making the index better and we all want to make you and your clients happy. We'd like to provide round-the-clock coverage to solve these problems, but that is not possible for us. We have a small engineering team based in Seattle and we use their time as efficiently as possible to allow them to do their best work.
We do feel that our tools have value beyond the link index and Domain Authority data - that's why we offer this for free using MozBar, Open Site Explorer, and our API - but I would understand if you feel that the tools are not meeting your needs while there are problems and delays with this data. I'd be happy to help you cancel your subscription and offer a refund on your last payment if this is the case - just reach out to me via help@moz.com.