Is everybody seeing DA/PA drops after the last Moz API update?
-
Hi all,
I was wondering what happened with the last Moz update. The Moz health page shows no current errors, and I've checked a lot of websites, probably 50 to 60, from our customers and from our competitors, and everybody seems to have a decrease in DA/PA. It's not just a couple going up and down as would normally happen; all of them seem to have dropped. Is anyone seeing the same in their campaigns?
Greetings,
Niels -
I have a client who has dropped from DA 16 to DA 1 and lost the few links he had. He's NOT happy!
Having just spent a few weeks impressing upon him the importance of building up his DA, it's been pretty hard trying to pacify him over this.
Great explanation from Rand, but c'mon Moz, when are you going to fix this?
And when are you going to enable tracking of DA over time? AuthorityLabs and others can do it for keyword rankings; I'm sure you can do it for DA and PA.
It appears their primary competitor has maintained their DA of 19. So can someone please explain this added twist?
-
Thanks! That explains a lot.
-
Thanks for the logical answer, makes a lot of sense.
-
If we found fewer links to your sites with this index update, that can certainly be a factor in why you may have seen your Domain Authority drop a bit. These fluctuations aren't out of the ordinary, for all the reasons Rand mentions above - the number of backlinks in our index, and the quality of those links, will impact your DA scores.
You may see your DA bounce back after our next index update. This isn't scheduled for next week - our indexes take approximately 4 weeks to process. It looks like the next update is scheduled for February 28th. You can check this page for details on the Mozscape Index update any time: https://moz.com/products/api/updates
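For scripted checks, the legacy Mozscape API also exposed a metadata endpoint reporting the last index update as a Unix timestamp. The URL and payload key below are from memory and should be treated as assumptions to verify against Moz's current API docs:

```python
import datetime
import json

# Legacy/assumed endpoint -- verify against current Moz API documentation.
LAST_UPDATE_URL = "https://lsapi.seomoz.com/linkscape/metadata/last_update.json"

def parse_last_update(payload):
    """Parse the endpoint's JSON payload (assumed key: 'last_update',
    a Unix timestamp) into a calendar date."""
    ts = json.loads(payload)["last_update"]
    return datetime.datetime.fromtimestamp(ts, tz=datetime.timezone.utc).date()

# Example with a canned payload, so no network call is needed:
print(parse_last_update('{"last_update": 1485302400}'))  # 2017-01-25
```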
Hope that helps!
-
Yeah, all my own and client sites are down 2-4 points. I'm seeing a decline in backlinks tracked, so I thought that may be a factor. It's either a one-time change going forward or perhaps a bug that will correct itself in the next update? No idea.
-
Hi Niels - yep, I saw a bit of this too. I believe there are two causes:
-
We crawled a larger swath of the web in this index, so we captured more sites and more links, and that may mean the scaling of PA/DA (which are logarithmic) stretches to accommodate the larger number of links found, especially to sites at the top of the scale. For example, if Facebook has a DA of 100 with 5 billion links, and then we find 5 billion more links to it, Facebook still has a DA of 100, but it's a much higher threshold. Thus, sites with fewer links (and lower-quality links) will fall in DA as the scale is now stretched.
-
We crawled some weird stuff in this index, by mistake (or rather, because spammers built some nasty, deep crawl holes that Google probably didn't fall for but we did). A ton of odd domains on strange ccTLDs were seen, and crawled, because they gamed PageRank with lots of sketchy links. We've now excluded these for indices going forward, and hopefully will see the impact abate.
All that said, over time, as our index grows, you can expect that raw DA/PA numbers could get harder to achieve, meaning a lot of sites will drop in PA/DA (and some will grow too, as we discover more links to them in the broader web). My best advice is always to not use PA/DA as absolutes, but rather relative scores. That's how they're designed and how they work best.
It's like back when Google had PageRank, and Moz.com grew from PR4 to PR7, then as Google got bigger and bigger, and the web got bigger, Moz.com fell to PR5, even though we had way more links and ranked for way more stuff. The raw PR scale had just become stretched, so our PageRank fell, even though we'd been improving.
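Rand's scale-stretching point can be sketched with toy numbers. This is purely illustrative Python, not Moz's actual formula; the function and link counts are invented for the example:

```python
import math

def illustrative_da(links, top_site_links):
    """Toy log-scale score: the most-linked site pins the top of the scale
    at 100, and every other site is scored relative to it. Invented for
    illustration -- NOT Moz's real DA calculation."""
    if links <= 0:
        return 1
    return max(1, min(100, round(100 * math.log10(links) / math.log10(top_site_links))))

# Before the index grows: the top site has 5 billion links.
print(illustrative_da(10_000, 5_000_000_000))    # a small site scores 41
# After: we find 5 billion more links to the top site; the same small
# site's score falls, even though nothing about it changed.
print(illustrative_da(10_000, 10_000_000_000))   # now 40
```

The small site lost nothing; the ceiling just moved, which is the "stretched scale" effect described above.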
-
-
I second that.
-
Are you guys seriously using templates to answer questions about a lower DA? I got the exact same answer in the chat. At least you could admit something went wrong with the last API update. Counting the people here, that's more than 13 SEO consultants, who pay a lot of money per month for accurate tracking, all seeing drops across a lot of websites.
Normally I wouldn't bother with a drop, and it will probably be fixed in the next update, but some customers even use DA as a KPI, and we have to explain to them that we worked hard but they still dropped by 10 points. So answering with something like:
"The reasoning behind this can be hard to pinpoint without the help of an SEO consultant or the specific web designer for your website."
isn't going to cut it for me. What do you think we are? At least give us something we can communicate to our customers. What I see in the API is a loss of:
- January 2017: 1,100,261,648,691 (1.1 trillion) links.
- December 2016: 1,114,899,547,461 (1.1 trillion) links.
So I could read that as a loss of roughly 14 billion links since the last update? And because those links were lost across all the high-DA websites, everything dropped. That would be the answer I expected.
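The arithmetic on those two index sizes (figures copied from the numbers above) does come out to roughly 14.6 billion links:

```python
# Link totals copied from the two index updates quoted above.
jan_2017 = 1_100_261_648_691
dec_2016 = 1_114_899_547_461

delta = dec_2016 - jan_2017
print(f"{delta:,} fewer links (~{delta / 1e9:.1f} billion)")
```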
No hard feelings, but just be transparent with us, so we can communicate it to our customers.
-
You're not wrong! Seeing a drop in links is likely to mean that our crawlers just didn't pick up those URLs this time around and include them in our index, but in all likelihood those links that we previously reported are still out there on the web. Our tools just haven't re-visited them to add them back into the index.
-
I also noticed that the number of External Links plummeted. Ours dropped from ~80 to ~45. Our biggest competitor lost about 150 of their 400.
Correct me if I'm wrong, but isn't this more likely to represent the Moz crawler not crawling certain URLs, rather than an actual loss of those backlinks?
-
Hi there! Tawny from Moz's Help Team here.
Aw, man! Sorry to hear about your DA dropping. You're right that it appears to have happened for everyone, across the board. I think I can help explain. The reasoning behind this can be hard to pinpoint without the help of an SEO consultant or the specific web designer for your website. Domain and Page Authority scores are both calculated using Moz's Ranking Models. In essence, we take a lot of rankings data from the search engines (by running queries) and then try to build a predictive scoring system using our own on-page analyses and Mozscape link data to construct an algorithm that will effectively reproduce the search engines' results. Our current accuracy hovers in the 70% range, but over time, we expect to improve.
Once we have a ranking model (which we internally call "uber"), we can create scores that best approximate the combinations of all our page-specific link metrics or domain-specific link metrics (removing the keyword-specific features like anchor text, on-page keyword usage, etc). These scores represent the model's query-independent or non-keyword-based ranking inputs.
In simple terms, Domain Authority is our best prediction about how content would perform in search engine rankings on one site vs. another. Page Authority answers the same question for an individual page. Both are amalgamations of all the link metrics (number of links, linking root domains, mozRank, mozTrust, etc.) we have into a single, predictive score.
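As a rough illustration of what "amalgamating link metrics into a single score" means, here is a toy hand-weighted sum. Moz's real model is machine-learned, and every weight and metric value below is invented:

```python
# Toy illustration only: Moz's actual "uber" model is a trained
# machine-learning ranking model, not a hand-weighted sum, and these
# weights and metric values are made up.
def toy_authority(link_metrics, weights):
    """Collapse several link metrics into a single 1-100 score."""
    raw = sum(weights[name] * value for name, value in link_metrics.items())
    return max(1, min(100, round(raw)))

weights = {"links": 0.001, "linking_root_domains": 0.05, "mozrank": 5.0}
metrics = {"links": 4000, "linking_root_domains": 300, "mozrank": 4.2}
print(toy_authority(metrics, weights))  # prints 40
```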
It's important to note that both Domain Authority and Page Authority are on a 100-point, logarithmic scale. Thus, it's much more difficult to grow your score from 70 to 80 than it would be to grow from 20 to 30.
It is also important to note that if the higher-authority sites your site gets links from drop in score, that drop will ripple down to all of the sites they link to. A good place to start is to compare the DA of your linking root domains against the previous index. Also take a look at your competitors' scores to see how much they have dropped as well. As our indexes grow in size, more links are included in our calculations. I think this is a big part of why we saw drops in Domain and Page Authority this time around - more links included in our calculations will skew the scores a bit lower.
We recommend keeping track of how many linking root domains you have from index to index, as this will be a quick way to confirm possible reasons for an increase/decrease in the score. For an in-depth discussion from Rand, check out this article: https://moz.com/community/q/da-pa-fluctuations-how-to-interpret-apply-understand-these-ml-based-scores I've found this article to be most helpful when I'm trying to determine why Domain or Page Authority scores may have fluctuated so much, especially when it's across the board like this.
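That index-to-index tracking can be as simple as recording your linking-root-domain count after each release and diffing consecutive entries; a minimal sketch with hypothetical counts:

```python
def root_domain_deltas(history):
    """history: chronological list of (index_date, linking_root_domains).
    Returns the change at each index update after the first."""
    return [(later_date, later - earlier)
            for (_, earlier), (later_date, later) in zip(history, history[1:])]

# Hypothetical counts recorded after each Mozscape release:
history = [("2016-12", 420), ("2017-01", 395)]
print(root_domain_deltas(history))  # prints [('2017-01', -25)]
```

A negative delta alongside a DA drop points at fewer links being captured in the index, per the discussion above.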
Here are some places to really delve into what is going on:
http://moz.com/blog/whiteboard-friday-domain-trust-authority
http://moz.com/blog/googles-algorithm-pretty-charts-math-stuff
http://moz.com/blog/whiteboard-friday-domain-authority-page-authority-metrics
http://apiwiki.seomoz.org/w/page/20902104/Domain%20Authority
I hope this helps!
-
I'm also curious about any news/updates regarding this topic. So is there any news?
We suddenly dropped in DA from 40 to 34 (!), while our competitors lost a little less, 3 to 5 DA points each. We also lost about 185 external backlinks, and so did our competitors.
Rankings are the same.
-
Any news around this topic yet?
We also dropped in DA, from 29 to 27, and another of our sites even went from 21 to 17 (both HTTPS). All of our competitors also dropped in DA (one from 30 to 22).
No drops in rankings fortunately.
-
Dropped 3 points, so did all of our competitors. The number of links seemed to have taken a dive for all sites I follow.
Is anyone from Moz looking into this?
-
Yep. All HTTP. Drops of around 3 - 4 points across the board. It's almost reassuring that we're not alone.
-
Haha
-
bounces theory of chest on to his knee, then volleys into the back of the non-existent net, and celebrates with Gillon the fact they have ruled something out
-
Yeah, I can't see a pattern either. Most domains I track are still http including ours but the https ones I track have seen drops as well (4 out of 5, 1 has stayed the same).
-
Throws theory out of window
-
Our 10-point drop is on an HTTPS site.
-
Are the websites you're all seeing drops on http or https?
The reason I ask is that I've seen a small drop on some http sites and a small rise on https sites. Wondering if there is a correlation?
-
Yep, the biggest drop we have seen is from 10 to 1 on a small website we haven't worked on in months. Mostly smaller drops but bigger than other months. I monitor some competitors as well and out of 24, 2 have stayed the same, 1 has improved, all others have dropped. Our websites have all dropped except for one (which has stayed the same).
Glad to know we're not the only ones!
-
Well, on the good side, there is only one way to go...
-
Yes, a new website we haven't worked on yet has dropped from 14 to 1. -.-
-
Thankfully not, only between 1 and 3.
-
Anyone got more than 10!???
-
Yep yep. I'm seeing drops across the board, for everyone!
-
Hi,
Yes, I have just asked something similar. One client has seen a 10-point drop. Not an easy one to explain...