Is everybody seeing DA/PA drops after the last Moz API update?
-
Hi all,
I was wondering what happened with the last Moz update. The Moz health page shows no current errors, and I've checked a lot of websites (probably 50 to 60 from our customers and from our competitors), and everyone seems to have a decrease in DA/PA. Not just a couple going up and down like would normally happen; all of them seem to have dropped. Is anyone seeing the same in their campaigns?
Greetings,
Niels -
I have a client who has dropped from DA 16 to DA 1 and lost the few links he had. He's NOT happy!!
Having just spent a few weeks impressing upon him the importance of building up his DA, it's been pretty hard trying to pacify him over this.
Great explanation from Rand, but c'mon Moz, when are you going to fix this?
And when are you going to enable tracking of DA? AuthorityLabs and others can do it for keyword ranking. I'm sure you can do it for DA and PA.
It appears their primary competitor has maintained their DA of 19. So can someone please explain this added twist?
-
Thanks! That explains a lot.
-
Thanks for the logical answer, makes a lot of sense.
-
If we found fewer links to your sites with this index update, that can certainly be a factor in why you may have seen your Domain Authority drop a bit. These fluctuations aren't out of the ordinary, for all the reasons Rand mentions above - the number of backlinks in our index, and the quality of those links, will impact your DA scores.
You may see your DA bounce back after our next index update. This isn't scheduled for next week - our indexes take approximately 4 weeks to process. It looks like the next update is scheduled for February 28th. You can check this page for details on the Mozscape Index update any time: https://moz.com/products/api/updates
Hope that helps!
-
Yeah, all my own and client sites are down 2-4 points. I'm seeing a decline in backlinks tracked, so I thought that may be a factor. It's either a one-time change going forward, or perhaps a bug that will correct itself next week? No idea.
-
Hi Niels - yep, I saw a bit of this too. I believe there are two causes:
-
We crawled a larger swath of the web in this index, so we captured more sites and more links, and that may mean the scaling of PA/DA (which are logarithmic) stretches to accommodate the larger number of links found, especially to sites at the top of the scale. For example, if Facebook has a DA of 100 with 5 billion links, and then we find 5 billion more links to it, Facebook still has a DA of 100, but it's a much higher threshold. Thus, sites with fewer links (and lower-quality links) will fall in DA as the scale is now stretched.
-
We crawled some weird stuff in this index, by mistake (or rather, because spammers built some nasty, deep crawl holes that Google probably didn't fall for but we did). A ton of odd domains on strange ccTLDs were seen, and crawled, because they gamed PageRank with lots of sketchy links. We've now excluded these for indices going forward, and hopefully will see the impact abate.
All that said, over time, as our index grows, you can expect that raw DA/PA numbers could get harder to achieve, meaning a lot of sites will drop in PA/DA (and some will grow too, as we discover more links to them in the broader web). My best advice is always to not use PA/DA as absolutes, but rather relative scores. That's how they're designed and how they work best.
It's like back when Google had PageRank, and Moz.com grew from PR4 to PR7, then as Google got bigger and bigger, and the web got bigger, Moz.com fell to PR5, even though we had way more links and ranked for way more stuff. The raw PR scale had just become stretched, so our PageRank fell, even though we'd been improving.
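To make the rescaling concrete, here's a toy numeric sketch (invented numbers and formula, not Moz's actual algorithm) of how a logarithmic scale pinned to the most-linked site stretches as that site gains links:

```python
import math

def toy_da(site_links, top_site_links):
    """Toy DA-like score on a logarithmic 0-100 scale, pinned so the
    most-linked site on the web scores 100. Illustration only; this is
    not Moz's real model."""
    return 100 * math.log10(site_links + 1) / math.log10(top_site_links + 1)

# Before: the top site (say, Facebook) has 5 billion links.
before = toy_da(50_000, 5_000_000_000)
# After: the top site gains 5 billion more links; the scale stretches.
after = toy_da(50_000, 10_000_000_000)
print(f"{before:.1f} -> {after:.1f}")  # same site, same links, lower score
```

The smaller site's own links never changed, yet its score falls a point or two purely because the top of the scale moved.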
-
-
I second that.
-
Are you guys seriously using templates to answer questions about a lower DA? -.- I got the exact same answer in the chat... -.- At least you could admit something went wrong with the last API update. As I'm counting the people here, more than 13 SEO consultants who pay a lot of money per month for accurate tracking have all dropped across a lot of websites.
Normally I wouldn't bother with a drop, and it will probably be fixed in the next update, but some customers even use it as a KPI, and we have to explain to them that we worked hard but they still dropped by 10 points. -.- So answering something like:
"The reasoning behind this can be hard to pinpoint without the help of an SEO consultant or the specific web designer for your website".
isn't going to cut it for me. -.- What do you think we are? At least give us something we can communicate to our customers. What I see in the API is a loss of:
- January 2017 » 1,100,261,648,691 (1.1 trillion) links.
- December 2016 » 1,114,899,547,461 (1.1 trillion) links.
So I could see it as a loss of about 14 billion links since the last update? And because those links were lost across all high-DA websites, everything dropped. That would be the answer I expected.
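Checking that arithmetic directly, using the figures from the API as posted above:

```python
# Total links in the Mozscape index, per the API figures quoted above.
dec_2016 = 1_114_899_547_461
jan_2017 = 1_100_261_648_691

lost = dec_2016 - jan_2017
print(f"{lost:,} links lost")                 # 14,637,898,770 links lost
print(f"{100 * lost / dec_2016:.2f}% of the index")  # 1.31% of the index
```

So the drop is roughly 14.6 billion links, a bit over 1% of the whole index.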
No hard feelings, but just be transparent with us, so we can communicate it to our customers. -
You're not wrong! Seeing a drop in links is likely to mean that our crawlers just didn't pick up those URLs this time around and include them in our index, but in all likelihood those links that we previously reported are still out there on the web. Our tools just haven't re-visited them to add them back into the index.
-
I also noticed that the number of External Links plummeted. Ours dropped from ~80 to ~45. Our biggest competitor lost about 150 of their 400.
Correct me if I'm wrong, but isn't this more likely to represent the Moz crawler not crawling certain URLs, rather than an actual loss of those backlinks?
-
Hi there! Tawny from Moz's Help Team here.
Aw, man! Sorry to hear about your DA dropping. You're right that it appears to have happened for everyone, across the board. I think I can help explain. The reasoning behind this can be hard to pinpoint without the help of an SEO consultant or the specific web designer for your website.
Domain and Page Authority scores are both calculated using Moz's Ranking Models. In essence, we take a lot of rankings data from the search engines (by running queries) and then try to build a predictive scoring system, using our own on-page analyses and Mozscape link data to construct an algorithm that will effectively reproduce the search engines' results. Our current accuracy hovers in the 70% range, but over time, we expect to improve.
Once we have a ranking model (which we internally call "uber"), we can create scores that best approximate the combinations of all our page-specific link metrics or domain-specific link metrics (removing the keyword-specific features like anchor text, on-page keyword usage, etc). These scores represent the model's query-independent or non-keyword-based ranking inputs.
In simple terms, Domain Authority is our best prediction about how content would perform in search engine rankings on one site vs. another. Page Authority answers the same question for an individual page. Both are amalgamations of all the link metrics (number of links, linking root domains, mozRank, mozTrust, etc.) we have into a single, predictive score.
It's important to note that both Domain Authority and Page Authority are on a 100-point, logarithmic scale. Thus, it's much more difficult to grow your score from 70 to 80 than it would be to grow from 20 to 30.
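As a toy sketch of what "amalgamating link metrics into a single logarithmic score" could look like (the feature transforms and weights here are invented for illustration; the real model is machine-learned and far more complex):

```python
import math

def toy_authority(linking_root_domains, total_links, moz_rank):
    """Combine several link metrics into one 0-100 score.
    Weights and the log transforms are hypothetical, not Moz's model."""
    features = [
        math.log10(linking_root_domains + 1),  # breadth of link profile
        math.log10(total_links + 1),           # raw link volume
        moz_rank,                              # link-popularity metric
    ]
    weights = [0.5, 0.3, 0.2]          # hypothetical learned weights
    raw = sum(w * f for w, f in zip(weights, features))
    return min(100.0, raw * 10)        # squash onto a 0-100 range

print(round(toy_authority(1_200, 80_000, 4.5), 1))
```

Because the inputs are logged, each extra point of score requires roughly an order of magnitude more links, which is why climbing from 70 to 80 is so much harder than from 20 to 30.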
It is also important to note that if the higher-authority sites your site is seeded from drop in score, the drop will ripple down to all of the other links branched off from the very top. A good place to start is to compare the DA of your linking root domains against the previous index. Also take a look at your competitors' scores to see how much they have dropped as well. As our indexes grow in size, more links are included in our calculations. I think this is a big part of why we saw drops in Domain and Page Authority this time around; more links included in our calculations will skew the scores a bit lower.
We recommend keeping track of how many linking root domains you have from index to index, as this will be a quick way to confirm possible reasons for an increase/decrease in the score. For an in-depth discussion from Rand, check out this article: https://moz.com/community/q/da-pa-fluctuations-how-to-interpret-apply-understand-these-ml-based-scores I've found this article to be most helpful when I'm trying to determine why Domain or Page Authority scores may have fluctuated so much, especially when it's across the board like this.
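A minimal way to keep that index-to-index record, assuming you note down the numbers (from Open Site Explorer or the API) after each Mozscape update; the file name and figures here are arbitrary:

```python
import csv
from datetime import date

def record(site, linking_root_domains, da, path="da_history.csv"):
    """Append one index update's numbers for a site."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [date.today().isoformat(), site, linking_root_domains, da])

def history(site, path="da_history.csv"):
    """All recorded rows for a site, oldest first."""
    with open(path, newline="") as f:
        return [row for row in csv.reader(f) if row[1] == site]

# After each index update, log what the tools report:
record("example.com", 320, 34)
record("example.com", 290, 31)   # next index: fewer root domains, lower DA
print(history("example.com")[-1])
```

When DA falls alongside a drop in linking root domains, the crawl coverage explanation above is the likely cause rather than anything you did.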
Here are some places to really delve into what is going on:
http://moz.com/blog/whiteboard-friday-domain-trust-authority
http://moz.com/blog/googles-algorithm-pretty-charts-math-stuff
http://moz.com/blog/whiteboard-friday-domain-authority-page-authority-metrics
http://apiwiki.seomoz.org/w/page/20902104/Domain%20Authority
I hope this helps!
-
I'm also curious about any news or updates regarding this topic. Is there any news?
We suddenly dropped in DA from 40 > 34 (!), and our competitors lost a little less, between 3 and 5 DA points. We also lost around 185 external backlinks, and so did our competitors.
Rankings are the same.
-
Any news around this topic yet?
We also dropped in DA from 29 > 27, and another of our sites even from 21 > 17 (both HTTPS). All of our competitors also dropped in DA (one from 30 > 22).
No drops in rankings fortunately.
-
Dropped 3 points, so did all of our competitors. The number of links seemed to have taken a dive for all sites I follow.
Is anyone from Moz looking into this?
-
Yep. All HTTP. Drops of around 3 - 4 points across the board. It's almost reassuring that we're not alone.
-
Haha
-
bounces theory off chest onto his knee, then volleys it into the back of the non-existent net, and celebrates with Gillon the fact that they have ruled something out
-
Yeah, I can't see a pattern either. Most domains I track are still HTTP, including ours, but the HTTPS ones I track have seen drops as well (4 out of 5; 1 has stayed the same).
-
Throws theory out of window
-
Our 10-point drop is on an HTTPS site.
-
Are the websites you're all seeing drops on http or https?
The reason I ask is that I've seen a small drop on some http sites and a small rise on https sites. Wondering if there is a correlation?
-
Yep, the biggest drop we have seen is from 10 to 1 on a small website we haven't worked on in months. Mostly smaller drops but bigger than other months. I monitor some competitors as well and out of 24, 2 have stayed the same, 1 has improved, all others have dropped. Our websites have all dropped except for one (which has stayed the same).
Glad to know we're not the only ones!
-
Well, on the good side, there is only one way to go...
-
Yes, a new website we haven't worked on yet has dropped from 14 to 1. -.-
-
Thankfully not, only between 1 and 3.
-
Anyone got more than 10!???
-
Yep yep. I'm seeing drops across the board, for everyone!
-
Hi,
Yes, I have just asked something similar. One client has seen a 10-point drop. Not an easy one to explain...