Suggestion - How to improve OSE metrics for DA & PA
-
I'm sure everyone at Moz is aware that although the Moz link metrics (primarily DA & PA) are good, there is a lot of room for improvement, and that there are many types of site where the metric values are well out of line with what their "real" values should be.
Some examples:
www.somuch.com (Link Directory) - DA 72
www.articlesbase.com (Article Directory) - DA 89
www.ezinearticles.com (Article Directory) - DA 91

I'm sure everyone would agree that links from these domains are not as powerful (if of any value at all) as their DA would suggest. By definition of how Moz metrics work, the sites that receive links from such domains also have inflated values - thus they throw the whole link graph out of whack.
I have two suggestions which could be used singly or in conjunction (and obviously alongside the other factors Moz uses to calculate DA and PA) to help move these values closer to what they realistically should be.
1/. Incorporate rank values.
This effectively uses rank values to reverse engineer what Google (or other engines) assigns as a "value" to a website. It could be achieved (if Moz were not to build the data-gathering system itself) by integrating with a company that already provides this data - e.g. Searchmetrics, SEMrush etc. As an example, you would take a domain and pull in some rank values, e.g. http://www.semrush.com/info/somuch.com?db=us - where you could use traffic, traffic price, and traffic history as metrics within the overall Moz scoring algorithm. As you can see from my example, according to SEMrush the traffic and traffic price are extremely low for what you would expect of a website with a DA of 72. You will find the same for the other two sites, and similarly for pretty much any other site you test. This is essentially because you're tapping into Google's own ranking factors, and thereby getting more in line with the real values (according to Google) of a website's quality. If you were to incorporate these values, I believe you could improve the Moz metrics.

2/. Social Sharing Value

Another strong indicator of quality is the amount of social sharing of a document, or of a website as a whole, and again you will find, as with my examples, that pages on these sites have low social metrics compared to what you would normally associate with sites of these DA values. Obviously to do this you would need to pull social metrics for all the pages in your link DB. Or, if that were too technically intense to achieve, again work with a partner such as Searchmetrics, which provides "Total Social Interactions" at the domain level. Divide this value by the number of Moz-crawled pages and you would have a crude value for the overall average social shareability of a page on a given site.

Obviously both of the above have their flaws if looked at in complete isolation; however, in combination they could provide a robust signal to use in any algorithm, and combined with the values Moz currently uses, I believe you could make big strides toward improving the overall Moz metrics.
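To make the two suggestions concrete, here is a minimal sketch of how they might blend together. Everything here is an illustrative assumption - the function name, the 0-100 normalization, the 50-shares-per-page cap, and the blend weights are all placeholders to be tuned against real data, not anything Moz or SEMrush actually uses:

```python
def adjusted_authority(domain_authority, traffic_score,
                       total_social_interactions, crawled_pages):
    """Blend DA with a traffic-based rank signal (suggestion 1) and an
    average social-sharing signal (suggestion 2). domain_authority and
    traffic_score are assumed pre-normalized to a 0-100 scale."""
    # Suggestion 2: crude average social shareability per crawled page
    social_per_page = total_social_interactions / max(crawled_pages, 1)
    # Scale onto 0-100, capping at an arbitrary 50 shares/page
    social_score = min(social_per_page, 50.0) / 50.0 * 100.0
    # Weighted blend; the weights are placeholder assumptions
    return 0.6 * domain_authority + 0.25 * traffic_score + 0.15 * social_score

# A directory site with DA 72 but very low traffic and social signals
# drops well below its raw DA:
print(round(adjusted_authority(72, 5, 2000, 100000), 1))  # → 44.5
```

The point of the sketch is just that low traffic and low social sharing pull an inflated DA down, while a site whose signals agree with its DA keeps roughly the same score.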
-
I'm not directly involved in the project, but I think that's actually part of what they're doing - using Google de-indexation and obvious penalties to train the system, but trying to avoid a system that would have to go look up the site on Google every time it needed to make a prediction.
-
Cheers, Pete.
I totally understand the data dependency. One thing you could do which would not require a long-term data dependency, and would also help with the spam detection you're building, is to take a single snapshot of "ranking" and use it as a data set to pattern-match spam sites. E.g. if you managed to pull, say, hundreds of thousands of ranking scores (such as traffic scores from SEMrush), match them with Moz's current scoring on those domains, then bucket the sites into groups that have higher or lower ranking scores than DA would predict, you could try to reverse engineer the link or other patterns which are common to those buckets.
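The snapshot idea above could be sketched roughly as follows. This is a hypothetical illustration: the simple least-squares fit, the residual threshold, and the tuple format are all assumptions, standing in for whatever model Moz would actually fit:

```python
from statistics import mean

def bucket_outliers(domains, threshold=0.5):
    """domains: list of (name, da, traffic_score) tuples.
    Fits a least-squares line traffic ~ a*da + b, then buckets domains
    whose traffic score is far above or below what DA predicts."""
    das = [d[1] for d in domains]
    traffic = [d[2] for d in domains]
    da_mean, tr_mean = mean(das), mean(traffic)
    var = sum((x - da_mean) ** 2 for x in das)
    cov = sum((x - da_mean) * (y - tr_mean) for x, y in zip(das, traffic))
    a = cov / var
    b = tr_mean - a * da_mean
    overperform, underperform = [], []
    for name, da, tr in domains:
        predicted = a * da + b
        residual = (tr - predicted) / max(abs(predicted), 1)
        if residual > threshold:
            overperform.append(name)   # traffic far above what DA predicts
        elif residual < -threshold:
            underperform.append(name)  # candidate spam / inflated-DA pattern
    return overperform, underperform
```

The "underperform" bucket (high DA, low actual ranking score) is the interesting one: those are the domains whose common link patterns you would then try to reverse engineer.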
-
Thanks - happy to pass that along. We're actually in the middle of a long-term spam detection project to help notify people when a site seems to be suspicious or is likely to be penalized by Google. Eventually, this may find its way into DA/PA. We don't want to use ranking and Google's own numbers, as it creates a bit of a problematic data dependency for us (especially long-term).