Hi Nima,
Google will always show more links than our crawler because it is a much larger company with many more seeds from which to crawl pages. You should use multiple tools to obtain a full link profile, which is something I've seen recommended many times in our community. The 68 links we found meet the criteria we measure when discovering links. The other 158 links may come from sites we don't consider high-ranking, may be too deep for us to crawl, or may already have been crawled and are simply waiting to appear in a future index update. No two indexes will be the same, because each company crawls differently.
As I mentioned earlier:
- We grab the most recent index
- We take the top 10 billion URLs with the highest MozRank (with a fixed limit on some of the larger domains).
- We start crawling from the top down until we've crawled 90,000,000,000 pages (roughly 35% of the amount in Google's index).
Therefore, if a site is not linked to by one of these seed URLs (or by one of the URLs they link to before the next update), it won't show up in our index. Google does not crawl this way. Both this crawl process and our metrics are proprietary to Moz, so it isn't that our data is unreliable or being crawled incorrectly.
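To make the consequence of seed-based crawling concrete, here is a toy sketch of a top-down crawl with a page budget. The function names and structure are illustrative assumptions, not Moz's actual crawler; the point is simply that a page unreachable from any seed never enters the index.

```python
from collections import deque

def crawl_from_seeds(seeds, fetch_links, page_budget):
    """Toy breadth-first crawl. `seeds` is assumed to be pre-sorted by
    rank (highest first); `fetch_links` returns a page's outlinks.
    Crawling stops once `page_budget` pages have been collected."""
    seen, crawled = set(), []
    queue = deque(seeds)  # start from the highest-ranked URLs
    while queue and len(crawled) < page_budget:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        crawled.append(url)
        # Pages not reachable from any seed are never discovered.
        queue.extend(fetch_links(url))
    return crawled
```

In a small link graph where page "e" has no inbound link from any seed, "e" will never appear in the crawl, no matter how large the budget is.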
Domain Authority is affected by many things, and it is hard to pinpoint them without being an SEO consultant or the designer of your specific website. Domain Authority and Page Authority scores are both calculated using Moz's ranking-models work. In essence, we take a large amount of rankings data from the search engines (by running queries) and then build a predictive scoring system, using our own on-page analyses and Mozscape link data, to construct an algorithm that effectively reproduces the search engines' results. Our current accuracy hovers in the 70% range, but we expect it to improve over time.
Once we have a ranking model (which we internally call "uber"), we can create scores that best approximate the combinations of all our page-specific link metrics or domain-specific link metrics (removing the keyword-specific features like anchor text, on-page keyword usage, etc). These scores represent the model's query-independent or non-keyword-based ranking inputs.
In simple terms, Domain Authority is our best prediction of how content would perform in search engine rankings on one site vs. another. Page Authority answers the same question for an individual page. Both amalgamate all the link metrics we have (number of links, linking root domains, mozRank, mozTrust, etc.) into a single, predictive score.
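As a rough illustration of what "amalgamating link metrics into a single score" means, here is a minimal sketch. The metric names, weights, and clamping are all hypothetical assumptions for this example; Moz's actual "uber" model is proprietary and far more sophisticated.

```python
def authority_score(metrics, weights):
    """Blend several query-independent link metrics into one number,
    clamped onto a 0-100 scale. Weights would in practice be fitted
    against observed search rankings; here they are made up."""
    raw = sum(weights[name] * value for name, value in metrics.items())
    return max(0.0, min(100.0, raw))

# Hypothetical weights and metrics for a single page:
weights = {"links": 0.002, "linking_root_domains": 0.05, "mozrank": 8.0}
page = {"links": 5000, "linking_root_domains": 300, "mozrank": 5.2}
score = authority_score(page, weights)
```

The design choice worth noting is that the score is query-independent: no anchor text or keyword features enter the blend, matching the description above.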
It's important to note that both Domain Authority and Page Authority are on a 100-point, logarithmic scale. Thus, it's much more difficult to grow your score from 70 to 80 than it would be to grow from 20 to 30.
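The logarithmic scale can be illustrated with a short sketch. The base-10, factor-of-10 shape below is an arbitrary choice for this example, not Moz's actual formula; it just shows why equal point gaps cost very unequal amounts of underlying link strength.

```python
def strength_needed(points):
    """Toy inverse of a 100-point logarithmic score: the underlying
    link strength required to reach a given score (illustrative
    constants only, not Moz's formula)."""
    return 10 ** (points / 10.0)

# Each 10-point step costs ten times as much underlying strength:
gap_low = strength_needed(30) - strength_needed(20)   # 900
gap_high = strength_needed(80) - strength_needed(70)  # 90,000,000
```

Under this toy model, moving from 70 to 80 takes one hundred thousand times the extra strength that moving from 20 to 30 does.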
Here are some places to really delve into what is going on:
http://moz.com/blog/whiteboard-friday-domain-trust-authority
http://moz.com/blog/googles-algorithm-pretty-charts-math-stuff
http://moz.com/blog/whiteboard-friday-domain-authority-page-authority-metrics
http://apiwiki.seomoz.org/w/page/20902104/Domain Authority
I would recommend starting a new thread on the forum to seek advice from other marketers, as I am only able to explain how our tools work from the technical side.
Hope this helps!