Why does Open Site Explorer only show a fraction of the linked domains that Google does?
-
One of my sites ( http://www.georgerossphotography.com ) shows 19 root domains and 155 links on Open Site Explorer. This doesn't pass the sniff test with me, and when I look at Google Webmaster Tools they show 120 domains with 1,870 links.
I am assuming that Google uses its own data for ranking purposes, which makes me question the validity of Open Site Explorer.
Am I missing something? I have been using OSE as my primary tool and now I feel that it has little value.
I would appreciate any feedback.
George.
-
Hi George - gotcha. That makes sense. The explanation is really simple - we crawl as much as we're able to process (processing and calculating metrics like PA/DA/MozRank/MozTrust/etc. are our bandwidth barriers right now). Other services don't calculate those metrics, but Google & Bing have found ways to do both (large crawls and immediate processing at scale). We're moving to a model similar to Google's soon, but in the meantime, our index is smaller than others' (particularly in certain sections/parts of the deep web, less trustworthy sites, and more spammy areas of the web).
I will nudge the team to put up a page on Mozscape that helps to better explain this.
-
First of all, thank you for taking the time to respond.
I have been actively link building these past couple of weeks, driving my number of links up from 73 to 427 per Ahrefs. During this time I have been monitoring all the major link tools and struggling to fully understand the differences - though at the end of the day all that really matters is my SERP position.
I have also been using Linkstant, and when I compare it to 'Just Discovered', only a fraction of the new links appear there. Again, I am not trying to be negative; I am trying to understand how to best use the tools that I have available to me, and how to answer the question that I have raised should a client ask me the same thing.
It is not that the Moz index doesn't meet my needs; it is that there is no clear explanation on the OSE page of how the data it returns relates to that of other, similar services. I could demonstrate to a client that they have more links if they use another link analysis service! That is what is frustrating me. So, if you could provide an explanation of how OSE compares to the other indices, that would give me something to work with.
Again, thank you for your time,
George.
-
Also, I would suggest using numerous pieces of software.
I use Ahrefs, Moz, Webmaster Tools, and I'm about to sign up for Majestic.
They are all good in their own way and have some great resources (Ahrefs now gives you a list of all your links pointing to 404 pages - a great report), but they also have their limitations. Unfortunately there isn't one piece of software that does everything amazingly well, so you do need to sign up for multiple resources.
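If you'd like to sanity-check that kind of 404 report yourself, here's a rough sketch - just a minimal example assuming you can get your link targets into a plain text file, one URL per line (the file name and format below are placeholders, not any tool's actual export):

```python
# Rough sketch: given target URLs pulled from any backlink export, flag the
# ones that now return 404 (i.e. links pointing at dead pages).
# "link_targets.txt" and its one-URL-per-line format are placeholders,
# not any particular tool's export format.
import requests

def find_dead_targets(urls, timeout=10):
    dead = []
    for url in urls:
        try:
            # HEAD is cheap; some servers reject it, so fall back to GET.
            resp = requests.head(url, allow_redirects=True, timeout=timeout)
            if resp.status_code in (405, 501):
                resp = requests.get(url, allow_redirects=True, timeout=timeout)
            if resp.status_code == 404:
                dead.append(url)
        except requests.RequestException:
            dead.append(url)  # unreachable targets are worth reviewing too
    return dead

if __name__ == "__main__":
    with open("link_targets.txt") as f:
        targets = [line.strip() for line in f if line.strip()]
    for url in find_dead_targets(targets):
        print(url)
```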
-
I agree it's not a great topic for a WBF, but a page on the website would be great.
Look forward to meeting you at SearchLove London.
Thanks
Andy
-
Hi Andy - I left a reply below. I could do a WB Friday on this, but the topic doesn't lend itself well to the more tactical foci of WB Friday, and it's a bit Moz centric. That said, I will nudge the team to create a page about how Mozscape's index functions and why counts are different from index to index and across different tools.
Thanks for helping out!
-
Hi George - this is a complex problem/issue, but I'll do my best to explain.
First off, Oleg & Andy are correct - Google has 100s of times the infrastructure (and thousands of times the financial and human resources) that Moz does, and their index is quite a bit larger. For example, the latest Mozscape index is ~180 Billion pages, while Google's main index likely contains something between 4-10X that amount at any given time.
However, there's a lot more complexity involved, because sometimes you'll actually see Moz report more links than Google Webmaster Tools. This is because Google Webmaster Tools samples (they might know about far more than 120 domains linking to you) and they're not specific about how they treat pages that are, for example, canonicalized by a redirect or a rel canonical. They also don't specify the frequency of their updates, and this appears to be inconsistent as well - we've seen counts go up, down, and all over the place without much indication of why. Hence, lots of site owners become suspicious about the link counts shown in GWMT. In your case, because you have a limited number, you can likely validate that all of these are real and so the problem isn't as big.
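To make the counting point concrete, here's a toy sketch (the sample links and the redirect mapping are made up purely for illustration) of how the same raw backlink list produces very different numbers depending on whether you count every link, unique linking pages, or unique linking root domains, and on whether redirected/canonicalized sources get collapsed first:

```python
# Toy illustration (made-up links and redirect mapping) of why counts differ:
# the same raw backlink list gives different numbers depending on whether you
# count every link, unique linking pages, or unique linking root domains, and
# on whether redirected/canonicalized source URLs are collapsed first.
from urllib.parse import urlsplit

raw_links = [  # (linking page, target page) - hypothetical data
    ("http://blog.example.com/post-1", "http://www.georgerossphotography.com/"),
    ("http://blog.example.com/post-1", "http://www.georgerossphotography.com/portfolio"),
    ("http://blog.example.com/post-2", "http://www.georgerossphotography.com/"),
    ("http://old.example.org/page",    "http://www.georgerossphotography.com/"),
]

# Suppose old.example.org/page 301s to blog.example.com/post-2; an index that
# collapses redirects folds those two sources into one linking page.
canonical_map = {"http://old.example.org/page": "http://blog.example.com/post-2"}

def root_domain(url):
    host = urlsplit(url).hostname or ""
    parts = host.split(".")
    # naive: takes the last two labels, ignores suffixes like .co.uk
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

sources = [canonical_map.get(src, src) for src, _ in raw_links]
print("total links:         ", len(raw_links))                         # 4
print("unique linking pages:", len(set(sources)))                      # 2
print("linking root domains:", len({root_domain(s) for s in sources})) # 1
```

Two indices that crawl the exact same pages can still disagree simply because they make different choices at each of those steps.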
As far as Moz's index goes, while we're not as big we are always growing (last index, for example, was ~140 Billion pages - you can see all the updates here: http://moz.com/products/api/updates), we're consistent in what we'll tell you (the exact count and every link we see), and you can track this over time compared to index size (and compared to competitors). Next year, we are aiming to 3X the current size, so you should see something much closer to Google's scale (it's been an ongoing challenge for some time, but we're nearly there).
That said, totally understand if our index isn't right for your needs. Majestic SEO and Ahrefs may both be good options (though neither of them are quite Google's size/freshness/scale either, and they have their own weaknesses vs. Moz's index on other features).
I'll also bring back your suggestion of a page that talks more directly about the comparison between different indices and why Moz might have more/fewer links than other tools (and than its own prior indices) over time.
Wish you all the best!
p.s. George - in your particular case, you may also want to check the "Just Discovered" tab in OSE, which often shows a lot of new links we've found that haven't yet made it into the index (but will make it into the next one). If you've received lots of new links in the last 30-60 days, it will take a month or so for those to make it into our main index, but they'll show up within hours in JDL.
-
Thank you for your answer (and sorry you had to write it again), but I have been using OSE as the benchmark for my clients' sites without digging any deeper (why should I?), and you may think me naive for that, but there is no qualification on OSE with respect to the data it offers.
It does not state that there are limitations on the data presented or that it may be incomplete... hence, my original question. This means that I have been providing my clients with incomplete/unqualified data based on OSE reports.
OSE states for Linking Domains: "Gauge your site's influence. Analyze the root domains that link to the URL or domain you've entered." Where is the asterisk explaining that the data is not complete?
I am a little bit miffed just now.
I need data that I can stand behind and not data that I have to explain like a snake oil salesman. At the very minimum I need to understand the validity of the data and that should be clearly explained at the top of the OSE page.
Thank you again for the explanation.
George.
-
With PRO membership you are getting a lot more tools, not just OSE. The Moz community is probably worth the $99 on its own, in my opinion.
As for OSE's limitations - I will let an employee give you the official definition, but they do a great job crawling as much of the web as is feasible on their servers. They already crawl a huge percentage of the web and are expanding this each month.
-
Hi
I've answered this question a few times; I wish Rand would do a Whiteboard Friday on this and explain it once and for all.
Google is huge and has a vast number of servers that crawl the entire web (and even they don't pick up on every link). Moz, even though it's great, is relatively small and can only crawl a proportion of the web (usually the bigger sites and the most visited sites). Rand has said before that they keep growing their servers and can crawl further and further, but for what you pay for Moz - $99 a month - for them to reverse engineer Google and go as deep as they do, you would probably need to pay a lot more. So I would happily take missing out on a few of the lower domain links for the relatively low price I pay each month.
So if Rand can do a WBF on this subject that would be great - then I wouldn't have to keep typing the same answer, I could just link to his video (Google doesn't like duplicate content).
Also, as mentioned above, if you really want to know all your links, I use Ahrefs and Webmaster Tools, but even these aren't great.
If a new website gets built today and links to you, you still have to wait for Google or Ahrefs to find that website - and if it doesn't get links from elsewhere, it might take these services a while to find it - so even they aren't perfect.
-
Thank you.
But if that is the case:
- What is the value of OSE as a marketing tool?
- Where is the definition of OSE's limitations?
If this is correct then I will seriously reconsider my PRO membership.
Thanks again for the answer.
-
Google has more and better servers and a much larger index than Moz/OSE.
If you want to get a better picture of a website's backlinks, run reports from several backlink tools (OSE, Ahrefs, Majestic, WebMeUp, GWT) and combine them.
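If it helps, here's a rough sketch of how you might merge those exports and count the union of linking root domains. The file names and column headers below are placeholders - each tool's real CSV export uses its own column names - so adjust the mapping to whatever you actually download:

```python
# Rough sketch: merge backlink exports from several tools and count the union
# of linking root domains. File names and column headers are placeholders;
# adjust EXPORTS to match the real CSVs you download from each tool.
import csv
from urllib.parse import urlsplit

EXPORTS = {
    "ose.csv": "URL",
    "ahrefs.csv": "Referring Page URL",
    "majestic.csv": "SourceURL",
}  # hypothetical file -> column holding the linking page URL

def linking_domains(path, column):
    domains = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            host = urlsplit(row.get(column, "")).hostname
            if host:
                # treat www.example.com and example.com as the same domain
                domains.add(host[4:] if host.startswith("www.") else host)
    return domains

all_domains = set()
for path, column in EXPORTS.items():
    found = linking_domains(path, column)
    print(f"{path}: {len(found)} linking domains")
    all_domains |= found
print(f"union across tools: {len(all_domains)} linking domains")
```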