Open Site Explorer Backlink Numbers Wrong?
-
So, I have a new site that we are currently building links for (lots of hard work). Anyway, I was a bit shocked when I viewed my site through Open Site Explorer and found only 4 backlinks! Alexa.com, Majestic, and Google Webmaster Tools don't show the exact same numbers as each other, but their figures are all MUCH higher (566 versus OSE's 4, for example). What gives?
Site is www.powerequipmentplus.com for those who want to look themselves.
Thanks for your time and concern.
-
Tim is spot on - it's also worth noting that OSE updates around every 30 days, so if you gain a new link just after the update it will take a few weeks to show up.
Try Majestic SEO - their backlink checker tool is excellent and works very well in combination with OSE.
-
That certainly clears things up. I do usually only use one tool. OSE has been the most consistent so far.
Thanks!
-
Hey DRCS,
It's likely just because SEOmoz had some issues with their most recent update to their index. The update was delayed for a few weeks, so the current information in OSE is a little outdated. If your website is new, most of your new links probably weren't included in the last index. Don't worry, though: the index is scheduled to be updated again next week.
Even after the next update, you'll notice differences between all the link tracking tools. They all use different methods for crawling the web, scrubbing out junky or old links and processing their data. This results in wildly different numbers. Although it can be good to use a few tools to get a larger picture of your link graph, I find it most helpful to track my success by sticking with one tool and comparing my metrics on a monthly basis.
Hope this helps!
Tim
Related Questions
-
Our crawler was not able to access the robots.txt file on your site.
Good morning. Yesterday, Moz gave me an error that it wasn't able to find our robots.txt file. However, this is a new occurrence; we've used Moz and its crawling ability many times before, and I'm not sure why the error is happening now. I validated that the redirects and our robots page are operational, and that nothing in our robots.txt disallows Roger. Any advice or guidance would be much appreciated. https://www.agrisupply.com/robots.txt Thank you for your time. -Danny
Moz Pro | Danny_Gallagher
-
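Beyond confirming the file loads in a browser, you can sanity-check offline that the rules themselves don't block Moz's crawler. The sketch below parses a robots.txt body with Python's standard-library `urllib.robotparser` and asks whether "rogerbot" may fetch given paths; the sample rules are hypothetical, not the actual contents of the agrisupply.com file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration; the real file at
# https://www.agrisupply.com/robots.txt may differ.
sample_robots = """User-agent: *
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(sample_robots.splitlines())

# rogerbot (Moz's crawler) has no dedicated record here, so it falls
# under the wildcard "User-agent: *" rules.
print(parser.can_fetch("rogerbot", "https://www.agrisupply.com/"))           # True
print(parser.can_fetch("rogerbot", "https://www.agrisupply.com/checkout/x")) # False
```

If the parser allows rogerbot but Moz still reports an error, the problem is more likely reachability (timeouts, redirects, firewall rules) than the rules themselves.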
Can you check my site?
Can you please tell me what I should do with this site? I'm trying hard to solve its issues. The link is here: https://alazharquranteaching.com/
Moz Pro | saharali15
-
Cleaning Up Bad 301 External Links From Old Site
A relatively new site I'm working on has been hit really hard by Panda, due to over-optimization of 301-redirected external links from an old site, whose anchor text includes exact keyword phrases. Prior to the Panda update, all of these 301 redirects worked like a charm, but now all of the 301s from the old URL are killing the new site, because all the hypertext links include exact keyword matches.
A couple weeks ago, I took the old site completely down and removed the htaccess file, removing the 301s and in effect breaking all of these bad links. Consequently, if one were to type the old URL, you'd be directed to the domain registrar, not redirected to the new site. My hope is to eliminate most of the bad links, which are mostly on spammy sites that aren't worth linking to, and that these links will eventually disappear from Google.
My concern is that this might not work, because Google won't re-index these links; once they're indexed, they'll be there forever. My fear is leading me to conclude that I should hedge my bets and just disavow these sites using the disavow tool in WMT. IMO, the disavow tool is an action of last resort, because I don't want to call attention to myself, since this site doesn't have a manual penalty inflicted on it. Any opinions or advice would be greatly appreciated.
Moz Pro | alrockn
-
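If the disavow route is chosen after all, the file Google expects is plain text: comment lines start with `#`, and a `domain:` prefix disavows every link from that domain. A minimal sketch of generating one, assuming a hypothetical list of spammy referring domains pulled from your link reports:

```python
# Hypothetical list of spammy referring domains; substitute the domains
# from your own link reports.
spammy_domains = ["spam-example-one.com", "spam-example-two.net", "spam-example-one.com"]

def build_disavow_file(domains):
    """Render Google's disavow file format: '#' comment lines plus one
    'domain:example.com' line per domain (deduplicated, sorted)."""
    lines = ["# Spammy links via old 301'd domain - disavow entire domains"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

print(build_disavow_file(spammy_domains))
```

Generating the file from a list keeps it reproducible as new spammy domains surface, rather than hand-editing it each time.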
How do you get your web site recrawled with Moz without waiting for a week?
My initial crawl was screwed up because of a nofollow that needed to be removed. I would like Moz to recrawl the site right away so I can find any other errors.
Moz Pro | Ron_McCabe
-
Open Site Explorer vs Webmaster Tools
Hi there. OSE is showing 53 linking domains and WMT is showing 161. Why are so many missing from OSE? They are all links of a decent age. Thanks!
Moz Pro | JeromeSavon
-
SEOmoz link report vs. Open Site Explorer
Hi, I run a campaign for one of my new clients. In the links report I see 1,970 external links, and I can press "see more in Open Site Explorer." When I press the button, Open Site Explorer opens, but with a message that there is no link data for this website. Any advice? Are you familiar with another tool that can help me investigate links to a website? Thank you, SEOwise
Moz Pro | iivgi
-
Moving from a dynamic site to nopCommerce
Hi, first question since becoming a member, so be gentle with me ;o) We are moving from a site using a dynamically generated ecommerce template to a nopCommerce site. I have two questions about this:
1. I know that 301 redirects are the best way to pass "link juice," but with the site being dynamically generated, a lot of these URLs will simply disappear when we move to nopCommerce, meaning they won't actually be there in order to add a 301. Any advice on this would be appreciated.
2. How can I get a list of the pages on my current site which have the best rank, in order to create the redirects? Unfortunately, due to some technical issues we had, we were unable to install Google Analytics, so we don't really have any analytics to give us extra info. I was hoping there would be somewhere within SEOmoz where I could be directed.
Many thanks, Chris
Moz Pro | cjhamill
-
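On the redirect side, once you have a list of your best old URLs (from OSE's top pages, for example), the 301s reduce to an old-path-to-new-path mapping. A minimal sketch that turns such a mapping into Apache mod_alias rules; the paths below are hypothetical placeholders, and note that `Redirect` matches paths only, not query strings, so dynamic `?id=` URLs would need `mod_rewrite` instead.

```python
# Hypothetical mapping from old dynamic-site paths to new nopCommerce
# paths; the real mapping would come from your top-pages report.
url_map = {
    "/products/item42.asp": "/power-equipment/chainsaw-x",
    "/categories/cat7.asp": "/power-equipment",
}

def apache_301_rules(mapping):
    """Emit one 'Redirect 301 old new' line per entry (Apache mod_alias).
    Sorted for a stable, diff-friendly htaccess file."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(apache_301_rules(url_map))
```

Keeping the mapping in one place (a spreadsheet or dict like this) also gives you a checklist to verify each redirect after the nopCommerce site goes live.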
Any tools for scraping blogroll URLs from sites?
This question is entirely in the whitehat realm... Let's say you've encountered a great blog - with a strong blogroll of 40 sites. The 40-site blogroll is interesting to you for any number of reasons, from link building targets to simply subscribing in your feedreader. Right now, it's tedious to extract the URLs from the site. There are some "save all links" tools, but they are also messy. Are there any good tools that will a) allow you to grab the blogroll (only) of any site into a list of URLs (yeah, ok, it might not be perfect since some sites call it "sites I like" etc.) b) same, but export as OPML so you can subscribe. Thanks! Scott
Moz Pro | scottclark
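Short of a dedicated tool, a small script can do the extraction, assuming the blogroll sits in an identifiable container. The sketch below uses Python's standard-library `html.parser` and assumes the blogroll is a `<ul class="blogroll">`; real sites vary wildly ("sites I like" divs, sidebar widgets), so that selector is a simplifying assumption you'd adapt per site.

```python
from html.parser import HTMLParser

class BlogrollParser(HTMLParser):
    """Collect hrefs from links inside a <ul class="blogroll"> container.
    The container class is an assumption; adjust it per target site."""
    def __init__(self):
        super().__init__()
        self.in_blogroll = False
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "ul" and "blogroll" in attrs.get("class", ""):
            self.in_blogroll = True
        elif tag == "a" and self.in_blogroll and "href" in attrs:
            self.urls.append(attrs["href"])

    def handle_endtag(self, tag):
        if tag == "ul":
            self.in_blogroll = False

# Hypothetical page fragment standing in for a fetched blog sidebar.
sample = ('<ul class="blogroll"><li><a href="https://a.example/">A</a></li>'
          '<li><a href="https://b.example/">B</a></li></ul>')
p = BlogrollParser()
p.feed(sample)
print(p.urls)  # ['https://a.example/', 'https://b.example/']
```

From that URL list, the OPML export for a feed reader is just a bit of XML templating on top.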