Why is there such a big discrepancy between OSE and GWT regarding # backlinks?
-
Hello,
We have been analyzing the backlink profiles for our sites and are seeing a massive discrepancy between the number of C class linking domains reported in OSE and the figures returned by Google Webmaster Tools.
For a variety of sites, OSE reports fewer than 10 C class linking domains while GWT shows more than 100 unique linking domains (we confirmed that the majority of these links are in different C classes).
Is this simply a matter of the limited index size of OSE, or could there be another explanation? It is interesting that the links that do show up in OSE are nearly exclusively from sites that we own.
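(For reference, a "C class" here means the first three octets of the linking IP address, i.e. its /24 block. A minimal sketch of how you might count unique C classes yourself, assuming you have already resolved each linking domain to an IPv4 address; the domains and IPs below are placeholders:)

```python
def c_class(ip):
    """Return the C class (first three octets / the /24 block) of an IPv4 address."""
    return ".".join(ip.split(".")[:3])

def count_c_classes(domain_to_ip):
    """Count the unique C classes among a set of linking domains' IPs."""
    return len({c_class(ip) for ip in domain_to_ip.values()})

# Hypothetical linking domains resolved to IPs:
links = {
    "example-a.com": "203.0.113.7",
    "example-b.com": "203.0.113.99",   # same /24 as example-a.com
    "example-c.com": "198.51.100.14",
}
print(count_c_classes(links))  # prints 2
```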
/T
-
The OSE index is smaller than what Google reports in GWT, but then again, the links reported in GWT are famously inaccurate and often not up to date. Google often includes "junk" links that have very little chance of affecting their ranking algorithm.
Linkscape crawls approximately the top 25% of the web, which is where the majority of links that actually influence rankings are found. It's not a perfect system, and sometimes links are missed, but it works well as a predictive tool and for competitive research.
"It is interesting that the links that do show up in OSE are nearly exclusively sites that we own."
Yes, it is interesting. The links Linkscape misses tend to be either buried beneath layers of navigation or on pages with few inbound links. Sometimes improving your link structure will help your links appear in Linkscape and also improve crawling and rankings in Google (although I'd be careful in your case, since you own the sites in question, not to create the appearance of a link scheme).
Best of Luck!
-
Neither OSE nor WMT provides a complete, accurate link profile. I have the same issue with several accounts.
But both can be really good indicators when used together. Personally I use a three-tool combo: WMT, OSE, and Majestic.
With those 3 powers combined (like in Captain Planet) you will have a really good feedback.
I use WMT as the primary tool when I have access, with OSE and Majestic as second tier. I use OSE as the main tool for competitive analysis and Majestic as a backup for filling in gaps.
You can also try Raven for some additional feedback, but again, no tool will provide fully 100% accurate information.
Hope it helps.
Related Questions
-
How to tell when a directory backlink or other backlink is worthy of the disavow tool? Especially when a keyword is not ranking where it should.
Hello, I came aboard as SEO for a client who seems to have been hit by both Panda and Penguin back in April 2012. The Panda part I feel I've fixed by creating better content: combining same-topic pages into one and building a content experience that better matches the terms users search for. Once the site was redesigned and relaunched, every keyword improved except one, the main keyword they want to rank for. I created a landing page that is nicely optimized for that keyword and its variants, but Google isn't using that page yet since it's brand new with a PA of 1.

Doing a backlink audit, I found 102 links out of 400 using the keyword they want to rank for as exact-match anchor text; they also have synonym anchor text on other links, though not quite as much. Most of those 102 domains are directories. In my opinion I'd declare all of them spam, but a few have DAs higher than 50, which makes me more nervous to disavow. I want to make sure we get out of the penalty if we were hit by Penguin, but I also don't want to ruin the rankings for other keywords we're doing better with, since they are longtails and short tails that are very relevant to users.

What is the best way to determine whether a site or directory is spammy enough to be penalizing you, and how should I approach the anchor text issue with these backlinks? 99% of these links I cannot have changed, since they're directories and I doubt many have had a human touch them in a while.

Sidenote: if you're going to post a link as a response, please summarize what that link covers, as links are often given as an answer but end up not really providing the meat we were seeking. Thank you!
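If you do decide some of those directories need to go, Google's disavow file is just a plain-text list of `domain:` lines with `#` comments. A minimal sketch of building one from a list of domains you've judged spammy (the domain names below are placeholders):

```python
def build_disavow_file(spammy_domains, note="Spammy directory links with exact-match anchors"):
    """Build the text of a Google disavow file: a # comment followed by domain: lines."""
    lines = ["# " + note]
    lines += ["domain:" + d for d in sorted(set(spammy_domains))]
    return "\n".join(lines) + "\n"

print(build_disavow_file(["spam-directory.example", "links4cheap.example"]))
```

Keeping the spammy list in version control and regenerating the file makes it easy to add or walk back entries between reconsideration attempts.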
Technical SEO | Deacyde
-
GWT crawl errors: How big a ranking issue?
For family reasons (a child to look after) I can't keep a close eye on my SEO and SERPs. But from top 10 rankings in January for a dozen keywords, I'm now not in the top 80 results, save one keyword for which I'm ~18-20.
Technical SEO | Jeepster
It's not a sitewide penalty: some of my internal pages are still ranking top 3 or so. In GWT, in late March I received a warning about a rise in server errors:
17 Server Errors / 575 Soft 404s / 17 Not Founds / 1 Access Denied / 4 Others
I've also got 2 very old sitemaps (from two different ex-SEO firms), and I'm guessing about 75% of the links in them no longer exist. Q: Could all this be behind my calamitous SERPs drop? Or should I be devoting my -- limited -- time to improving my links?
-
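Regarding the old sitemaps above: one quick triage step is to crawl them yourself and see which URLs still resolve. A rough sketch, with the network loop commented out so the helper stays self-contained (only standard-library `urlopen` is assumed):

```python
from urllib.request import urlopen   # used by the commented-out fetch loop below

def partition_by_status(url_statuses):
    """Split a {url: http_status} map into live (200) and dead/error buckets."""
    live = {u: s for u, s in url_statuses.items() if s == 200}
    dead = {u: s for u, s in url_statuses.items() if s != 200}
    return live, dead

# To gather statuses from a real sitemap URL list, something like:
# statuses = {}
# for url in sitemap_urls:
#     try:
#         statuses[url] = urlopen(url).status
#     except Exception:
#         statuses[url] = 0  # 4xx/5xx or unreachable
live, dead = partition_by_status({"/a.html": 200, "/old.html": 404, "/gone.html": 410})
print(len(dead))  # prints 2
```

The dead bucket tells you whether the stale sitemaps are feeding Google mostly broken URLs and should be retired.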
Massive Increase in 404 Errors in GWT
Last June, we transitioned our site to the Magento platform. When we did so, we naturally got an increase in 404 errors for URLs that were not redirected (for a variety of reasons: we hadn't carried the product for years, Google no longer got the same string when it did a "search" on the site, etc.). We knew these would be there and were completely fine with them.

We also got many 404s due to the way Magento had implemented their site map (putting in products that were not visible to customers, including all the different file paths to get to a product even though we use a flat structure, etc.). These were frustrating, but we did custom work on the site map and let Google resolve those many, many 404s on its own.

Sure enough, a few months went by and GWT started to clear out the 404s. All the poor, nonexistent links from the site map and missing links from the old site started disappearing from the crawl notices, and we slowly went from some 20k 404s to 4k 404s. Still a lot, but we were getting there.

Then, in the last 2 weeks, all of those links started showing up again in GWT and reporting as 404s. Now we have 38k 404s (way more than ever reported). I confirmed that these bad links are not in our site map, and I'm really not sure how Google found them again. I know that, in general, these 404s don't hurt our site, but it just seems so odd. Is there any chance the Google bots just randomly crawled a big old list of outdated links they hadn't tried for a while? And does anyone have any advice for clearing them out?
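For the retired product URLs that keep getting recrawled, one mechanical option is to generate 301 rules from an old-path to new-path map and drop them into the server config. A sketch (the paths are placeholders, and the `Redirect` directive shown is Apache's, which may or may not match this Magento setup):

```python
def redirect_rules(url_map):
    """Emit one Apache 'Redirect 301 <old> <new>' line per entry, sorted for stable diffs."""
    return "\n".join(
        "Redirect 301 {} {}".format(old, new) for old, new in sorted(url_map.items())
    )

print(redirect_rules({
    "/old-product.html": "/catalog/new-product",
    "/discontinued.html": "/catalog/",
}))
```

Regenerating the rules from a spreadsheet export keeps the redirect map reviewable as the 404 list changes.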
Technical SEO | Marketing.SCG
-
Is changing the anchor text of a large number of links at once bad for SEO?
Hi there, our service at fotograf.de is a shop system for professional photographers. Customers can build their own website with our tool, including an online shop to sell their pictures. We have a lot of links from our customers pointing to our homepage; the links come from subdomains of our domain and from external domains. We are now thinking about changing the anchor text of half of those links (roughly 300,000 links). Do we have to fear a penalty from Google for changing so many anchor texts at once? And would we get better rankings if we chose a more optimized anchor text, or does this have little effect because most of the links are from subdomains of our own domain (each customer has their own subdomain)? Thanks for answering! Sebastian
Technical SEO | Sebastian23
-
Will having a big list of cities for areas a client services help or damage SEO on a page?
We have a client we inherited that has a flat text list, on their contact page, of all the cities and counties they service. They service the entire Southeast, so the list looks crazy ridiculous.

Example: South Carolina: Abbeville, Aiken, Allendale, Anderson, Bamberg, Barnwell, Beaufort, Berkeley, Calhoun, Charleston, Cherokee, etc.

The question is: will this help or hinder their SEO for their very specific niche industry? Is this keyword spamming? It has an end-user purpose, so it technically isn't spam, but the engines may look at it otherwise. I couldn't find a definitive answer to the question; any help would be appreciated.
Technical SEO | Highforge
-
Unnatural Link Warning No Longer Showing in GWT?
Hi, we recently took on a new client that had been hit by the recent Google updates. After a really good look at their analytics and their link profile, it looked like they had been hit for over-optimization of anchor text.

Over the last month or so we have been working to remove a pile of links that contain their main keyword, starting with the easiest to remove and the lowest quality. At the same time we have been building links using semantic keywords and junk anchor text in a bid to dilute the ratio of the main anchor text within their profile. We have a timetable of tasks drawn up which we are working through; when all tasks were complete, we planned to write a very nice reconsideration request to Mr. Google.

I logged in to Google Webmaster Tools this morning and noticed that the 'Unnatural Links' notice has been removed from that domain. Does anyone know if this signifies anything? We haven't sent a reconsideration request to Google yet. Thanks.
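A quick way to keep an eye on that anchor-text ratio while the cleanup runs is to compute each anchor's share of the profile from an exported link list. A minimal sketch (the anchor texts below are placeholders):

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each normalized anchor text's share (0..1) of the total link profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}

dist = anchor_distribution(["main keyword", "Main Keyword", "brand name", "click here"])
print(dist["main keyword"])  # prints 0.5
```

Tracking the main keyword's share week over week shows whether the dilution work is actually moving the ratio.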
Technical SEO | AdeLewis
Ade.0 -
GWT indexing wrong pages
Hi SEOmoz, I have a listings site. In one part of the page I have 3 comboboxes: state, county, and city. On the change event, JavaScript redirects the user to the page for the selected location. Parameters are passed via GET, and my URL is rewritten via htaccess. Example: http://www.site.com/state/county/city.html. The problem is, there are A LOT (more than 10k) of 404 errors. This is happening because the crawler is trying to index the pages, sometimes WITHOUT a parameter, like http://www.site.com/state//city.html. I don't know how to stop it, and I don't want to remove the feature, since users click it a lot. What should I do?
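One way to stop those empty-segment URLs from matching at all is to require every path segment to be non-empty in the rewrite pattern. A sketch of the idea in htaccess terms (the target script name and parameter names are assumptions, since the actual rule isn't shown in the question):

```apache
RewriteEngine On
# [^/]+ requires at least one character per segment, so a URL like
# /state//city.html no longer matches this rule and can 404 (or be
# handled by a fallback rule) cleanly.
RewriteRule ^([^/]+)/([^/]+)/([^/]+)\.html$ listing.php?state=$1&county=$2&city=$3 [L,QSA]
```

Pairing this with JavaScript that only builds the URL once all three comboboxes have a value should stop new malformed URLs from being generated in the first place.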
Technical SEO | elias99
-
The impact of homepage link compared to site-wide backlinks
Hello, I was wondering how much bigger an impact a backlink has when it is linked site-wide as opposed to homepage only. What if the PA/DA of the homepage is good enough (mid 80s): would just a homepage link give a decent result? I mainly want to know the difference in impact between the two, regardless of DA/PA, as long as the links come from one domain. Thank you.
Technical SEO | micfo