Difference between SEOmoz and Webmaster Tools information
-
Hello,
There is an issue that confuses me and I thought perhaps you will be able to help me shed some light on it.
I have a website which shows 2,549 crawled pages in SEOmoz and 24,542 pages in Webmaster Tools!
Obviously there is some technical issue with the site, but my question is: why is there such a vast difference between what the SEOmoz crawl report and the Webmaster Tools report show?
Thanks!
Guy Cizner
-
Thanks for stepping in, everyone, though it looks like we were trying to answer the wrong question. This one is about Roger's crawl of the OP's own site, rather than the links indexed in OSE.
Guy, do you have a feel for how many pages SHOULD be in the index? If you only have a couple of thousand pages, then it could be that Google is crawling and indexing some parameters. If you've got 20k+ pages in the index, then Roger isn't finding some things.
Also, are you perhaps looking at just the www.domain subdomain in SEOmoz while GWT is looking at the entire site? If you had a compact www.domain site, but then had forum.domain and wiki.domain, and GWT was reporting pages for all of the subdomains on domain.com, that would explain things too.
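If it does turn out that Google is indexing parameterized URLs, one common fix is to 301 them back to the clean URL. A minimal .htaccess sketch, assuming Apache with mod_rewrite and a hypothetical tracking parameter called "sessionid" (substitute whatever parameter your site actually appends); note that the trailing "?" discards the whole query string, so only use this for parameters that never change the page content:

# Collapse URLs carrying a hypothetical "sessionid" parameter back to the clean URL
RewriteEngine on
RewriteCond %{QUERY_STRING} (^|&)sessionid=[^&]+
# The trailing "?" in the substitution drops the query string entirely
RewriteRule ^(.*)$ /$1? [R=301,L]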
-
Hello,
Thanks for all the replies.
The crawled pages are part of an SEO campaign I am running.
How is the crawl done when a campaign is defined?
I assume the whole site is being crawled.
Thanks
-
This may also shed some light:
Oct 9, 2012, Keri Morgret, On-site Community Manager at SEOmoz:
Another reason is that we just don't have the same size server farm that Google and Bing have. We could crawl all of Twitter and get nothing else crawled, or we could crawl some of Twitter, and some of the rest of the web. We aren't able to crawl all of the web, and we release a new index about once a month, so that's why you don't see all of your links or see them right away.
However, what we do offer that is different from Google and Bing is that we show you links for sites that are not your own, we add metrics about the trust and authority of the page, etc.
-
The Mozscape index, as brilliant as it is, can in no way compete with the size of the index that Google can handle.
As a result, your WMT report will almost always show a larger number of crawled pages, links, etc. Its index is just bigger.
-
Either of those 'issues' might be the cause. For example, incorrect canonicalization that is picked up differently by Google and the SEOmoz bot, Roger. Another option could be that Google tries really hard to index each and every page of the web, while Roger crawls in a slightly more restrictive way, only crawling pages above a certain level of authority or within a certain number of clicks from the homepage, etc.
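If it is a canonicalization problem, forcing one canonical hostname removes a common source of duplicate URLs in Google's count. A minimal .htaccess sketch, assuming Apache with mod_rewrite and that www.example.com (a placeholder, not your real domain) is the version you want indexed:

# Send non-www requests to the www hostname, keeping the requested path and query string
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]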
Related Questions
-
Japanese URL-structured sitemap (pages) not being indexed by Bing Webmaster Tools
Hello everyone, I am facing an issue with the sitemap submission feature in Bing Webmaster Tools for a Japanese-language subdirectory project. To outline the key points: the website is based on a subdirectory URL (example.com/ja/), and the Japanese URLs (when pages are published in WordPress) are not being encoded; they are entered in pure Kanji. Google Webmaster Tools, for instance, has no issues reading and indexing the pages' URLs in its sitemap submission area (all pages are being indexed).
When it comes to Bing Webmaster Tools, it's a different story. After the sitemap has been submitted (example.com/ja/sitemap.xml), it reports an error that it failed to download this part of the sitemap: "page-sitemap.xml" (the sitemap featuring all the site's pages). That means no URLs have been submitted to Bing either. My suspicion is that Bing Webmaster Tools does not understand the Japanese URLs (or the Kanji, for that matter), so I generally wonder what the correct way is to go about this.
When viewing the sitemap (example.com/ja/page-sitemap.xml) in a web browser, though, the Japanese URLs' characters are already displayed as encoded. I am not sure if submitting the Kanji-style URLs separately is a solution; in Bing Webmaster Tools this can only be done at the root domain level (example.com). Surely there must be a way to make Bing's sitemap submission understand Japanese-style sitemaps? Many thanks everyone for any advice!
Technical SEO | | Hermski0 -
Webmaster Tools "Links to your site" history over time?
Is there a way to see a history of the "links to your site"? I've seen a lot of posts here from people saying "I just saw a big drop in my numbers." I don't look at this number enough to be that familiar with it. Is there a way to see if Google has suddenly chopped our numbers? I've poked around a little but haven't found a method yet. Thanks, Reeves
Technical SEO | | wreevesc0 -
WordPress & use of 'www' vs not for Webmaster Tools - explanation needed
I am having a hard time understanding the issue of canonicalization of site pages, specifically in regard to the 'www' and 'non-www' versions of a site, and specifically in regard to WordPress. I can see that it doesn't matter whether you type in 'www' or not in the URL for a WordPress site; what is going on in the back end that allows this? When I link up to Google Webmaster Tools, should I use www or not? Thanks for any help. d
Technical SEO | | dnaynay0 -
Backlink tool says my website is still redirecting, but it's not... wtf?
The SEOmoz backlink tool says that my piercelaw.com website is still redirecting to piercelawnc.com. It was, but it certainly isn't doing it anymore.
Technical SEO | | jpierce1270 -
Recent Webmaster Tools Glitch Impacting Site Quality?
The ramifications of this would not be specific to me but to anyone with this type of content on their pages... Maybe someone can chime in here, but I'm not sure how much, if at all, site errors (for example 404 errors) as reported by Google Webmaster Tools are seen as a factor in site quality, which would impact SEO rankings. Any insight on that alone would be appreciated.
I've noticed some fairly new, weird stuff going on in the WMT 404 error reports. It seems as though their engine is finding objects within the source code of the page that are NOT links but look like URLs, then trying to crawl them and reporting them as broken. I've seen a couple of different cases in my environment that seem to trigger this issue. The easiest one to explain is Google Analytics virtual pageview JavaScript calls, where for example you might send a virtual pageview back to GA for clicks on outbound links. So in the source code of your page you would have something like:
onclick="_gaq.push(['_trackPageview', '/outboundclick/www.othersite.com']);"
Although this is obviously not a crawlable link, sure enough Webmaster Tools now reports the following broken page with a 404:
www.mysite.com/outboundclick/www.othersite.com
I've seen other such cases of things that look like URLs but are not actual links being pulled out of the page source and reported as broken links. Has anyone else noticed this? Do 404 instances (in this case false ones) reported by Webmaster Tools impact site quality rankings and SEO? Interesting issue here; I'm looking forward to hearing some people's thoughts on this. Chris
Technical SEO | | cbubinas0 -
Different domains
Firstly, apologies for the very brief question, as I am mainly looking for your thoughts as opposed to specific help with a specific problem. I am working on a site which has two separate domains and, within one domain, two subdomains. The two different sites both have a high PageRank, PR6 each; one is the corporate site and the other is the company blog. There are also two subdomains within the corporate site; again, both have high PR and tons of content. My question is: would it be better to consolidate all the assets under one domain, or is it better to keep the sites separate, from an SEO perspective that is.
Technical SEO | | LiquidTech0 -
Default.aspx and domain name difference
I am getting duplicate page content and duplicate page title errors for www.mydomain.com and www.mydomain.com/default.aspx. I thought that they were the same page, so I'm not sure how to avoid getting the duplicate content and title errors. Thanks for your help!
Technical SEO | | DMacy0 -
Htaccess 301s to 3 different sites
Hi, I'm an htaccess newbie, and I have to redirect and split traffic to three new domains from site A. The original home page has most of the inbound links, so I've set up a 301 that goes to site B, the new corporate domain.
Options +FollowSymLinks
RewriteEngine on
RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]
Brand websites C and D need 301s for their folders in site A, but I have no idea how to write that in relationship to the first redirect, which really is about the home page, contact, and only a few other pages. The URLs are duplicates except for the new domain names. They're all on Linux. Site A is about 150 pages; should I write it by page, or can I do some kind of catch-all (the first 301) plus the two folders? I'd really appreciate any insight you have, especially if you can show me how to write it. Thanks 🙂
Technical SEO | | ellenru0
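Not a definitive answer, but a minimal sketch of how folder-level 301s could sit in front of that catch-all, assuming hypothetical folder names /brandc/ and /brandd/ on site A and placeholder destination domains. The more specific rules have to come before the catch-all so they match first, and the [L] flag stops processing once a rule fires:

Options +FollowSymLinks
RewriteEngine on
# Hypothetical brand folders, redirected to their own new domains first
RewriteRule ^brandc/(.*)$ http://www.brandc-newdomain.com/$1 [R=301,L]
RewriteRule ^brandd/(.*)$ http://www.brandd-newdomain.com/$1 [R=301,L]
# Everything else falls through to the corporate site B
RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]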