Difference between SEOmoz and Webmaster Tools information
-
Hello,
There is an issue that confuses me, and I thought perhaps you could help me shed some light on it.
I have a website that shows 2,549 crawled pages in SEOmoz and 24,542 pages in Webmaster Tools!
Obviously there is some technical issue with the site, but my question is: why the vast difference between what the SEOmoz crawl report and the Webmaster Tools report show?
Thanks!
Guy Cizner
-
Thanks for stepping in everyone, though it looks like we were trying to answer the wrong question. This one is about Roger's crawl of the OP's own site, rather than links indexed in OSE.
Guy, do you have a feel for how many pages SHOULD be in the index? If you only have a couple of thousand pages, then it could be that Google is crawling and indexing some parameters. If you've got 20k+ pages in the index, then Roger isn't finding some things.
Also: are you looking at perhaps just the www.domain subdomain in SEOmoz, while GWT is looking at the entire site? If you had a compact www.domain site, but also had forum.domain and wiki.domain, and GWT was reporting pages for all of the subdomains on domain.com, that would explain things too.
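One quick way to test the subdomain theory is to export the URL lists from both tools and tally pages per hostname. A minimal sketch (the URLs below are hypothetical, not the OP's site; it assumes you have the crawled URLs in a plain list):

```python
from collections import Counter
from urllib.parse import urlsplit

def pages_per_subdomain(urls):
    """Tally crawled URLs by hostname to see which subdomains dominate a report."""
    return Counter(urlsplit(u).hostname for u in urls)

# Toy crawl sample: two tools reporting very different totals would show up
# here as extra hostnames in one tool's export.
counts = pages_per_subdomain([
    "http://www.example.com/",
    "http://www.example.com/about",
    "http://forum.example.com/thread/1",
    "http://wiki.example.com/page",
])
```

If GWT's export shows hostnames that never appear in the SEOmoz export, the count gap is at least partly a scoping difference rather than a crawl problem.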
-
Hello,
Thanks for all the replies.
The pages crawled are part of an SEO campaign I am running.
How is the crawl done when a campaign is defined?
I assume the whole site is being crawled.
Thanks
-
This may also shed some light:
Oct 9, 2012, Keri Morgret, On-site Community Manager at SEOmoz:
Another reason is that we just don't have the same size server farm that Google and Bing have. We could crawl all of Twitter and get nothing else crawled, or we could crawl some of Twitter, and some of the rest of the web. We aren't able to crawl all of the web, and we release a new index about once a month, so that's why you don't see all of your links or see them right away.
However, what we do offer that is different from Google and Bing is that we show you links for sites that are not your own, we add metrics about the trust and authority of the page, etc.
-
The Mozscape index, as brilliant as it is, can in no way compete with the size of the index that Google maintains.
As a result, your WMT report should always show a larger number of pages, links, etc. crawled. It's just bigger.
-
Any of those issues might be the cause, for example incorrect canonicalization that is handled differently by Google and SEOmoz's bot, Roger. Another option is that Google tries really hard to index each and every page of the web, while Roger crawls more selectively, for example only pages above a certain level of authority, or only within a certain number of clicks from the homepage.
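To gauge how much parameterized duplicates could inflate one tool's count, you can normalize an exported URL list by stripping query strings and counting unique pages. A rough sketch (example URLs are hypothetical; real canonicalization also involves rel="canonical" tags and redirects, which this ignores):

```python
from urllib.parse import urlsplit, urlunsplit

def strip_query(url):
    """Drop the query string and fragment, keeping scheme, host, and path."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

# Three crawled URLs that are really one page once parameters are ignored
crawled = [
    "http://example.com/product?color=red",
    "http://example.com/product?color=blue",
    "http://example.com/product",
]
unique_pages = {strip_query(u) for u in crawled}
```

A large gap between `len(crawled)` and `len(unique_pages)` on your real export suggests Google is indexing parameter variations that Roger is collapsing or skipping.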
Related Questions
-
SiteName Attribute Showing in Different Language in SERP
We are currently experiencing issues with our subdomain SiteName. Our parent company's root domain is a Japanese-language site, but we have an English subdomain that serves the United States primarily, and nearly all of the rest of the world, for organic traffic. We have followed the guidelines here: https://developers.google.com/search/docs/appearance/site-names There was a large post on here with many responses, including Googlers weighing in on issues others were having, but it has since been removed. Here is the code in place on our homepage:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Mescius Developer Tools",
  "alternateName": ["Mescius, inc.", "developer.mescius.com"],
  "url": "https://developer.mescius.com"
}
</script>

Unfortunately, the SERP is showing the Japanese equivalent of our parent company's name instead. Even though the relationship between root and subdomain should not be causing this, something is producing this incorrect SiteName, and it is hurting CTR for the subdomain. Has anyone else experienced this and found a fix?
Technical SEO | Evan_Wright
-
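For anyone debugging a similar SiteName issue, a first sanity check is confirming the WebSite JSON-LD from the question actually parses and carries the fields Google documents for site names. A minimal sketch (this verifies syntax and fields only; it says nothing about how Google chooses between a subdomain's markup and the root domain's):

```python
import json

# The WebSite JSON-LD from the question above, pasted as a string
jsonld = """
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Mescius Developer Tools",
  "alternateName": ["Mescius, inc.", "developer.mescius.com"],
  "url": "https://developer.mescius.com"
}
"""

data = json.loads(jsonld)          # raises ValueError if the markup is malformed
site_name = data.get("name")
alternates = data.get("alternateName", [])
```

If this parses cleanly and the fields are right, the problem is on Google's selection side (e.g. preferring the root domain's name), not in the markup itself.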
Sitemap error in Webmaster Tools - 409 error (conflict)
Hey guys, I'm getting this weird error when I submit my sitemap to Google. It says I'm getting a 409 error on my post-sitemap.xml file (https://cleargear.com/post-sitemap.xml). But when I check it, it looks totally fine. I am using Yoast SEO to generate the sitemap.xml file. Has anyone else experienced this? Is this a big deal? If so, does anyone know how to fix it? Thanks
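A 409 often means the server (or a firewall/CDN rule in front of it) responds differently to Google's fetcher than to a browser, so the sitemap can look fine when you open it manually. A small sketch for checking what status a non-browser client actually receives (hypothetical helper; Googlebot's exact request may still differ):

```python
import urllib.request
import urllib.error

def explain_status(code):
    """Map a handful of HTTP status codes to a short description (simplified)."""
    return {
        200: "OK - the sitemap was served normally",
        404: "Not Found",
        409: "Conflict - the server refused the request as-is",
    }.get(code, "other")

def fetch_status(url, timeout=10):
    """Return the status code a plain fetcher sees, which may differ from a browser's."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "sitemap-check/0.1"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code
```

Running `fetch_status` with a minimal User-Agent can reveal bot-blocking rules that a normal browser visit never triggers.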
Technical SEO | Extima-Christian
-
Bing Webmaster Shows Domain without WWW
One of our sites shows thousands of 301 redirects due to the domain without www in Bing Webmaster under the Crawl Information page. It's been like this for a long time. None of the internal pages use the domain without www; we verified this with Screaming Frog. We do have the www preference set in Google Webmaster Tools, but unfortunately Bing doesn't have this option. We also specify the www-preferred URL through structured data, but that still doesn't help. Has anyone had similar problems with Bing, and how did you resolve them?
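Since Bing has no preferred-domain setting, the usual fix is to enforce the www host with a server-side 301 so neither crawler ever sees the bare domain. A hedged sketch for Apache's .htaccess (hypothetical domain; Nginx and IIS have equivalent rewrite mechanisms):

```apache
# Hypothetical example.com rules: 301-redirect the bare domain to www
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

With a single sitewide 301 in place, the bare-domain entries in Bing's crawl report should shrink over time as it recrawls.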
Technical SEO | rkdc
-
Install WordPress in different folders on the same domain
Howdy Mozers! It's a long story, but I have WordPress installed on my root domain "example.com" and in "example.com/folder" as well. Will this affect my SEO? Should I delete WP from my folder, and build my pages from a folder on the root domain, like "example.com/folder1"? Hope I've managed to explain myself properly 🙂 Thanks!
Technical SEO | Fernando_
-
How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
Background: I recently launched a new site and it's performing much better than the old site in terms of bounce rate, page views, pages per session, session duration, and conversions. As expected, sessions, users, and % new sessions are all down, which I'm okay with because the old site had a lot of low-quality traffic going to it. The traffic we have now is much more engaged and targeted. Lastly, the site was built with Squarespace and launched in the middle of August. Question: When reviewing Google Webmaster Tools' Sitemaps section, I noticed it says 57 web pages Submitted, but only 5 Indexed! The sitemap that's submitted seems to be all there. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas? Thanks!!
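Before worrying about indexing, it's worth verifying exactly what the sitemap submits; comparing that list against what Google actually shows indexed narrows down which URLs are stuck. A small sketch parsing a sitemap with the standard namespace (the XML below is a stand-in, not the OP's actual Squarespace sitemap):

```python
import xml.etree.ElementTree as ET

# Sitemaps use this fixed namespace, so element lookups must be qualified
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>"""

root = ET.fromstring(sitemap_xml)
submitted = [loc.text for loc in root.findall(".//sm:loc", NS)]
```

Checking each URL in `submitted` with a `site:` or `info:` query (or the URL inspection tool) shows which of the 57 submitted pages are the 52 still waiting.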
Technical SEO | Nate_D
-
What happens when you send a disavow file in Google Webmaster Tools?
I am just wondering: if you disavow a link in Google Webmaster Tools that points from a certain website, does that hurt the other website's ranking at all? Thanks
Technical SEO | EVERWORLD.ENTERTAIMENT
-
SEOmoz pages error
Hi, I have a problem with SEOmoz: it is saying my website http://www.clearviewtraffic.com has page errors on 19,680 pages. Most of the errors are for duplicate page titles. The website itself doesn't even have 100 pages. Does anyone know how I can fix this? Thanks, Luke
Technical SEO | looktouchfeel
-
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st-15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to confirm that the links no longer exist, and as expected they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors, and that if the URLs keep returning 404, they will eventually be removed automatically. Well, I don't know how many times Google needs to see that 404 in order to drop a URL and link that haven't existed for 18-24 months! Thanks.
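One practical step is to re-fetch every URL in the crawl-errors report and confirm it really returns 404, or switch it to 410 (Gone), which is commonly reported to drop out of Google's error lists faster. A sketch of the filtering step, with status codes filled in by whatever fetcher you use (the values below are hypothetical):

```python
def confirmed_gone(statuses):
    """Keep only URLs whose current response really is 404 or 410.

    410 (Gone) is often treated as a stronger removal signal than 404.
    """
    return [url for url, code in statuses.items() if code in (404, 410)]

# Hypothetical results from re-fetching each URL in the crawl-errors report
report = {"/old-page": 404, "/moved-page": 301, "/retired-page": 410, "/live-page": 200}
gone = confirmed_gone(report)
```

Anything in the report that does not come back as 404/410 (a soft 404 serving 200, or a redirect) is what keeps Google re-listing the error, so those URLs are the ones worth fixing first.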
Technical SEO | RiceMedia