Difference between SEOmoz and Webmaster Tools information
-
Hello,
There is an issue that confuses me, and I thought perhaps you would be able to help me shed some light on it.
I have a website that shows 2,549 crawled pages in SEOmoz and 24,542 pages in Webmaster Tools!
Obviously there is some technical issue with the site, but my question is: why the vast difference between what the SEOmoz crawl report and the Webmaster Tools report show?
Thanks!
Guy Cizner
-
Thanks for stepping in, everyone, though it looks like we were trying to answer the wrong question. This one is about Roger's crawl of the OP's own site, rather than links indexed in OSE (Open Site Explorer).
Guy, do you have a feel for how many pages SHOULD be in the index? If you only have a couple of thousand pages, then it could be that Google is crawling and indexing some parameters. If you've got 20k+ pages in the index, then Roger isn't finding some things.
Also: are you perhaps looking at just the www.domain subdomain in SEOmoz while GWT is looking at the entire site? If you had a compact www.domain site but also had forum.domain and wiki.domain, and GWT was reporting pages for all of the subdomains on domain.com, that would explain things too.
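If you want to see where GWT's larger count comes from, one quick check is to group an exported URL list (for example, from your sitemaps or a GWT download) by hostname and see how much each subdomain contributes. A minimal sketch in Python; the example URLs below are hypothetical placeholders:

```python
from urllib.parse import urlparse
from collections import Counter

def count_by_subdomain(urls):
    """Tally URLs by hostname so you can see which subdomains
    contribute to the total page count."""
    return Counter(urlparse(u).netloc for u in urls)

# Hypothetical export; in practice read these from a sitemap or GWT download.
urls = [
    "http://www.example.com/",
    "http://www.example.com/about",
    "http://forum.example.com/thread/1",
    "http://wiki.example.com/page",
]
print(count_by_subdomain(urls))
```

If the counts are dominated by a subdomain you never pointed SEOmoz at, that alone can explain a 10x gap between the two reports.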
-
Hello,
Thanks for all the replies.
The crawled pages are part of an SEO campaign I am running.
How is the crawl done when a campaign is defined?
I assume the whole site is being crawled.
Thanks
-
This may also shed some light:
Keri Morgret, On-site Community Manager at SEOmoz (Oct 9, 2012):
Another reason is that we just don't have the same size server farm that Google and Bing have. We could crawl all of Twitter and get nothing else crawled, or we could crawl some of Twitter, and some of the rest of the web. We aren't able to crawl all of the web, and we release a new index about once a month, so that's why you don't see all of your links or see them right away.
However, what we do offer that is different from Google and Bing is that we show you links for sites that are not your own, we add metrics about the trust and authority of the page, etc.
-
The Mozscape index, as brilliant as it is, cannot compete with the size of the index that Google can handle.
As a result, your WMT report will almost always show a larger number of pages, links, etc. crawled. It's just bigger.
-
Any of those issues might be the cause. For example, incorrect canonicalization may be picked up differently by Google and the SEOmoz bot, Roger. Another option is that Google tries really hard to index each and every page of the web, while Roger crawls the web more restrictively, for example only crawling pages above a certain level of authority, or only a certain number of clicks from the homepage.
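One way to spot the canonicalization mismatches described above is to fetch a sample of your pages and compare each page's rel=canonical URL against the URL you actually requested; crawlers that honor the tag differently will count such pages differently. A minimal sketch using only the Python standard library (the `check_page` helper and any URLs you feed it are assumptions for illustration):

```python
import urllib.request
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def extract_canonical(html):
    p = CanonicalParser()
    p.feed(html)
    return p.canonical

def check_page(url):
    """Fetch a page and report whether its canonical tag points elsewhere."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    canonical = extract_canonical(html)
    if canonical and canonical != url:
        print(f"{url} canonicalizes to {canonical}")
    return canonical
```

Pages whose canonical points at a different URL than the one crawled are exactly the kind of page one crawler may count and another may fold away.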
Related Questions
-
Different breadcrumbs for each product page
Hi all, I have a question related to breadcrumbs. We have an e-commerce site, and there is a difference in the breadcrumb when navigating to our products vs. browsing directly to the URL of the product. When you navigate to the product, the breadcrumb looks like this (also in the source code):
Home > Sand > Sandpit sand > Bigbag Sandpit sand type xyz
When you visit the product URL directly, the breadcrumb looks like this (also in the source code):
Home > Bigbag Sandpit sand type xyz
That looks confusing for a search engine to me, since it is unclear what the site's structure/hierarchy is like (and for a user too, of course). Is that true? If yes, does this have a big direct negative impact on SEO? Thanks in advance!
-
Trying to mark multiple errors as fixed in Webmaster Tools
We have 44,249 errors, and I have set up 301 redirects for most of the URLs. I know exactly which links are correctly redirected; my problem is that I don't want to mark each one as fixed individually. Is there a way to upload a URL list to Webmaster Tools so that it automatically marks them as fixed based on the list?
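As far as I know, Webmaster Tools has no bulk "mark as fixed" upload, but you can at least verify the redirects in bulk before marking anything, so fixed URLs stop reappearing as errors. A hedged sketch using only the Python standard library; `classify_redirect` and `check` are hypothetical helpers, not part of any Google tool:

```python
import urllib.request
import urllib.error

def classify_redirect(status, location, expected_target):
    """Classify a crawl-error URL: 'fixed' if it permanently redirects to
    the expected target, 'redirected' if it redirects elsewhere,
    'broken' for anything else (404, 500, ...)."""
    if status in (301, 308) and location == expected_target:
        return "fixed"
    if status in (301, 302, 307, 308):
        return "redirected"
    return "broken"

def check(url, expected_target):
    """Fetch a single URL without following redirects and classify it."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        # Returning None stops urllib from following the redirect,
        # so the 3xx surfaces as an HTTPError we can inspect.
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None

    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        status, location = resp.status, None
    except urllib.error.HTTPError as e:
        status, location = e.code, e.headers.get("Location")
    return classify_redirect(status, location, expected_target)
```

Run `check` over your redirect map; anything not classified as "fixed" is worth a second look before you touch the error report at all.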
-
What are the pros and cons of changing my domain from .com to .us in Google Webmaster Tools?
Hi, I'm migrating my site from a .com domain to local country domains. I'm wondering what to consider if I choose to move the .com to a .us domain. What should I consider before deciding? BR
-
Rank Tracker and Rankings report differ
Hi all. Is this normal? I have set up a campaign for a site, tracking a variety of keywords. For one of them, quite an important keyword I've been working on, I've moved down one step in my Rankings report. This is first of all weird because my on-page optimization went from grade C to A, and even weirder because if I run the Rank Tracker tool on the keyword and the URL, I see that I've moved up 6 steps, to 15 in Google. Kinda makes it hard to grasp whether I'm on the right path or not! (I've checked, and they are both results on google.dk, same URL and same keyword, exact match.)
-
Please recommend a tool to list pages on my site.
I have taken a major hit from the latest update. The site has been online for 10 years, white-hat SEO all the way, but I do have some legacy pages where I would duplicate the title or the description on a new page. Things are just unorganized currently, and I'm trying to find the best approach to organizing what I already have as well as tracking new content. I would like a tool that would extract a list of my current pages, their title tags, and their descriptions into an Excel file. Not sure how the pros organize the SEO on a site, but my bright idea is that with a large Excel file listing the pages I can detect duplicate info. The site only has about 300 pages, just regular PHP pages, no CMS. Thanks in advance!
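If you don't mind running a small script instead of a dedicated tool, the extraction described here (URL, title, meta description, duplicate flag) fits in a few lines of Python that write a CSV you can open directly in Excel. A minimal sketch; in practice you would feed `write_report` pages fetched from your own site, and the sample HTML below is purely illustrative:

```python
import csv
from collections import Counter
from html.parser import HTMLParser

class MetaParser(HTMLParser):
    """Pulls the <title> text and meta description out of one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def extract_meta(html):
    p = MetaParser()
    p.feed(html)
    return p.title.strip(), p.description.strip()

def write_report(pages, path="pages.csv"):
    """pages: iterable of (url, html) pairs. Writes URL, title,
    description, and a flag for titles that appear more than once."""
    rows = [(url, *extract_meta(html)) for url, html in pages]
    title_counts = Counter(title for _, title, _ in rows)
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["url", "title", "description", "duplicate_title"])
        for url, title, desc in rows:
            w.writerow([url, title, desc, title_counts[title] > 1])
```

For ~300 plain PHP pages, the simplest input is a hand-maintained URL list fetched one by one; sorting the resulting CSV by title makes the duplicates jump out.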
-
Having some weird crawl issues in Google Webmaster Tools
I am seeing a large number of errors in the Not Found section that are linked to old URLs that haven't been used for 4 years. Some of the URLs being linked to are not even in the structure we used to use for URLs. Nevertheless, Google is saying they now 404, and there are hundreds of them. I know the best way to attack this is to 301 them, but I was wondering why all of these errors would be popping up. I can't find anything in the Google index when searching for the link in quotes, and in Webmaster Tools it shows "unavailable" for where these are being linked from. Any help would be awesome!
-
Google Webmaster Tools shows 37K not found errors
Hello, we are using Joomla as our CMS. Months ago we used a component to create friendly URLs, and lots of them got indexed by Google. While testing the component, we created three different types of URL. The problem now is that all of these test URLs are showing in Google Webmaster Tools as 404 errors: 37,309 not-found pages, and the number is increasing every day. What do you suggest to fix this? Regards.
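Since the test URLs follow known patterns, one common approach is to handle them in bulk at the .htaccess level (Joomla's stock .htaccess already uses mod_rewrite) rather than fixing tens of thousands of URLs individually. A hedged sketch; the patterns below are hypothetical and must be adjusted to the actual URL schemes the component produced:

```apache
RewriteEngine On

# Hypothetical pattern: 301 old test URLs like /old-test/123-some-article
# to their current equivalents at /some-article.
RewriteRule ^old-test/\d+-(.*)$ /$1 [R=301,L]

# For a URL scheme that has no current equivalent, serve 410 Gone;
# Google tends to drop Gone pages faster than plain 404s.
RewriteRule ^test2/ - [G]
```

Whichever URLs map to real pages should get 301s; the rest are better served with 410 than left to accumulate as 404s in the report.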