Is Google Webmaster Tools Index Status Completely Correct?
-
So I was thrilled when Webmaster Tools released the little graph showing how many of your pages are indexed. However, I noticed that for one of my sites it has actually been going down over the past year. I've only been doing the SEO on the site for a few months, so it wasn't anything I did.
The chart is attached to this post.
Now here's the funny part. I haven't really noticed anything out of the ordinary, like keyword rankings dropping off. I also tested the most recent page I put up: three days after I posted it, I could find it in a Google search. Shouldn't that mean it's indexed? I can also find every other page I've posted in the last few months.
Another oddity is that I submitted a sitemap a while ago when the site was only 22 pages. The sitemap index count says 20 of those pages are indexed.
The chart only says that there are 3 indexed pages right now. However, I can clearly find dozens of pages in Google searches. Is there something I'm missing? Is my chart for this website broken? Should I report this to Google? Has anyone had a similar issue?
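For anyone who wants to cross-check the Index Status chart by hand, one low-tech approach is to pull the URL list out of the sitemap and spot-check each URL with a site: query. This is just an illustrative sketch, not a Moz or Google tool; it shows the sitemap-parsing half using Python's standard library and a made-up three-page sitemap:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Toy sitemap standing in for the 22-page sitemap in the question.
example = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
  <url><loc>http://example.com/blog/post-1</loc></url>
</urlset>"""

urls = sitemap_urls(example)
print(len(urls), "URLs in sitemap")
# Each URL can then be checked manually with a site: query; comparing
# that count against the Index Status chart shows how far apart they are.
```

Spot-checking each URL this way is what the question above already did informally; the sketch just makes the sitemap side of the comparison repeatable.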
-
Thanks for letting me know that I'm not the only one seeing this. In Analytics, I've got a nice upward line of traffic, so as you said, I'm not too worried about it. Maybe someone else will have more knowledge about the graph itself.
-
Hey Andrew,
Yeah, I've seen the same kinds of issues; the graph seems to be somewhat inexact, and so far I haven't been able to find an explanation. As for the large drop-off, I saw something similar on one of my sites shortly after correcting a duplicate content issue.
However, like you said, you need to watch for things out of the ordinary. For now, I'm not worrying too much about the graph. It is interesting, but I'll pay more attention to the graph in Google Analytics telling me how much traffic I have from Google. If I see a large drop there, then I'll worry.
Matthew
Related Questions
-
Google Tag Manager for cross-domain tracking
Does anybody have experience setting up Google Tag Manager to contain the Analytics script, including cross-domain tracking? We have a marketing website (.com / .com.br) and an application running on a subdomain, but we have always had some difficulty getting cross-domain tracking to work. It would be great to exchange experiences with fellow Mozzers.
Reporting & Analytics | jorisbrabants1 -
In Google Analytics, what is the correct format for excluding traffic from single IP address?
In the past, when we've set up a filter to exclude a single IP address, we just entered the IP address normally. For example, 64.68.82.164. However, I was researching how to exclude a range of IP addresses and found out that I might be using the wrong format to filter a single IP address. Is it supposed to be 64.68.82.164 or 64\.68\.82\.164? Will it still work if it was entered without the escape characters? Thanks in advance, Moz community, for your assistance!
Reporting & Analytics | peteboyd0 -
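The format matters in the question above because GA filter patterns are interpreted as regular expressions, where an unescaped dot matches any single character. The same semantics can be demonstrated with Python's re module (the stray-match string below is purely an illustration):

```python
import re

ip = "64.68.82.164"

unescaped = re.compile(r"64.68.82.164")   # "." matches ANY character
escaped = re.compile(r"64\.68\.82\.164")  # "\." matches a literal dot only

# Both patterns match the intended address...
assert unescaped.fullmatch(ip)
assert escaped.fullmatch(ip)

# ...but the unescaped pattern also matches strings it shouldn't:
assert unescaped.fullmatch("64a68b82c164")
assert escaped.fullmatch("64a68b82c164") is None
```

Which is why the old, unescaped filter still appeared to work: it does match the intended IP, it just isn't as strict as the escaped form.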
Identifying Bots in Google Analytics
Hi there, While you can now filter out bots and spiders in Google Analytics, I'm interested in how you identify bots and spiders in the first place. For example, it used to be thought that Googlebot wouldn't appear in GA because it 'couldn't process JavaScript', but now that Google has announced new developments in its crawler's ability to interpret JavaScript and CSS, that argument isn't as cut and dried. I'm not suggesting Googlebot appears in Google Analytics, but I am saying you can't argue that it won't appear solely because it can't interpret JavaScript. So, I'm interested in which metrics you use to identify a bot. For me, the mix of Users > Browser and Users > Operating System Version is still quite handy, but is it possible to identify individual bots and spiders within Google Analytics? And would Googlebot appear?
Reporting & Analytics | ecommercebc0 -
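Since GA doesn't expose the raw user-agent string, many people answer the question above by falling back to server logs. Here is a rough sketch of substring-based user-agent classification; the token list and sample hits are illustrative, not exhaustive, and since user agents can be spoofed, reverse-DNS verification remains the only reliable confirmation for Googlebot:

```python
# Heuristic bot detection from raw server-log user-agent strings.
# The token list below is a common starting point, not a complete one.
BOT_TOKENS = ("googlebot", "bingbot", "slurp", "spider", "crawler", "bot/")

def looks_like_bot(user_agent):
    """True if the user agent contains a known bot token (case-insensitive)."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

hits = [
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (Windows NT 6.1; rv:31.0) Gecko/20100101 Firefox/31.0",
    "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
]

for ua in hits:
    print("bot" if looks_like_bot(ua) else "human", "-", ua[:40])
```

A hit flagged this way can then be cross-referenced against the Browser and Operating System dimensions mentioned in the question.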
Does Google's encryption of keyword data impact SEO revenue reporting in Google Analytics?
Hi there, I know Google has been encrypting SEO keyword data, which it rolled out in September 2013. My question is: will this impact SEO revenue figures reported in Google Analytics? I have been monitoring SEO revenue figures for a client and they are significantly down even though rankings have not dropped. Is this because of Google's encryption? Could there be another reason? Many thanks!
Reporting & Analytics | CayenneRed890 -
Not many pages being indexed on Google
Hi, I am entering site:www.mysite.com into Google to see the pages listed, and the figure Google comes back with is much lower than the actual page count. I have no crawler warnings, etc. What could the problem be? Thanks
Reporting & Analytics | acumenadagency0 -
Google Analytics reality check?
Looking back over a 9-month period tracking analytics with Clicky (getclicky), my site showed a 29% bounce rate, with only about a quarter of visitors spending 1 minute or less on my site. I've recently implemented GA (and removed the old Clicky code), and although traffic is strong, my site now shows a bounce rate of about 82%. Engagement stats also show that 82% of visitors spend between 0-10 seconds on my site. My site is built on WordPress, and the GA tracking code wasn't placed directly in the footer; my developer built a field in the admin area to insert the UA number, which automatically adds the code to all pages. I've checked, and the tracking code seems to appear on all pages. I took a look at AWStats. It corroborates GA and says that 80% of visitors are spending 0-30 seconds on the site. Potential issues/clues: browser tests show small loading problems in Internet Explorer 7, 8, and 9 (the phone number at the top of the header loads on the wrong side of the page) and major issues in Internet Explorer 6 (the site doesn't load at all in IE6). The thing is, no one who uses IE6 is coming to the site. Second, the site gets a grade of C in YSlow; it's not lightning fast at the moment. GA shows an average page load of 2.4 seconds, but I don't think either of these issues should cause an 82% 0-10-second engagement number. My site is content-rich and content-focused, with very minimal advertising, and content is accessible well above the fold. My question: does the fact that AWStats and GA agree mean those numbers are accurate, or is there a bug I should be looking for? How do I explain the Clicky numbers?
Reporting & Analytics | JSOC0 -
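One thing worth knowing when comparing the numbers in the question above, offered as background rather than a diagnosis: GA derives time on page from the gap between successive pageview hits, so a single-page visit has no second timestamp and is recorded as 0 seconds no matter how long the visitor actually read. On a high-bounce site, that alone can swell the 0-10-second bucket. A toy sketch of that arithmetic, with invented session timestamps:

```python
def session_engagement(pageview_times):
    """Approximate GA-style measured time for one session.

    pageview_times: sorted timestamps (in seconds) of pageviews in a session.
    GA can only measure time between hits, so the last (or only) pageview
    contributes nothing to the measured duration.
    """
    if len(pageview_times) < 2:
        return 0  # a bounce: no second hit, so measured time is 0s
    return pageview_times[-1] - pageview_times[0]

sessions = [
    [0],            # bounce -> 0s, even if the visitor read for 5 minutes
    [0, 95],        # two pageviews, 95s apart
    [0, 40, 130],   # three pageviews spanning 130s
]

durations = [session_engagement(s) for s in sessions]
print(durations)  # [0, 95, 130]

bounce_rate = sum(1 for s in sessions if len(s) == 1) / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")  # bounce rate: 33%
```

So an 82% bounce rate and an 82% 0-10-second bucket are largely the same sessions counted twice, which is why the two stats move together.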
Backlinks on Google Webmaster Tools
As I was reviewing my Google Webmaster Tools, I noticed a major drop in the number of backlinks. It used to show over 4,500 links to my site, yet the other day it dropped to 41. None of the directories I've submitted to are showing, nor any blog comments I've posted. Since then, my SEO traffic has started to drop for some keywords. Does anyone know why?
Reporting & Analytics | MikeAndres0 -
Google Analytics - my continuing adventures
Hello, I'd appreciate views on the various metrics I'm struggling with in GA. I've run two different reports that produce two different outputs: 1. In Standard Reporting, you can report in Traffic Sources on Organic Search by Keyword, which returns the number of Visits. 2. In Custom Reporting, you can define the Keyword dimension with the Organic Searches metric, which returns the number of Organic Searches. These return two different numbers. For example, over the last month for a given term, report 1 returns 77,306 visits whilst report 2 returns 52,589 organic searches. I have found some definitions: "Visits represent the number of individual sessions initiated by all the visitors to your site." "Organic Searches: the number of organic searches that happened within a session. This metric is search engine agnostic." My understanding of these definitions is that report 2 should return a larger value than report 1, rather than what is happening (report 1 returns the greater value). Does anyone have a better understanding of what these metrics mean and how they relate? And which is more useful? Thanks, Neil
Reporting & Analytics | mccormackmorrison0
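Without claiming to settle the question above, here is one mechanism that could produce Visits > Organic Searches: Visits counts sessions attributed to a keyword, while Organic Searches counts search events inside sessions, and classic GA's last-non-direct-click attribution credits a returning direct visit to the earlier keyword even though no search occurred in that session. A toy tally under that assumption (the session data is invented):

```python
# Toy sessions: each dict records the keyword the session is attributed to
# and how many organic-search events were counted inside it. The attribution
# rule (last non-direct click persists across direct returns) is an
# assumption about classic GA behaviour, not something taken from the thread.
sessions = [
    {"keyword": "blue widgets", "organic_searches": 1},
    {"keyword": "blue widgets", "organic_searches": 1},
    # Returning direct visit: still attributed to the earlier keyword,
    # but no search happened inside this session.
    {"keyword": "blue widgets", "organic_searches": 0},
]

visits = sum(1 for s in sessions if s["keyword"] == "blue widgets")
organic = sum(s["organic_searches"] for s in sessions
              if s["keyword"] == "blue widgets")

print(visits, organic)  # 3 visits vs 2 organic searches
```

Under that assumption, the keyword report (report 1) can legitimately exceed the Organic Searches metric (report 2), which matches the direction of the 77,306 vs 52,589 gap.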