Page Count in Webmaster Tools Index Status Versus Page Count in Webmaster Tools Sitemap
-
Greeting MOZ Community:
I run www.nyc-officespace-leader.com, a real estate website in New York City.
The page count in Google Webmaster Tools Index status for our site is 850. The page count in our Webmaster Tools Sitemap is 637. Why is there a discrepancy between the two?
What does the Google Webmaster Tools Index represent? If we filed a removal request for pages we did not want indexed, will these pages still show in the Google Webmaster Tools page count even though they no longer display in search results? The number of pages displayed in our Google Webmaster Tools Index remains at about 850 despite the removal request. Before a site upgrade in June, the number of URLs in the Webmaster Tools Index and the Webmaster Tools Sitemap were almost the same.
I am concerned that page bloat has something to do with a recent drop in ranking.
Thanks everyone!!
Alan
-
Using the noindex,follow combination is a form of advanced page sculpting, which is not truly an SEO best practice.
Here's why:
If you deem a page not worthy of being in the Google index, attempting to say "it's not worthy of indexing, but the links on it are worthy" is a mixed message.
Links to those other pages should already exist from pages you do want indexed.
By doing noindex,follow, you increase the internal link counts in artificial ways.
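For reference, the two options being weighed here are set with a robots meta tag in a page's `<head>`; an illustrative snippet (not taken from Alan's actual site):

```html
<!-- Keep the page out of Google's index but let crawlers follow its links -->
<meta name="robots" content="noindex,follow">

<!-- Keep the page out of the index AND tell crawlers not to follow its links -->
<meta name="robots" content="noindex,nofollow">
```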
-
Hi Alan:
That is very clear, thanks!!
For pages with thin content, why the "nofollow" in addition to the "noindex"? My SEO firm was also of the opinion that the thin content pages should be "noindexed"; however, they did not suggest a "nofollow" as well.
So I think I will work on improving site speed, enhancing content, and no-indexing (and no-following?) thin pages. If that does not produce an improvement, I guess I will have to consider alternatives.
Thanks,
Alan -
As I already communicated, these are issues that MAY be causing your problems. Without direct access to Google's algorithms, there is zero guarantee that anyone could absolutely say with 100% certainty exactly what impact they are having. And without a full audit, there is no way to know what other problems you have.
Having said that, proper SEO best practices dictate that any major SEO flaws you know exist should be cleaned up / fixed. So if two thirds of your listings have thin content, the best suggestion would be to add much more content to each of them (unique, highly relevant, trustworthy, and helpful), or to consider a "noindex,nofollow" on those specific pages.
The problem then is: if you noindex,nofollow that many pages, what do you have left in terms of overall site scale that Google would find worthy of high rankings? How big are your competitors? Taking away very thin pages helps reduce "low quality" signals, yet if there isn't a volume of other "high quality" content, you still won't solve all your problems most of the time.
200-250 words is NOT considered a strong volume of content in most cases. These days the target is typically in the 600+ word range, though that also depends on the majority of the competition for that type of content in that specific market.
Site speed is also something best practices dictate should be as efficient as possible, so if it's slow even intermittently, that would be another thing to definitely work on.
-
Hi Alan:
About 220 of the 305 listing pages have thin content, meaning less than 100 words.
Is that likely to have triggered a Panda 4.0 penalty in late May? If I add content to those pages or no-index them, could that reverse the penalty, if it exists? Also, my building pages contain 200-250 words. Is that considered "thin"? They are less geared toward the needs of tenants leasing space and contain historical information. I intend to enhance them and display listings on them. Do you think that could help?
Do you think the site speed could be a major factor impacting performance on my site? If so, I can invest in improving speed.
Thanks, Alan
-
Thanks for the GA data. There's very little traffic to the site, so Google isn't able to collect accurate page speed data consistently every day.
Note however, that back around July 6th, the site-wide average was almost 40 seconds a page. That's extremely slow. Then on the 17th, it was up around 16 seconds site-wide. So even though the little bit of data the rest of the month shows much faster speeds, those are definitely not good.
I honestly don't know however, given the very small data set, what impact site speed is having on the site. And there's just no way to know how it's impacting the site compared to other problems.
Next, thin content pages: what percentage of the listings have this problem? When I go to a sample listing such as this one, I see almost no content. If a significant number of your listings are this severely thin, that could well be a major problem.
Again though, I don't believe in randomly looking at one, two or even a few individual things as a valid basis for making a wild guess as to exact causes. SEO is not rocket science, however it is computer science. It's complex and hundreds of main factors are involved.
-
Hi Alan:
Interesting tool, URIValet.com; I had never heard of it before.
I reviewed site speed in Google Analytics and it seems that download speeds are intermittently very slow. According to "Site Speed Timings" (see attached), there has been a drop in download speed.
Is download speed a potentially more significant problem than the unknown 175 URLs?
Also, the listings do not appear elsewhere on the web, but many of them have light content. The call to action at the end of each listing is somewhat repetitive. I plan on either no-indexing listings with less than 100 words or adding to their content. The total number of listing URLs is 310. There are also 150 short building write-up URLs (like: http://www.nyc-officespace-leader.com/metropolitan-life-tower). These don't have more than 150 words of content. Could they be contributing to the issue?
Is the load time for the URLs on this site so slow that it could be affecting ranking?
Thanks,
Alan -
It would require a developer to examine the structure of the site and how pages are generated, do an inventory audit of the pages generated, and then match that against the sitemap file. If a large number of pages are duplicate content, or very thin on content, that could be a contributing factor. Since there are fewer than 1,000 pages indexed in Google, I don't think 175 would be enough by itself as a single factor.
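A developer could start that inventory audit with a short script that diffs the sitemap against a crawler's URL list. A minimal sketch, assuming the sitemap XML and the crawled URL set are stand-ins for the real data (e.g. a Screaming Frog export), not Alan's actual site:

```python
import xml.etree.ElementTree as ET

# Stand-in sitemap; in practice you would fetch the live /sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/listings/38-broad-street</loc></url>
  <url><loc>http://example.com/listings/113-west-18th-street</loc></url>
</urlset>"""

# Stand-in for URLs discovered by a crawler
crawled_urls = {
    "http://example.com/listings/38-broad-street",
    "http://example.com/113-west-18th-street",          # flawed URL pattern
    "http://example.com/listings/113-west-18th-street",
}

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)
sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

# Crawled but missing from the sitemap: candidates for the "extra" indexed pages
extra = sorted(crawled_urls - sitemap_urls)
# In the sitemap but never crawled: possible orphaned or stale entries
stale = sorted(sitemap_urls - crawled_urls)

print("Crawled but not in sitemap:", extra)
print("In sitemap but not crawled:", stale)
```

The first set difference is the one relevant to the mystery 175 URLs: anything Google can crawl that the sitemap never declared.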
There are many reasons that could be causing your problem. Overall quality is another possible factor. In a test I ran just now at URIValet.com, the page processing speed for the home page in the 1.5 mbps emulator was 13 seconds. Since Google has an ideal of under 3 seconds, if you have serious site-wide processing issues, that could also be a contributing factor. A test of a different page came back at 6 seconds, so this isn't necessarily a site-wide problem, and it may even be intermittent.
Yet if there are intermittent times when speeds are even slower, then yes, that could well be a problem that needs fixing.
So many other possible issues exist. Are the property listings anywhere else on the web, or is the content you have on them exclusive to your site?
What about your link profile? Is it questionable?
Without a full blown audit it's a guess as to what the cause of your visibility drop problems are.
-
Hi Alan:
Your hypothesis regarding the URL structure is interesting. But in this case two of the URLs represent buildings and the one with "/listings/" represents a listing. So that seems OK.
You also mention the possibility that there are URLs getting indexed by Google that do not appear in the sitemap, i.e. that there is a sitemap issue with the site. How could I determine this?
Could the additional 175 URLs that have appeared in the last two months contribute to a drop in ranking?
I am completely stumped on this issue and have been harassing the MOZ community for two months. If you could help get to the bottom of this I would be most grateful.
Thanks, Alan
-
Hi Keri:
OK. I will keep that in mind moving forward. I did not realize the duplication.
If a question does not get answered are users allowed to repost?
Thanks,
Alan
-
Hi Alan:
Thanks for your response. Actually the 1st and 3rd URLs are for buildings rather than listings, so they are formatted correctly. All listings contain "/listings/". So I think, though I am not an expert, that the URL structure is OK.
Thanks,
Alan -
There are many reasons this can be happening.
One cause is where more URLs exist than your sitemap might even include. So the question then is whether the sitemap file is accurate and includes all the pages you want indexed.
Sometimes it's a coding or information architecture flaw, where the same content can be found multiple ways.
Doing a random check, I found you have listings showing up in three different ways:
- http://www.nyc-officespace-leader.com/listings/38-broad-street-between-beaver--manhattan-new-york
- http://www.nyc-officespace-leader.com/113-133-west-18th-street
- http://www.nyc-officespace-leader.com/inquire-about-the-ladders-137-varick-street-to-rent-office-space-in-ny
See those? One has the address as a sub-page beneath "/listings/" the 2nd version does not, and the 3rd URL is entirely different altogether. There should only be one URL structure for all property listings so this would cause me to wonder whether you have properties showing up with two different URLs.
I didn't find duplication, yet it's a flawed URL issue that leaves me wondering if it's a contributing factor.
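One quick way to surface this kind of inconsistency is to bucket crawled URLs by their path pattern and count each bucket. A rough sketch, using the three example URLs above in simplified, hypothetical form:

```python
import re
from collections import Counter

urls = [
    "http://www.nyc-officespace-leader.com/listings/38-broad-street",
    "http://www.nyc-officespace-leader.com/113-133-west-18th-street",
    "http://www.nyc-officespace-leader.com/inquire-about-137-varick-street",
]

def url_pattern(url: str) -> str:
    """Classify a URL by its first path segment."""
    path = re.sub(r"^https?://[^/]+", "", url)
    segments = [s for s in path.split("/") if s]
    if len(segments) > 1:
        return f"/{segments[0]}/*"      # e.g. /listings/*
    return "/<slug>"                    # top-level page

patterns = Counter(url_pattern(u) for u in urls)
print(patterns)
```

If property listings all belong under one pattern, any count above zero in a second bucket flags pages worth investigating for duplicate URLs.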
This is just scratching the surface of possibilities. I did check for blog tags and blog date archives; however, none of those are indexed, so they're not a cause based on my preliminary evaluation.
-
Noindexed pages should not appear in your "Index Status". I could be wrong, but it doesn't make sense for a page to appear there if it is noindexed.
Doing a site:www.nyc-officespace-leader.com search, I get 849 results, which seems normal to me. You would probably have to scrutinize your sitemap instead; sitemaps don't always pull in all the URLs, depending on how you generate them.
Based on Screaming Frog, the site has about 860 pages and ~200 noindexed pages. Your Index Status may update eventually.
It's working as expected anyway: http://www.nyc-officespace-leader.com/blog/tag/manhattan-office-space does not show up in the SERPs. I wouldn't treat Index Status as definitive, but more as directional.
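To spot-check whether a given page really carries a noindex directive, you can scan its HTML for the robots meta tag. A minimal sketch using only the standard library; the page source here is a stand-in, and a complete check would also look at the `X-Robots-Tag` response header:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def is_noindexed(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# Stand-in page source; in practice, fetch the live page first
page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(page))  # True
```

Running this over the ~200 noindexed URLs from a crawl export would confirm the directive is actually being served, independent of what Index Status reports.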
-
Thanks for your response.
I am very suspicious that something is amiss. The number of URLs in MOZ's crawl of our site is about 850, almost exactly the same as the Webmaster Tools Index count. This 850 includes no-index pages.
Is it normal for Google to show the total number of pages in the Webmaster Tools Index, even if they are no-indexed?
I would upload the Excel file of the MOZ crawl but I don't know how to do so.
Thanks,
Alan
-
It's best to just ask the same question once, and clarify if needed in the question itself. This seems really similar to the question you asked at http://moz.com/community/q/difference-in-number-of-urls-in-crawl-sitemaps-index-status-in-webmaster-tools-normal, unless I'm missing something.
-
Index status is how many pages Google has indexed of your site.
A sitemap is different: if your site has pages that are too deep for Googlebot to find on its own, a sitemap directs it to pages it won't necessarily discover by crawling.
In your case Google indexed more pages than the amount of pages in your sitemap, which is absolutely normal.
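For context, a sitemap is just an XML file listing the URLs you want crawled; a minimal illustrative example (URL and date are placeholders, not from the site in question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/listings/38-broad-street</loc>
    <lastmod>2014-07-01</lastmod>
  </url>
</urlset>
```

Google treats it as a hint, not a boundary: anything it discovers by crawling links can be indexed whether or not it appears in this file, which is why Index Status can legitimately exceed the sitemap count.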