Page Count in Webmaster Tools Index Status Versus Page Count in Webmaster Tools Sitemap
-
Greetings, Moz Community:
I run www.nyc-officespace-leader.com, a real estate website in New York City.
The page count in Google Webmaster Tools Index status for our site is 850. The page count in our Webmaster Tools Sitemap is 637. Why is there a discrepancy between the two?
What does the Google Webmaster Tools Index Status figure represent? If we filed a removal request for pages we did not want indexed, will those pages still show in the Webmaster Tools page count even though they no longer display in search results? The number of pages in our Index Status remains at about 850 despite the removal request. Before a site upgrade in June, the counts in the Index Status report and the Webmaster Tools sitemap were almost the same.
I am concerned that page bloat has something to do with a recent drop in ranking.
Thanks everyone!!
Alan
-
Using the noindex,follow combination is a form of advanced page sculpting, which is not truly an SEO best practice.
Here's why:
If you deem a page not worthy of being in the Google index, attempting to say "it's not worthy of indexing, but the links on it are worthy" is a mixed message.
Links to those other pages should already exist from pages you do want indexed.
By doing noindex,follow, you increase the internal link counts in artificial ways.
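As a rough illustration of that check (just a sketch, not a recommendation of tooling): the script below gathers the internal links on a page you're considering noindexing and reports which of them aren't already linked from pages you do want indexed. The example URLs are placeholders, and it assumes the requests and BeautifulSoup libraries are available.

```python
# Sketch only: compare the internal links on a noindex candidate against the
# links already present on pages you want indexed. The site root and the
# example URLs below are placeholders -- substitute your own.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "http://www.nyc-officespace-leader.com"

def internal_links(url):
    """Return the set of same-host links found on a page."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    links = set()
    for a in soup.find_all("a", href=True):
        absolute = urljoin(url, a["href"]).split("#")[0]
        if urlparse(absolute).netloc == urlparse(SITE).netloc:
            links.add(absolute)
    return links

noindex_candidate = SITE + "/some-thin-page"        # hypothetical URL
indexed_pages = [SITE + "/", SITE + "/listings"]    # pages you do want indexed

already_covered = set()
for page in indexed_pages:
    already_covered |= internal_links(page)

only_on_candidate = internal_links(noindex_candidate) - already_covered
print("Links that only exist on the noindex candidate:")
for link in sorted(only_on_candidate):
    print(" ", link)
```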
-
Hi Alan:
That is very clear, thanks!!
For pages with thin content, why the "nofollow" in addition to the "noindex"? My SEO firm was also of the opinion that the thin-content pages should be "noindexed"; however, they did not suggest a "nofollow" as well.
So I think I will work on improving site speed, enhancing content and no-indexing (and no following?) thin pages. If that does not induce an improvement I guess I will have to consider alternatives.
Thanks,
Alan -
As I already communicated, these are issues that MAY be causing your problems. Without direct access to Google's algorithms, no one can say with certainty exactly what impact they are having. And without a full audit, there is no way to know what other problems you have.
Having said that, proper SEO best practices dictate that any major SEO flaws you know exist should be cleaned up / fixed. So - if two thirds of your listings have thin content, the best suggestion would be to add much more content to each of those pages (unique, highly relevant, trustworthy and helpful), or to consider a "noindex,nofollow" on those specific pages.
The problem then is that if you noindex,nofollow that many pages, what do you have left in terms of overall site scale that Google would find worthy of high rankings? How big are your competitors? Taking away very thin pages helps reduce "low quality" signals, yet if there isn't a volume of other "high quality" content, you still won't solve all your problems most of the time.
200-250 words is NOT considered a strong volume of content in most cases. Typically these days it's in the 600+ word range. However, that also depends on the bulk of the competition for that type of content in that specific market.
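If you end up auditing for thin pages, a quick way to inventory them is a word count pass over the listing URLs. This is only a sketch - the URL list is yours to supply, the 300-word threshold is arbitrary, and it counts all visible text (navigation and footer included) unless you narrow it to the content area:

```python
# Rough sketch: flag pages whose visible word count falls below a threshold.
# Assumes you supply the listing URLs (from your sitemap or a crawl export).
# Counts are approximate -- navigation and footer text are included.
import requests
from bs4 import BeautifulSoup

THIN_THRESHOLD = 300  # adjust to whatever you treat as "thin"

def visible_word_count(url):
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

listing_urls = [
    # "http://www.nyc-officespace-leader.com/listings/example-listing",
]

for url in listing_urls:
    count = visible_word_count(url)
    flag = "THIN" if count < THIN_THRESHOLD else "ok"
    print(f"{flag:4}  {count:5d} words  {url}")
```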
And best practices also dictate that site speed be as efficient as possible, so if it's slow even intermittently, that would be another thing to definitely work on.
-
Hi Alan:
About 220 of the 305 listing pages have thin content, meaning less than 100 words.
Is that likely to have triggered a Panda 4.0 penalty in late May? If I add content to those pages or no-index them, could that reverse the penalty if it exists? Also, my building pages contain 200-250 words. Is that considered "thin"? They are less geared towards the needs of tenants leasing space and contain historical information. I intend to enhance them and display listings on them. Do you think that could help?
Do you think the site speed could be a major factor impacting performance on my site? If so, I can invest in improving speed.
Thanks, Alan
-
Thanks for the GA data. So - there's very little traffic to the site, which means Google isn't able to get accurate page speed data consistently every day.
Note, however, that back around July 6th the site-wide average was almost 40 seconds a page. That's extremely slow. Then on the 17th it was up around 16 seconds site-wide. So even though the little bit of data for the rest of the month shows much faster speeds, those spikes are definitely not good.
I honestly don't know however, given the very small data set, what impact site speed is having on the site. And there's just no way to know how it's impacting the site compared to other problems.
Next - thin content pages - what percentage of the listings have this problem? When I go to a sample listing such as this one, I see almost no content. If a significant number of your listings are this severely thin, that could well be a major problem.
Again though, I don't believe in randomly looking at one, two or even a few individual things as a valid basis for making a wild guess as to exact causes. SEO is not rocket science, however it is computer science. It's complex and hundreds of main factors are involved.
-
Hi Alan:
Interesting tool, URIValet.com - I had never heard of it before.
I reviewed site speed in Google Analytics and it seems that download speeds are intermittently very slow. According to "Site Speed Timings" (see attached), there has been a drop in download speed.
Is download speed a potentially more significant problem than the unknown 175 URLs?
Also, the listings do not appear elsewhere on the web, but many of them have light content. The call to action at the end of each listing is somewhat repetitive. I plan on either no-indexing listings with less than 100 words or adding to the content. The total number of listing URLs is 310. There are also 150 short building write-up URLs (like: http://www.nyc-officespace-leader.com/metropolitan-life-tower). These don't have more than 150 words of content. Could they be contributing to the issue?
Is the load time for the URLs on this site so slow that it could be affecting ranking?
Thanks,
Alan -
It would require a developer to examine the structure of the site and how pages are generated - to do an inventory audit of the pages generated, then match that to the sitemap file. If there are a large number of pages that are duplicate content, or very thin on content, that could be a contributing factor. Since there are fewer than 1,000 pages indexed in Google, I don't think 175 would be enough by itself as a single factor.
There are many reasons that could be causing your problem. Overall quality is another possible factor. In a test I ran just now at URIValet.com, the page processing speed for the home page in the 1.5 mbps emulator was 13 seconds. Since Google has an ideal of under 3 seconds, if you have serious site-wide processing issues, that could also be a contributing factor. A test of a different page came back at 6 seconds, so this isn't necessarily a site-wide problem, and it may even be intermittent.
Yet if there are intermittent times when speeds are even slower, then yes, that could well be a problem that needs fixing.
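If you want to watch for those intermittent slowdowns yourself, even a crude timing loop run a few times a day will show the pattern. Note this only measures download time from wherever the script runs - it's not a full render or a 1.5 Mbps emulation like URIValet - so treat the numbers as relative rather than absolute:

```python
# Quick-and-dirty timing sketch: measures how long each URL takes to download
# from wherever you run the script. Run it at different times of day to see
# whether the slowness is intermittent.
import time
import requests

urls = [
    "http://www.nyc-officespace-leader.com/",
    # add a handful of listing and building pages here
]

for url in urls:
    start = time.perf_counter()
    resp = requests.get(url, timeout=60)
    elapsed = time.perf_counter() - start
    print(f"{elapsed:6.2f}s  {resp.status_code}  {len(resp.content):8d} bytes  {url}")
```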
So many other possible issues exist. Are the property listings anywhere else on the web, or is the content you have on them exclusive to your site?
What about your link profile? Is it questionable?
Without a full blown audit it's a guess as to what the cause of your visibility drop problems are.
-
Hi Alan:
Your hypothesis regarding the URL structure is interesting. But in this case two of the URLs represent buildings and the one with "/listings/" represents a listing, so that seems OK.
Now you mention the possibility that there may be URLs that do not appear in the sitemap but are getting indexed by Google - that there is a sitemap issue with the site. How could I determine this?
Could the additional 175 URLs that have appeared in the last two months contribute to a drop in ranking?
I am completely stumped on this issue and have been harassing the Moz community for two months. If you could help get to the bottom of this I would be most grateful.
Thanks, Alan
-
Hi Keri:
OK. I will keep that in mind moving forward. I did not realize the duplication.
If a question does not get answered, are users allowed to repost?
Thanks,
Alan
-
Hi Alan:
Thanks for your response. Actually, the 1st and 3rd URLs are for buildings rather than listings, so they are formatted correctly. All listings contain "/listings/". So I think, though I am not an expert, that the URL structure is OK.
Thanks,
Alan -
There are many reasons this can be happening.
One cause is where more URLs exist than your sitemap might even include. So the question then is whether the sitemap file is accurate and includes all the pages you want indexed.
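One rough way to check that is to pull every <loc> out of the sitemap and diff it against a crawl export (Screaming Frog or the Moz crawl can both be reduced to one URL per line). The sketch below makes a couple of assumptions - that the sitemap lives at /sitemap.xml and that the crawl export is saved as crawl_urls.txt - so adjust for your setup:

```python
# Rough sketch for checking sitemap coverage: collect every <loc> from the
# sitemap and diff it against a crawl export (one URL per line). URLs found
# in the crawl but missing from the sitemap are the ones to investigate.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.nyc-officespace-leader.com/sitemap.xml"  # assumed location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

def crawl_urls(path):
    with open(path) as fh:
        return {line.strip() for line in fh if line.strip()}

in_sitemap = sitemap_urls(SITEMAP_URL)
in_crawl = crawl_urls("crawl_urls.txt")  # your crawl export, one URL per line

print(f"{len(in_sitemap)} URLs in sitemap, {len(in_crawl)} URLs in crawl")
print("\nCrawled but missing from sitemap:")
for url in sorted(in_crawl - in_sitemap):
    print(" ", url)
```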
Sometimes it's a coding or Information Architecture flaw, where the same content can be reached in multiple ways.
Doing a random check, I found you have listings showing up in three different ways:
- http://www.nyc-officespace-leader.com/listings/38-broad-street-between-beaver--manhattan-new-york
- http://www.nyc-officespace-leader.com/113-133-west-18th-street
- http://www.nyc-officespace-leader.com/inquire-about-the-ladders-137-varick-street-to-rent-office-space-in-ny
See those? One has the address as a sub-page beneath "/listings/", the 2nd version does not, and the 3rd URL is entirely different altogether. There should only be one URL structure for all property listings, so this makes me wonder whether you have properties showing up under two different URLs.
I didn't find duplication, yet it's a flawed URL issue that leaves me wondering if it's a contributing factor.
This is just scratching the surface of the possibilities. I did check blog tags and blog date archives; however, none of those are indexed, so they're not a cause based on my preliminary evaluation.
-
Noindexed pages should not appear in your Index Status. I could be wrong, but it doesn't make sense for a page to appear there if it is noindexed.
Doing a site:www.nyc-officespace-leader.com search, I get 849 results. Seems normal to me. Again, you would probably have to scrutinize your sitemap instead - sitemaps don't always include all the URLs, depending on how you generate them.
Based on Screaming Frog, you have about 860 pages and ~200 noindexed pages. Your Index Status may update eventually.
It's working as is anyway: http://www.nyc-officespace-leader.com/blog/tag/manhattan-office-space does not show up in the SERPs. I wouldn't treat Index Status as definitive, more as directional.
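If you want to verify which pages actually serve a noindex (and reconcile Screaming Frog's ~200 against what Index Status shows), checking both the X-Robots-Tag response header and the robots meta tag covers the usual cases. A sketch, assuming requests and BeautifulSoup; the URL list is whatever set you want to spot-check:

```python
# Sketch for verifying which pages actually serve a noindex: checks both the
# X-Robots-Tag response header and the <meta name="robots"> tag.
import requests
from bs4 import BeautifulSoup

def robots_directives(url):
    resp = requests.get(url, timeout=30)
    directives = []
    header = resp.headers.get("X-Robots-Tag")
    if header:
        directives.append(f"header: {header}")
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    if meta and meta.get("content"):
        directives.append(f"meta: {meta['content']}")
    return directives or ["(no robots directives -- indexable by default)"]

urls = [
    "http://www.nyc-officespace-leader.com/blog/tag/manhattan-office-space",
    # add the pages you expect to be noindexed
]

for url in urls:
    print(url)
    for d in robots_directives(url):
        print("   ", d)
```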
-
Thanks for your response.
I am very suspicious that something is amiss. The number of URLs in Moz's crawl of our site is about 850, almost exactly the same as the Index Status count in Webmaster Tools. This 850 includes noindexed pages.
Is it normal for Google to include noindexed pages in the Webmaster Tools Index Status count?
I would upload the Excel file of the MOZ crawl but I don't know how to do so.
Thanks,
Alan
-
It's best to just ask the same question once, and clarify if needed in the question itself. This seems very similar to the question you asked at http://moz.com/community/q/difference-in-number-of-urls-in-crawl-sitemaps-index-status-in-webmaster-tools-normal, unless I'm missing something.
-
Index Status is how many pages of your site Google has indexed.
A sitemap is different: in case your site has pages that are too deep for Google to find, the sitemap is there to direct Googlebot to pages it won't necessarily discover on its own.
In your case Google indexed more pages than the number of pages in your sitemap, which is absolutely normal.