Page Count in Webmaster Tools Index Status Versus Page Count in Webmaster Tools Sitemap
-
Greeting MOZ Community:
I run www.nyc-officespace-leader.com, a real estate website in New York City.
The page count in Google Webmaster Tools Index status for our site is 850. The page count in our Webmaster Tools Sitemap is 637. Why is there a discrepancy between the two?
What does the Google Webmaster Tools Index Status represent? If we filed a removal request for pages we did not want indexed, will those pages still show in the Index Status page count even though they no longer display in search results? The number of pages in our Index Status remains at about 850 despite the removal request. Before a site upgrade in June, the number of URLs in the Index Status and in the sitemap were almost the same.
I am concerned that page bloat has something to do with a recent drop in ranking.
Thanks everyone!!
Alan
-
Using the noindex,follow combination is a form of advanced page sculpting, which is not truly an SEO best practice.
Here's why:
If you deem a page not worthy of being in the Google index, attempting to say "it's not worthy of indexing, but the links on it are worthy" is a mixed message.
Links to those other pages should already exist from pages you do want indexed.
By doing noindex,follow, you increase the internal link counts in artificial ways.
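If it helps to audit what directives your pages actually carry, here is a minimal sketch of extracting robots meta directives from page HTML. The sample page string is hypothetical; in practice you would fetch each URL from your crawl list. A simple regex like this won't cover every attribute ordering, so treat it as a starting point, not a full parser.

```python
# Sketch: pull the content values out of a <meta name="robots"> tag.
import re

def robots_directives(html: str) -> set[str]:
    """Return the directives in a page's robots meta tag, lowercased."""
    m = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    if not m:
        return set()  # no tag means the default behavior: index, follow
    return {d.strip().lower() for d in m.group(1).split(",")}

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(sorted(robots_directives(page)))  # ['follow', 'noindex']
```

Running this over every URL in a crawl export would show at a glance which pages are sculpted with noindex,follow versus noindex,nofollow.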
-
Hi Alan:
That is very clear, thanks!!
For pages with thin content, why the "nofollow" in addition to the "noindex"? My SEO firm was also of the opinion that the thin-content pages should be noindexed; however, they did not suggest a nofollow as well.
So I think I will work on improving site speed, enhancing content, and noindexing (and nofollowing?) thin pages. If that does not produce an improvement, I guess I will have to consider alternatives.
Thanks,
Alan -
As I already communicated, these are issues that MAY be causing your problems. Without direct access to Google's algorithms, there is zero guarantee that anyone could absolutely say with 100% certainty exactly what impact they are having. And without a full audit, there is no way to know what other problems you have.
Having said that, proper SEO best practices dictate that any major SEO flaws you know exist should be cleaned up. So if two thirds of your listings have thin content, the best suggestion would be to add much more content to each of them (unique, highly relevant, trustworthy, and helpful), or to consider a "noindex,nofollow" on those specific pages.
The problem then being if you noindex,nofollow that many pages, what do you have left in terms of overall site scale that Google would find worthy of high rankings? How big are your competitors? Taking away very thin pages helps reduce "low quality" signals, yet if there isn't other "high quality" volume of content you still don't solve all your problems most of the time.
200-250 words is NOT considered a strong volume of content in most cases. Typically these days the target is in the 600+ word range. However, that also depends on the competition for that type of content in that specific market.
And best practices also dictate that site speed be as efficient as possible, so if the site is slow even intermittently, that would be another thing to definitely work on.
-
Hi Alan:
About 220 of the 305 listing pages have thin content, meaning fewer than 100 words.
Is that likely to have triggered a Panda 4.0 penalty in late May? If I add content to those pages or noindex them, could that reverse the penalty if it exists? Also, my building pages contain 200-250 words. Is that considered "thin"? They are less geared toward the needs of tenants leasing space and contain historical information. I intend to enhance them and display listings on them. Do you think that could help?
Do you think the site speed could be a major factor impacting performance on my site? If so, I can invest in improving speed.
Thanks, Alan
-
Thanks for the GA data. There's very little traffic to the site, so Google isn't able to get accurate page speed data consistently every day.
Note, however, that back around July 6th the site-wide average was almost 40 seconds a page. That's extremely slow. Then on the 17th it was around 16 seconds site-wide. So even though the little bit of data from the rest of the month shows much faster speeds, those figures are definitely not good.
I honestly don't know however, given the very small data set, what impact site speed is having on the site. And there's just no way to know how it's impacting the site compared to other problems.
Next, thin content pages: what percentage of the listings has this problem? When I go to a sample listing, I see almost no content. If a significant number of your listings are this severely thin, that could well be a major problem.
Again though, I don't believe in randomly looking at one, two or even a few individual things as a valid basis for making a wild guess as to exact causes. SEO is not rocket science, however it is computer science. It's complex and hundreds of main factors are involved.
-
Hi Alan:
Interesting tool, URIValet.com; I had never heard of it before.
I reviewed site speed in Google Analytics, and it seems that download speeds are intermittently very slow. According to "Site Speed Timings" (see attached), there has been a drop in download speed.
Is download speed a potentially more significant problem than the unknown 175 URLs?
Also, the listings do not appear elsewhere on the web, but many of them have light content, and the call to action at the end of each listing is somewhat repetitive. I plan on either noindexing listings with fewer than 100 words or adding to their content. The total number of listing URLs is 310. There are also 150 short building write-up URLs (like: http://www.nyc-officespace-leader.com/metropolitan-life-tower). These don't have more than 150 words of content. Could they be contributing to the issue?
Is the load time for the URLs on this site so slow that it could be affecting ranking?
Thanks,
Alan -
It would require a developer to examine the structure of the site and how pages are generated, do an inventory audit of the pages generated, then match that to the sitemap file. If there are a large number of pages that are duplicate content, or very thin on content, that could be a contributing factor. Since there are fewer than 1,000 pages indexed in Google, I don't think 175 would be enough by itself as a single factor.
There are many reasons that could be causing your problem. Overall quality is another possible factor. In a test I ran just now at URIValet.com, the page processing speed for the home page in the 1.5 mbps emulator was 13 seconds. Since Google has an ideal of under 3 seconds, if you have serious site-wide processing issues, that could also be a contributing factor. A test of a different page came back at 6 seconds, so this isn't necessarily a site-wide problem, and it may even be intermittent.
Yet if there are intermittent times when speeds are even slower, then yes, that could well be a problem that needs fixing.
So many other possible issues exist. Are the property listings anywhere else on the web, or is the content you have on them exclusive to your site?
What about your link profile? Is it questionable?
Without a full blown audit it's a guess as to what the cause of your visibility drop problems are.
-
Hi Alan:
Your hypothesis regarding the URL structure is interesting. But in this case, two of the URLs represent buildings and the one with "/listings/" represents a listing, so that seems OK.
Now you mention the possibility that there are URLs that do not appear in the sitemap but are getting indexed by Google, i.e., that there is a sitemap issue with the site. How could I determine this?
Could the additional 175 URLs that have appeared in the last two months contribute to a drop in ranking?
I am completely stumped on this issue and have been harassing the Moz community for two months. If you could help get to the bottom of this, I would be most grateful.
Thanks, Alan
-
Hi Keri:
OK. I will keep that in mind moving forward. I did not realize the duplication.
If a question does not get answered are users allowed to repost?
Thanks,
Alan
-
Hi Alan:
Thanks for your response. Actually, the 1st and 3rd URLs are for buildings rather than listings, so they are formatted correctly. All listings contain "/listings/". So I think, though I am not an expert, that the URL structure is OK.
Thanks,
Alan -
There are many reasons this can be happening.
One cause is where more URLs exist than your sitemap might even include. So the question then is whether the sitemap file is accurate and includes all the pages you want indexed.
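As a quick first check on sitemap accuracy, you can parse the sitemap file and diff it against a crawl export (from Screaming Frog or similar). This is a minimal sketch using stand-in example.com URLs, not the site's real sitemap:

```python
# Sketch: find crawled URLs that the sitemap doesn't include.
import xml.etree.ElementTree as ET

def sitemap_urls(xml_text: str) -> set[str]:
    """Extract every <loc> URL from a sitemap XML document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

sitemap = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/listings/a</loc></url>
  <url><loc>http://example.com/listings/b</loc></url>
</urlset>"""

# In practice, load this set from your crawler's URL export.
crawled = {
    "http://example.com/listings/a",
    "http://example.com/listings/b",
    "http://example.com/old-page",
}

missing_from_sitemap = crawled - sitemap_urls(sitemap)
print(missing_from_sitemap)  # {'http://example.com/old-page'}
```

URLs that turn up in the crawl (or in Google's index) but not in the sitemap are exactly the gap between the two counts you're seeing.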
Sometimes it's a coding or information architecture flaw, where the same content can be found at multiple URLs.
Doing a random check, I found you have listings showing up in three different ways:
- http://www.nyc-officespace-leader.com/listings/38-broad-street-between-beaver--manhattan-new-york
- http://www.nyc-officespace-leader.com/113-133-west-18th-street
- http://www.nyc-officespace-leader.com/inquire-about-the-ladders-137-varick-street-to-rent-office-space-in-ny
See those? One has the address as a sub-page beneath "/listings/" the 2nd version does not, and the 3rd URL is entirely different altogether. There should only be one URL structure for all property listings so this would cause me to wonder whether you have properties showing up with two different URLs.
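A quick way to surface off-pattern URLs at scale is to check each path against the expected "/listings/" prefix. A minimal sketch, using the three URLs above (with a full crawl export, you'd feed in every listing URL):

```python
# Sketch: flag listing URLs that don't follow the /listings/ structure.
from urllib.parse import urlparse

urls = [
    "http://www.nyc-officespace-leader.com/listings/38-broad-street-between-beaver--manhattan-new-york",
    "http://www.nyc-officespace-leader.com/113-133-west-18th-street",
    "http://www.nyc-officespace-leader.com/inquire-about-the-ladders-137-varick-street-to-rent-office-space-in-ny",
]

off_pattern = [u for u in urls if not urlparse(u).path.startswith("/listings/")]
print(len(off_pattern))  # 2
```

Any URL in `off_pattern` either belongs to a different page type (like the building pages) or is a structural inconsistency worth investigating.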
I didn't find duplication, yet it's a flawed URL issue that leaves me wondering if it's a contributing factor.
This is just a scratching on the surface of possibilities. I did check about blog tags and blog date archives, however none of those are indexed, so they're not a cause based on my preliminary evaluation.
-
Noindexed pages should not appear in your Index Status. I could be wrong, but it doesn't make sense for a page to be counted there if it is noindexed.
Doing a site:www.nyc-officespace-leader.com search, I get 849 results, which seems normal to me. Again, you would probably have to scrutinize your sitemap instead; sitemaps don't always include all of a site's URLs, depending on how they are generated.
Based on Screaming Frog, the site has about 860 pages and roughly 200 noindexed pages. Your Index Status may update eventually.
It's working as expected anyway: http://www.nyc-officespace-leader.com/blog/tag/manhattan-office-space does not show up in the SERPs. I wouldn't treat Index Status as definitive, but more as directional.
-
Thanks for your response.
I am very suspicious that something is amiss. The number of URLs in Moz's crawl of our site is about 850, almost exactly the same as the Webmaster Tools Index Status count. This 850 includes noindexed pages.
Is it normal for the Webmaster Tools Index Status to count all pages, even ones that are noindexed?
I would upload the Excel file of the MOZ crawl but I don't know how to do so.
Thanks,
Alan
-
It's best to ask the same question only once, and clarify within the question itself if needed. This seems very similar to the question you asked at http://moz.com/community/q/difference-in-number-of-urls-in-crawl-sitemaps-index-status-in-webmaster-tools-normal, unless I'm missing something.
-
Index Status is how many pages of your site Google has indexed.
A sitemap is different: in case your site has pages that are too deep for Google to find on its own, a sitemap directs Googlebot to crawl pages it wouldn't necessarily discover.
In your case Google indexed more pages than the amount of pages in your sitemap, which is absolutely normal.