Page Count in Webmaster Tools Index Status Versus Page Count in Webmaster Tools Sitemap
-
Greeting MOZ Community:
I run www.nyc-officespace-leader.com, a real estate website in New York City.
The page count in Google Webmaster Tools Index status for our site is 850. The page count in our Webmaster Tools Sitemap is 637. Why is there a discrepancy between the two?
What does the Google Webmaster Tools Index represent? If we filed a removal request for pages we did not want indexed, will these pages still show in the Google Webmaster Tools page count despite the fact that they no longer display in search results? The number of pages displayed in our Google Webmaster Tools Index remains at about 850 despite the removal request. Before a site upgrade in June, the number of URLs in the Google Webmaster Tools Index and the Webmaster Tools Sitemap were almost the same.
I am concerned that page bloat has something to do with a recent drop in ranking.
Thanks everyone!!
Alan
-
Using the noindex,follow combination is a form of advanced page sculpting, which is not truly an SEO best practice.
Here's why:
If you deem a page not worthy of being in the Google index, attempting to say "it's not worthy of indexing, but the links on it are worthy" is a mixed message.
Links to those other pages should already exist from pages you do want indexed.
By doing noindex,follow, you increase the internal link counts in artificial ways.
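For anyone unsure what the noindex,follow combination actually looks like on a page, here's a minimal Python sketch (the sample markup is hypothetical) that pulls the robots directives out of a page's meta tag using only the standard library:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Normalize "noindex, follow" into separate lowercase tokens
            self.directives += [d.strip().lower() for d in attrs.get("content", "").split(",")]

# Hypothetical thin listing page using the noindex,follow combination
sample_page = '<html><head><meta name="robots" content="noindex, follow"></head><body>...</body></html>'

parser = RobotsMetaParser()
parser.feed(sample_page)
print(parser.directives)  # ['noindex', 'follow']
```

Running this across a crawl of your own URLs is one way to confirm which directive combination each template is actually emitting, rather than relying on what the CMS settings claim.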
-
Hi Alan:
That is very clear, thanks!!
For pages with thin content, why the "nofollow" in addition to the "noindex"? My SEO firm was also of the opinion that the thin-content pages should be "noindexed"; however, they did not suggest a "nofollow" as well.
So I think I will work on improving site speed, enhancing content and no-indexing (and no following?) thin pages. If that does not induce an improvement I guess I will have to consider alternatives.
Thanks,
Alan -
As I already communicated, these are issues that MAY be causing your problems. Without direct access to Google's algorithms, there is zero guarantee that anyone could absolutely say with 100% certainty exactly what impact they are having. And without a full audit, there is no way to know what other problems you have.
Having said that, proper SEO best practices always dictate that any major SEO flaws you know exist should be cleaned up / fixed. So, if two thirds of your listings have thin content, the best suggestion would be to work to add much more content to each of those (unique, highly relevant, trustworthy and helpful), or to consider a "noindex,nofollow" on those specific pages.
The problem then is: if you noindex,nofollow that many pages, what do you have left in terms of overall site scale that Google would find worthy of high rankings? How big are your competitors? Taking away very thin pages helps reduce "low quality" signals, yet if there isn't an offsetting volume of "high quality" content, you still won't solve all your problems most of the time.
200-250 words is NOT considered a strong volume of content in most cases. Typically these days it's in the 600+ word range. However, that also depends on the majority of the competition for that unique type of content in that specific market.
And site speed is also something that best practices dictates needs to be as efficient as possible so if it's slow even intermittently, that would be another thing to definitely work on.
-
Hi Alan:
About 220 of the 305 listing pages have thin content, meaning fewer than 100 words each.
Is that likely to have triggered a Panda 4.0 penalty in late May? If I add content to those pages or no-index them, could that reverse the penalty, if it exists? Also, my building pages contain 200-250 words. Is that considered "thin"? They are geared less toward the needs of tenants leasing space and contain historical information. I intend to enhance them and display listings on them. Do you think that could help?
Do you think the site speed could be a major factor impacting performance on my site? If so, I can invest in improving speed.
Thanks, Alan
-
Thanks for the GA data. So there's very little traffic to the site, which means Google isn't able to get accurate page speed data consistently every day.
Note, however, that back around July 6th the site-wide average was almost 40 seconds a page. That's extremely slow. Then on the 17th it was up around 16 seconds site-wide. So even though the little bit of data from the rest of the month shows much faster speeds, those earlier readings are definitely not good.
I honestly don't know however, given the very small data set, what impact site speed is having on the site. And there's just no way to know how it's impacting the site compared to other problems.
Next, thin content pages: what percentage of the listings have this problem? When I go to a sample listing such as this one, I see almost no content. If a significant number of your listings are this severely thin, that could well be a major problem.
Again, though, I don't believe in randomly looking at one, two or even a few individual things as a valid basis for making a wild guess as to exact causes. SEO is not rocket science; however, it is computer science. It's complex, and hundreds of main factors are involved.
-
Hi Alan:
Interesting tool, URIValet.com; I had never heard of it before.
I reviewed site speed in Google Analytics, and it seems that download speeds are intermittently very slow. According to "Site Speed Timings" (see attached), there has been a drop in download speed.
Is download speed a potentially more significant problem than the unknown 175 URLs?
Also, the listings do not appear elsewhere on the web. But many of them have light content, and the call to action at the end of each listing is somewhat repetitive. I plan on either no-indexing listings with fewer than 100 words or adding to their content. The total number of listing URLs is 310. There are also 150 short building write-up URLs (like: http://www.nyc-officespace-leader.com/metropolitan-life-tower). These don't have more than 150 words of content each. Could they be contributing to the issue?
Is the load time for the URLs on this site so slow that it could be affecting ranking?
Thanks,
Alan -
It would require a developer to examine the structure of the site and how pages are generated, to do an inventory audit of the pages generated, and then to match that to the sitemap file. If there are a large number of pages that are duplicate content, or very thin on content, that could be a contributing factor. Since there are fewer than 1,000 pages indexed in Google, I don't think 175 would be enough by itself as a single factor.
There are many reasons that could be causing your problem. Overall quality is another possible factor. In a test I ran just now at URIValet.com, the page processing speed for the home page in the 1.5 mbps emulator was 13 seconds. Since Google has an ideal of under 3 seconds, if you have serious site-wide processing issues, that could also be a contributing factor. A test of a different page came back at 6 seconds, so this isn't necessarily a site-wide problem, and it may even be intermittent.
Yet if there are intermittent times when speeds are even slower, then yes, that could well be a problem that needs fixing.
So many other possible issues exist. Are the property listings anywhere else on the web, or is the content you have on them exclusive to your site?
What about your link profile? Is it questionable?
Without a full blown audit it's a guess as to what the cause of your visibility drop problems are.
-
Hi Alan:
Your hypothesis regarding the URL structure is interesting. But in this case, two of the URLs represent buildings and the one with "/listings/" represents a listing. So that seems OK.
Now you mention the possibility that there are URLs that do not appear in the sitemap but are getting indexed by Google; in other words, that there is a sitemap issue with the site. How could I determine this?
Could the additional 175 URLs that have appeared in the last two months contribute to a drop in ranking?
I am completely stumped on this issue and have been harassing the Moz community for two months. If you could help get to the bottom of this I would be most grateful.
Thanks, Alan
-
Hi Keri:
OK. I will keep that in mind moving forward. I did not realize the duplication.
If a question does not get answered are users allowed to repost?
Thanks,
Alan
-
Hi Alan:
Thanks for your response. Actually the 1st and 3rd URL are for buildings rather than listings, so they are actually formatted correctly. All listings contain "/listings/". So I think, but I am not an expert, that the URL structure is OK.
Thanks,
Alan -
There are many reasons this can be happening.
One cause is where more URLs exist than your sitemap might even include. So the question then is whether the sitemap file is accurate and includes all the pages you want indexed.
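One rough way to answer that question yourself is to diff the sitemap's URL set against the URLs a crawler actually finds. Here's a minimal Python sketch using only the standard library; the sitemap content and URLs are made up for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical, abbreviated sitemap file content for illustration
sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/listings/a</loc></url>
  <url><loc>http://example.com/listings/b</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
sitemap_urls = {loc.text for loc in ET.fromstring(sitemap_xml).iter(NS + "loc")}

# URLs found by an actual crawl (e.g. exported from Screaming Frog)
crawled_urls = {
    "http://example.com/listings/a",
    "http://example.com/listings/b",
    "http://example.com/inquire-about-listing-c",  # reachable but missing from the sitemap
}

missing_from_sitemap = crawled_urls - sitemap_urls
print(sorted(missing_from_sitemap))  # ['http://example.com/inquire-about-listing-c']
```

Any URLs that show up in the crawl but not in the sitemap are candidates for the gap between the two counts; the reverse difference (`sitemap_urls - crawled_urls`) flags sitemap entries the crawler can't reach.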
Sometimes it's a coding or Information Architecture flaw, where content can be found in multiple ways.
Doing a random check, I found you have listings showing up in three different ways:
- http://www.nyc-officespace-leader.com/listings/38-broad-street-between-beaver--manhattan-new-york
- http://www.nyc-officespace-leader.com/113-133-west-18th-street
- http://www.nyc-officespace-leader.com/inquire-about-the-ladders-137-varick-street-to-rent-office-space-in-ny
See those? One has the address as a sub-page beneath "/listings/", the 2nd version does not, and the 3rd URL is entirely different altogether. There should only be one URL structure for all property listings, so this would cause me to wonder whether you have properties showing up with two different URLs.
I didn't find duplication, yet it's a flawed URL issue that leaves me wondering if it's a contributing factor.
This is just a scratching on the surface of possibilities. I did check about blog tags and blog date archives, however none of those are indexed, so they're not a cause based on my preliminary evaluation.
-
Noindexed pages should not appear in your "Index Status" count. I could be wrong, but it doesn't make sense for a page to be counted there if it is noindexed.
Doing a site:www.nyc-officespace-leader.com search, I get 849 results. Seems normal to me. Again, you would probably have to scrutinize your sitemap instead; sitemaps don't always pull in all the URLs, depending on how you generate them.
Based on Screaming Frog, you have about 860 pages and ~200 noindexed pages. Your index status may update eventually.
It's working as-is anyway: http://www.nyc-officespace-leader.com/blog/tag/manhattan-office-space does not show up in the SERPs. I wouldn't use Index Status as definitive, but more as directional.
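To make that "directional, not definitive" comparison concrete, here's a tiny Python sketch of the kind of sanity check you can run against a crawl export; all URLs and counts below are made up:

```python
# Hypothetical crawl export rows: (url, meta robots) pairs,
# loosely modeled on a Screaming Frog "Internal HTML" export
crawl = [
    ("http://example.com/", "index,follow"),
    ("http://example.com/listings/a", "noindex,follow"),
    ("http://example.com/listings/b", "index,follow"),
    ("http://example.com/blog/tag/foo", "noindex,follow"),
]

noindexed = [url for url, robots in crawl if "noindex" in robots]
indexable = len(crawl) - len(noindexed)

reported_index_status = 3  # the count GWT reports (illustrative)
print(f"crawled={len(crawl)} noindexed={len(noindexed)} indexable={indexable}")
# If the reported count stays well above `indexable` for weeks,
# Google is likely still holding stale URLs in its index count.
print("stale URLs likely:", reported_index_status > indexable)
```

If the Index Status number tracks your total crawl count rather than the indexable subset, that's consistent with Google simply not having recrawled and dropped the noindexed URLs yet.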
-
Thanks for your response.
I am very suspicious that something is amiss. The number of URLs in Moz's crawl of our site is about 850, almost exactly the same as the Webmaster Tools Index count. This 850 includes noindexed pages.
Is it normal for Google to count the total number of pages in the Webmaster Tools Index status, even if some of them are noindexed?
I would upload the Excel file of the MOZ crawl but I don't know how to do so.
Thanks,
Alan
-
It's best to just ask the same question once, and clarify if needed in the question itself. This seems very similar to the question you asked at http://moz.com/community/q/difference-in-number-of-urls-in-crawl-sitemaps-index-status-in-webmaster-tools-normal, unless I'm missing something.
-
Index status is how many pages of your site Google has indexed.
A sitemap is different: in case your site has pages that are too deep for Google to find, sitemaps are created as a way to direct Googlebot to crawl pages that it wouldn't necessarily find on its own.
In your case Google indexed more pages than the number of pages in your sitemap, which is absolutely normal.