Total Indexed 1.5M vs 83k submitted by sitemap. What?
-
We recently took a good look at the sitemap on one of our content sites and cut out a lot of junk that had accumulated in there, such as .php, .xml, and .htm versions of each page. We also moved images out into a separate image sitemap.
The sitemap came out to 83,000+ URLs for Google to crawl (generated in part with the Yoast WordPress plugin).
In Webmaster Tools, the Index Status section shows that this site has a total of 1.5 million pages indexed.
With our sitemap coming back with 83k and Google indexing 1.5 million pages, is this a sign of a CMS gone rogue? Is it an indication that we could be pumping out error pages, empty templates, or junk pages that we're feeding to Googlebot?
I would love to hear what you guys think. Is this normal? Is this something to be concerned about? Should our total index more closely match our sitemap page count?
-
As well as the parameters mentioned, you may have heaps of duplicate categories, tags, etc. What I would also do is start searching Google with something like site:www.example.com/directory/ or site:www.example.com/category/directory/directory/ so you are tightly narrowing down the results, then switch to 100 results per page and manually look for clues.
-
If you have 1.5 million pages indexed and you think your sitemap is comprehensive at 83,000, then yes, your CMS is needlessly generating pages. It's usually not a big deal from a ranking standpoint, but it can make other important issues hard to detect. I would clean it up, but that's a business call you'll have to make.
The first step is diagnosing where the URLs are coming from. What you do next will depend on that, but I will give you the best advice I can without knowing what types of extraneous URLs you have and how Google is treating them:
First, I'd start with WMT > Crawl > URL Parameters. Quite often your CMS will generate parameterized URLs, and Google usually knows how to handle them. If there are a lot of URL parameters, Google them and see if those pages are exactly the same as other pages. If they are, make sure you have canonical tags in place to point them to the main version. There's more you can do with parameters, but it'll depend on what you find, so I won't go into more detail. As a general rule, though, a CMS should not generate a page unless it is uniquely useful as a differentiated landing page or a page for people to link to.
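For example, if /widgets/?sort=price turns out to be the same page as /widgets/, a canonical tag in the head of the parameter version points Google at the main one (the URLs below are placeholders):

```html
<!-- served on http://www.example.com/widgets/?sort=price (hypothetical parameter variant) -->
<link rel="canonical" href="http://www.example.com/widgets/" />
```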
Also check for parameters in your analytics program. They could actually be messing up your pageview data, depending on how you report. There's a post on fixing that in GA here:
http://blog.crazyegg.com/2013/03/29/remove-url-parameters-from-google-analytics-reports/
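As a quick illustration, GA's view settings have an "Exclude URL Query Parameters" field where you can list the offenders so the variants roll up into one page path (the parameter names below are hypothetical — use whatever your own CMS actually appends):

```
Exclude URL Query Parameters: sessionid, sort, ref, replytocom
```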
Next, I'd look at the "Advanced" tab in WMT > Google Index > Index Status. Are there a lot of URLs removed? If so, check on those pages and see why they were removed and why they exist.
I would also run a crawl with Xenu and Screaming Frog to make sure crawlers are finding a reasonable number of pages and that they're not getting stuck in crawl loops (crawling variations of a page endlessly). These kinds of issues can prevent new pages from being indexed on time, because Google wastes time (your crawl budget) running in circles.
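If it helps, here's a rough sketch of the sort of check I mean — it reads an exported list of crawled URLs (one per line; the filename and threshold are placeholders) and flags paths that show up under a suspicious number of query-string variations, which is usually where a crawler is running in circles:

```python
from collections import Counter
from urllib.parse import urlsplit

def flag_parameter_variants(crawl_file, threshold=20):
    """Count distinct crawled URLs per bare path and report paths with an
    unusually high number of query-string variants (likely crawl loops)."""
    variants = Counter()
    with open(crawl_file) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            parts = urlsplit(url)
            # group every ?sort=, ?page=, ?sessionid= variation under its bare path
            variants[parts.netloc + parts.path] += 1
    return [(path, count) for path, count in variants.most_common() if count > threshold]

if __name__ == "__main__":
    # "crawled_urls.txt" is a placeholder for whatever your crawler exports
    for path, count in flag_parameter_variants("crawled_urls.txt"):
        print(f"{count:>6}  {path}")
```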
-
Rob,
Your sitemap is only a hint to Google about URLs on your domain. The sitemap does not limit Google to crawling or indexing only the URLs listed on it, nor is it a directive that tells Google to remove URLs it has already crawled from the index. As stated in GWT, use **robots.txt** to specify how search engines should crawl your site, or request **removal** of URLs from Google's search results with the URL removal tool in Google Webmaster Tools under the "Google Index" link.
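For example, once you know which patterns are junk, a few lines of robots.txt will keep crawlers out of them (the path and parameter below are placeholders — substitute whatever your CMS is actually generating):

```
User-agent: *
# keep crawlers out of the junk variations
Disallow: /print/
Disallow: /*?sessionid=
```

Keep in mind that robots.txt only stops crawling; URLs Google has already indexed still need the removal tool (or a noindex it can actually crawl) to come out of the index.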