Total Indexed 1.5M vs 83k submitted by sitemap. What?
-
We recently took a good look at one of our content sites' sitemap and tried to cut out a lot of the crap that had gotten in there, such as .php, .xml, and .htm versions of each page. We also cut out images and moved them to a separate image sitemap.
The sitemap contains 83,000+ URLs for Google to crawl (generated in part with the Yoast WordPress plugin).
In Webmaster Tools, the Index Status section shows that this site has a total of 1.5 million pages indexed.
With our sitemap coming back with 83k URLs and Google indexing 1.5 million pages, is this a sign of a CMS gone rogue? Is it an indication that we could be pumping out error pages, empty templates, or junk pages and feeding them to Google's bot?
I would love to hear what you guys think. Is this normal? Is this something to be concerned about? Should our total index more closely match our sitemap page count?
-
As well as the parameters mentioned, you may have heaps of duplicate categories, tags, etc. What I would also do is search Google with something like site:www.example.com/directory/ or site:www.example.com/category/directory/directory/ so you tightly narrow down the results; switch to 100 results per page and manually look for clues.
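If you save the URLs you find (from those site: searches or a crawl export), a short script can tally them by top-level directory to show where the bloat lives. A rough sketch in Python's standard library; the example URLs are hypothetical:

```python
from urllib.parse import urlparse
from collections import Counter

def count_by_section(urls):
    """Count URLs by their first path segment to see which
    sections of the site dominate the index."""
    counts = Counter()
    for url in urls:
        path = urlparse(url).path.strip("/")
        section = path.split("/")[0] if path else "(root)"
        counts[section] += 1
    return counts

# Made-up URLs for illustration:
urls = [
    "http://www.example.com/category/widgets/",
    "http://www.example.com/category/gadgets/",
    "http://www.example.com/tag/widgets/",
]
print(count_by_section(urls).most_common())
# [('category', 2), ('tag', 1)]
```

If one section (say, /tag/) accounts for hundreds of thousands of indexed URLs but only a handful of sitemap entries, that's where to dig.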
-
If you have 1.5 million pages and you think your sitemap is comprehensive at 83,000 then yes, your CMS is needlessly generating pages. It's usually not a big deal from a ranking standpoint, but it can make other important issues hard to detect. I would clean it up, but that's a business call you'll have to make.
The first step is diagnosing where the URLs are coming from. What you do next will depend on what you find, but here is the best advice I can give without knowing what types of extraneous URLs you have and how Google is treating them:
First, I'd start with WMT > Crawl > URL Parameters. Quite often your CMS will generate parameterized URLs, and Google usually knows how to handle them. If there are a lot of URL parameters, search Google for those URLs and see if they're exactly the same as other pages. If they are, make sure you have canonical tags in place pointing them to the main version. There's more you can do with parameters, but it depends on what you find, so I won't go into more detail. As a general rule, though, a CMS should not generate a page unless it is uniquely useful as a differentiated landing page or as a page for people to link to.
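To spot-check whether those parameterized URLs actually point back to the main version, you can parse each page for its canonical tag. A minimal sketch using only Python's standard library; the sample HTML is hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pull the href out of a <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def get_canonical(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

html = '<head><link rel="canonical" href="http://www.example.com/page/"></head>'
print(get_canonical(html))  # http://www.example.com/page/
```

Run this over a sample of the parameterized URLs: any page that returns None (no canonical) or canonicalizes to itself is a candidate for cleanup.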
Also check for parameters in your analytics program. They could actually be messing up your pageview data, depending on how you report. There's a post on fixing that in GA here:
http://blog.crazyegg.com/2013/03/29/remove-url-parameters-from-google-analytics-reports/
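The fix in that post is done with GA view filters, but the same normalization can be expressed in code: collapse parameterized URLs down to their base path so duplicates report as one page. A rough sketch; `utm_source` and `id` are placeholder parameter names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_params(url, keep=()):
    """Drop the query string (optionally keeping a whitelist of
    parameters) so URL variations collapse to one pageview path."""
    parts = urlsplit(url)
    if keep:
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k in keep]
        query = urlencode(kept)
    else:
        query = ""
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

print(strip_params("http://www.example.com/page/?utm_source=x&id=7"))
# http://www.example.com/page/
print(strip_params("http://www.example.com/page/?utm_source=x&id=7", keep=("id",)))
# http://www.example.com/page/?id=7
```

The `keep` whitelist matters when a parameter genuinely changes the content (e.g. a pagination or ID parameter) and should stay in your reports.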
Next I'd look at the "Advanced" tab in WMT > Google Index > Index Status. Are there a lot of URLs removed? If so, check on these pages to see why they're removed and why they exist in the first place.
I would also run a crawl with Xenu and Screaming Frog to make sure crawlers are finding a reasonable number of pages and are not getting stuck in crawl loops (crawling variations of a page endlessly). These kinds of issues can prevent new pages from being indexed on time because Google is wasting your crawl budget running in circles.
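A quick way to read a crawl export for loop symptoms is to group the crawled URLs by path and flag any page that was reached under many parameter variations. A hedged sketch; the crawl list here is made up:

```python
from urllib.parse import urlsplit
from collections import defaultdict

def find_variant_groups(crawled_urls):
    """Group crawled URLs by scheme+host+path; a group with more
    than one member is the same page reached via different
    parameter variations -- a hint of wasted crawl budget."""
    groups = defaultdict(set)
    for url in crawled_urls:
        p = urlsplit(url)
        groups[(p.scheme, p.netloc, p.path)].add(url)
    return {k: v for k, v in groups.items() if len(v) > 1}

crawl = [
    "http://www.example.com/page/",
    "http://www.example.com/page/?sort=asc",
    "http://www.example.com/page/?sort=desc",
    "http://www.example.com/other/",
]
dupes = find_variant_groups(crawl)
for key, variants in dupes.items():
    print(key[2], len(variants))  # /page/ 3
```

A page showing up dozens of times in such a group is a good candidate for a canonical tag or a parameter rule.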
-
Rob,
Your sitemap is only an indication to Google about the URLs on your domain. The sitemap does not limit Google to crawling or indexing only the URLs listed in it, nor is it a directive telling Google to remove URLs it has already crawled from the index. As stated in GWT, use **robots.txt** to specify how search engines should crawl your site, or request **removal** of URLs from Google's search results with the URL removal tool in Google Webmaster Tools under the "Google Index" link.
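For example, once the diagnosis shows which paths the extra URLs live under, robots.txt rules along these lines would keep crawlers out of them. The paths and parameter below are purely illustrative; match them to whatever your own diagnosis turns up:

```
User-agent: *
# Block auto-generated duplicate formats (hypothetical paths)
Disallow: /*.php$
Disallow: /tag/
# Block sort/filter parameter variations (hypothetical parameter)
Disallow: /*?sort=
```

Keep in mind robots.txt only stops future crawling; URLs already in the index stay there until they drop out or you use the URL removal tool.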