Sitemap - % of URLs in Google Index?
-
What is the average % of links from a sitemap that are included in the Google index? Obviously we want to aim for 100% of the sitemap URLs to be indexed, but is this realistic?
-
If all the pages in your sitemap are worthy of the Google index, then you should expect around a 100% indexation rate. On the flip side, if you reference low-quality pages in your sitemap file, you will not get them indexed and may even be hurting the trust of your sitemap file. As a case in point, Bing recently announced that if they see an error rate greater than 1% in a sitemap, they will simply ignore the sitemap file.
-
The clients build the sitemaps themselves, so I have no idea how they do it. It's a complex automated process, for sure.
-
Wow. Do you have a third-party program to build your sitemap files, or are you using something built in-house?
-
Ryan's point is important to note. 100% is achievable under the correct circumstances. I've got a client with 34 million pages on their main site (contained within a combined 909 sitemap XML files), and they have 34 million pages indexed.
-
The percentage of pages indexed varies greatly from site to site. If you want 100% of your site indexed, then 100% of your site's pages should be reviewed to ensure their content is worthy of being indexed: unique, well written, and properly presented. Your sitemap process also needs to be carefully reviewed. Many site owners simply set up an automated process without taking the time to ensure it is properly configured. Often, pages which are blocked by robots.txt are included in the sitemap, and those pages will not be indexed (a quick automated check for this mistake is sketched below).
Many people say "I want 100% of my site indexed" in the same way they say "I want to rank #1 in Google". Both results are achievable, but both require time and effort, and perhaps money.
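Not part of the original answer, but to make the robots.txt point concrete: a minimal sketch that flags sitemap URLs which robots.txt forbids Googlebot from crawling. The sitemap and robots.txt URLs are placeholders, and it assumes a plain urlset sitemap rather than a sitemap index file.

```python
# A minimal sketch, not from the thread: flag sitemap URLs that robots.txt
# blocks for Googlebot. Both URLs are placeholders; a sitemap index file
# would need one extra level of fetching.
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
ROBOTS_URL = "https://www.example.com/robots.txt"    # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

robots = urllib.robotparser.RobotFileParser(ROBOTS_URL)
robots.read()

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Any URL that fails this test is wasted space in the sitemap: it is
# submitted for indexing, but Googlebot is forbidden from crawling it.
for loc in tree.iter(NS + "loc"):
    url = loc.text.strip()
    if not robots.can_fetch("Googlebot", url):
        print("Listed in sitemap but blocked by robots.txt:", url)
```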
-
Hi. We have a sitemap with over 250,000 URLs and we are at 87% indexed. This is a high for us; we have never been able to get to 100%. We have been trying to clean up the sitemap a bit, but with so many URLs it is hard to go through it line by line. We are making more of an effort to fix the errors Google tells us about in Webmaster Tools, but these only account for a fraction of the URLs apparently not indexed.
We also do site searches on Google to see how many URLs in total we have in Google, as our sitemap only includes "the most important" pages. Doing a search for "site:www.sierratradingpost.com" comes up with over 400,000 URLs.
For us, I don't think 100% is realistic. We have never been able to achieve it. It will be interesting to see what other SEOmozers have to report!
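As an illustration of auditing a large sitemap without going through it line by line (not from the thread itself): the sketch below extracts every loc entry from the sitemap XML and spot-checks response codes on a sample. The sitemap URL is a placeholder, and a real 250,000-URL file would need throttling and probably sitemap-index handling.

```python
# A rough sketch, not from the thread: extract every <loc> from a sitemap
# and spot-check HTTP status codes on a sample, instead of reading the
# file line by line. The sitemap URL is a placeholder.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_url):
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    return [loc.text.strip() for loc in tree.iter(NS + "loc")]

def status_of(url):
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

urls = sitemap_urls("https://www.example.com/sitemap.xml")  # placeholder
print(len(urls), "URLs in sitemap")
for url in urls[:100]:  # sample first; widen (and throttle) later
    code = status_of(url)
    if code != 200:
        print(code, url)
```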
Related Questions
-
Should I submit an additional sitemap to speed up indexing?
Hi all, Wondered if there was any wisdom on this that anyone could impart my way? I'm moving a set of pages from one area of the site to another, to bring them up the folder structure and so they generally make more sense. Our URLs are very long in some cases, so this ought to help with some rationalisation there too. We will have redirects in place, but the pages I'm moving are important and I'd like the new paths to be indexed as soon as possible. In such an instance, can I submit an additional sitemap with just these URLs to get them indexed quicker, or to reaffirm that indexing from the initial parse (a minimal example of such a supplementary sitemap is sketched below)? The site is thousands of pages. Any benefits / disadvantages anyone could think of? Any thoughts very gratefully received.
Intermediate & Advanced SEO | ceecee0 -
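For illustration only (not from the thread): a minimal sketch of generating a small supplementary sitemap containing just the moved URLs. The URLs and filename are hypothetical placeholders. Google accepts multiple sitemaps for one site, and listing a URL in more than one sitemap is harmless.

```python
# For illustration only: build a small supplementary sitemap holding just
# the moved URLs. The URLs and filename are hypothetical placeholders.
import xml.etree.ElementTree as ET

MOVED_URLS = [
    "https://www.example.com/new-section/page-a",  # placeholder
    "https://www.example.com/new-section/page-b",  # placeholder
]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in MOVED_URLS:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

# Write the file, then submit it in Webmaster Tools next to the main one.
ET.ElementTree(urlset).write("sitemap-moved.xml",
                             encoding="utf-8", xml_declaration=True)
```

A side benefit of keeping the moved URLs in their own file is that Webmaster Tools reports submitted-versus-indexed counts per sitemap, so you can watch exactly how quickly the new paths get picked up.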
How can I make a list of all URLs indexed by Google?
I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap. The site should have been 3500-ish pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google. Anyone? (I basically want to build a sitemap with all the indexed spider trap URLs, then set up 301s on those, then ping Google with the "defective" sitemap so they can see what the site really looks like and remove those URLs, shrinking the site back to around 3500 pages. One log-based way to approximate the list is sketched below.)
Intermediate & Advanced SEO | Bryggselv.no0 -
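Scraping the site: results out of Google violates their terms of service, so there is no clean way to export the index directly. One grounded approximation, sketched below, is to mine the server access logs for every URL Googlebot has requested; crawled is not the same as indexed, and the user agent can be spoofed, but it surfaces spider-trap URL patterns quickly. The log path and combined log format are assumptions.

```python
# A rough sketch: list every URL Googlebot has requested, according to the
# access logs. Assumptions: combined-format logs at the placeholder path.
# Crawled is not the same as indexed, and the UA string can be spoofed,
# but this surfaces spider-trap URL patterns quickly.
import re

LOG_PATH = "access.log"  # placeholder
request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[^"]*"')

crawled = set()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = request_re.search(line)
        if match:
            crawled.add(match.group(1))

print(len(crawled), "distinct URLs requested by Googlebot")
for path in sorted(crawled):
    print(path)
```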
URL Parameter Being Improperly Crawled & Indexed by Google
Hi All, We just discovered that Google is indexing a subset of our URLs embedded with our analytics tracking parameter. For the search "dresses" we are appearing in position 11 (page 2, rank 1) with the following URL: www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp?cm_mmc=Email--Anthro_12--070612_Dress_Anthro-_-shop You'll note that "cm_mmc=Email" is appended. This is causing our analytics (CoreMetrics) to mis-attribute this traffic and revenue to Email vs. SEO. A few questions:
1) Why is this happening? This is an email from June 2012 and we don't have an email-specific landing page embedded with this parameter. Somehow Google found and indexed this page with these tracking parameters. Has anyone else seen something similar happening?
2) What is the recommended method of "politely" telling Google to index the version without the tracking parameters? Some thoughts on this:
a. Implement a self-referencing canonical on the page.
- This is done, but we have some technical issues with the canonical due to our ecommerce platform (ATG). Even though the page source code looks correct, Googlebot is seeing the canonical with a JSession ID (a quick way to check this is sketched below).
b. Resubmit both URLs via the WMT Fetch feature, hoping that Google recognizes the canonical.
- We did this, but given the canonical issue it won't be effective until we can fix it.
c. URL parameter handling change in WMT.
- We made this change, but it didn't seem to fix the problem.
d. 301 or noindex the version with the email tracking parameters.
- This seems drastic, and I'm concerned that we'd lose ranking on this very strategic keyword. Thoughts? Thanks in advance, Kevin0
Intermediate & Advanced SEO | kevin_reyes -
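Not part of the original question, but a quick way to see which canonical URL a Googlebot-style request actually receives, and whether a session ID is leaking into it. The URL is a placeholder, and the regex assumes the rel attribute appears before href in the link tag.

```python
# A quick sketch, not from the thread: fetch a page with a Googlebot-style
# User-Agent and report the canonical it serves. The URL is a placeholder,
# and the regex assumes rel appears before href in the link tag.
import re
import urllib.request

URL = "https://www.example.com/dresses.jsp?cm_mmc=Email--test"  # placeholder
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

req = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")

match = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
    html, re.IGNORECASE)
canonical = match.group(1) if match else None

print("Canonical served:", canonical)
if canonical and "jsessionid" in canonical.lower():
    print("Warning: session ID is leaking into the canonical URL")
```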
Removing Dynamic "noindex" URLs from the Index
6 months ago my client's site was overhauled, and the user-generated searches had an index tag on them. I switched that to noindex but didn't act fast enough to avoid hundreds of pages being indexed in Google. It's been months since switching to the noindex tag and the pages are still indexed. What would you recommend? Google crawls my site daily, but never the pages that I want removed from the index. I am trying to avoid submitting hundreds of these dynamic URLs to the removal tool in Webmaster Tools. Suggestions? (A quick way to verify the noindex is actually being served is sketched below.)
Intermediate & Advanced SEO | BeTheBoss0 -
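One point worth verifying before anything else: Google can only see a noindex tag on pages it is allowed to crawl, so if robots.txt blocks these search URLs, the tag will never be read and the pages will linger in the index. Below is a small sketch (not from the thread) to confirm a page really serves noindex, via either the meta robots tag or an X-Robots-Tag header; the URL is a placeholder.

```python
# A small sketch, not from the thread: confirm a page actually serves
# "noindex", via the meta robots tag or an X-Robots-Tag header. The URL
# is a placeholder for one of the user-generated search pages.
import re
import urllib.request

URL = "https://www.example.com/search?q=widgets"  # placeholder

with urllib.request.urlopen(URL) as resp:
    header = resp.headers.get("X-Robots-Tag", "")
    html = resp.read().decode("utf-8", errors="replace")

meta = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)',
    html, re.IGNORECASE)

print("X-Robots-Tag header:", header or "(none)")
print("Meta robots tag:", meta.group(1) if meta else "(none)")
```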
Why are our site's top landing pages URLs that no longer exist and return 404 errors?
Digging through analytics today, I noticed that our site's top landing pages are pages that were part of the old www.towelsrus.co.uk website taken down almost 12 months ago. All these pages had 301 redirects, which were removed a few months back, but they still have not dropped out of Google's crawl error logs. I can't understand why this is happening, but the bounce rate on these pages (100%) almost certainly means we are losing potential conversions. How can I identify what keywords and links people are using to land on these pages? (One log-based approach is sketched below.)
Intermediate & Advanced SEO | Towelsrus0 -
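Not from the thread itself, but one way to answer the last question: pull the referrers for every 404 response out of the server access logs. External referrers show which links still point at the dead URLs, and where a search engine still passes the query string in the referrer, the keyword can be read straight out of it. The log path and combined log format are assumptions.

```python
# A rough sketch, assuming combined-format access logs at a placeholder
# path: count the referrers that send visitors to URLs now returning 404.
import collections
import re

LOG_PATH = "access.log"  # placeholder
# captures: request path, status code, referrer
line_re = re.compile(r'"\w+ (\S+) [^"]*" (\d{3}) \S+ "([^"]*)"')

referrers = collections.Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = line_re.search(line)
        if match and match.group(2) == "404":
            referrers[(match.group(1), match.group(3))] += 1

for (path, referrer), hits in referrers.most_common(20):
    print(hits, path, "<-", referrer or "(direct)")
```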
How to handle URL parameters in Google Webmaster Tools
Hi there! I have a new website with many duplicate meta titles and descriptions because of expanded features of the e-commerce shopping cart I am using, like the mobile website, product sorting, etc. Aside from canonical tags, is it advisable to use the URL parameters feature in Google Webmaster Tools to disallow crawling of the mobile website and other parameters like "parent", "catalogsetview", "pcsid", "pg" and "mode"? I'd appreciate any advice. 🙂 Thanks!
Intermediate & Advanced SEO | paumer80 -
What's the best .NET blog solution?
I asked our developers to implement a WordPress blog on our site and they feel that the technology stack that is required to support WP will interfere with a number of different .NET production applications on that server. I can't justify another server just because of a blog either. They want me to find a .NET blog solution. The only thing that looks decent out there is dotnetblogengine.net. Has anyone had any experience with this tool or any others like it? Thanks, Alex
Intermediate & Advanced SEO | dbuckles1 -
Questions regarding Google's "improved handling of URLs with parameters"
Google recently posted about improved handling of URLs with parameters: http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html I have a couple of questions: Is it better to canonicalize URLs or use parameter handling? Will Google inform us if it finds a parameter issue? Or should we prepare a list of parameters that should be addressed?
Intermediate & Advanced SEO | nicole.healthline0