Indexing/Sitemap - I must be wrong
-
Hi All,
I would guess that a great number of us, new to SEO or not, share some simple beliefs about Google indexing and sitemaps, and as a result get confused by what Webmaster Tools shows us.
It would be great if someone with experience/knowledge could clear this up once and for all.
Common beliefs:
-
Google will crawl your site from the top down, following each link and recursively repeating the process until it bottoms out/becomes cyclic.
-
A Sitemap can be provided that outlines the definitive structure of the site, and is especially useful for links that may not be easily discovered via crawling.
-
In the Sitemaps section of Google Webmaster Tools, the "pages indexed" figure shows the number of pages from your sitemap that Google considers worth indexing.
-
If you place a rel="canonical" tag on every page pointing to the definitive version, you will avoid duplicate content and aid Google in its indexing endeavour.
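For clarity, by a rel="canonical" tag I mean something like the line below - a minimal sketch with a made-up URL, placed in the head of every URL variant of a page:

    <!-- in the <head> of each duplicate/variant URL of this page -->
    <link rel="canonical" href="http://www.example.com/products/blue-widget" />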
These preconceptions seem fair, but must be flawed.
Our site has 1,417 pages, as listed in our sitemap. Google's tools tell us there are no issues with this sitemap, yet a mere 44 are indexed! We also submit 2,716 images (because we create all our own product images), and a disappointing zero of those are indexed.
Under Health -> Index Status in Webmaster Tools, we apparently have 4,169 pages indexed. I tend to assume these are largely old pages that now yield a 404 if visited.
It could be that Google's indexed figure of 44 means "pages indexed by virtue of your sitemap, i.e. pages we didn't find by crawling - so thanks for that", but despite trawling through Google's help, I don't really get that impression.
This is basic stuff, but I suspect a great number of us struggle to understand the disparity between our expectations and what Webmaster Tools reports, and we go on either to ignore an important problem or to waste time on non-issues.
Can anyone shine a light on this once and for all?
If you are interested, our sitemap looks like this:
http://www.1010direct.com/Sitemap.xml
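In case it helps anyone reading, the entries in it follow the standard sitemap protocol with the image extension - roughly like the sketch below, although the URLs and values here are made up rather than copied from our actual file:

    <!-- requires xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" on the <urlset> element -->
    <url>
      <loc>http://www.example.com/products/blue-widget</loc>
      <lastmod>2013-02-01</lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.5</priority>
      <image:image>
        <image:loc>http://www.example.com/images/blue-widget-large.jpg</image:loc>
        <image:title>Blue widget product shot</image:title>
      </image:image>
    </url>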
Many thanks
Paul
-
-
The 44 relates to the number of indexed pages whose URLs match those in your sitemap - it is not everything that is indexed. Your old site is still indexed and being found; as Google revisits those old pages and gets redirected to the new ones, it is likely that the 44 will increase and the number of old indexed pages will decrease.
Google doesn't index sites in a single one-off pass, because then it might take, say, four months to come back and index again - and if you had an important new page that attracted lots of links but wasn't indexed or ranked because Google hadn't visited yet, you wouldn't be happy. Also, if this were done for every site it would take forever and consume far more resources than even Google has. It is annoying, but you've just got to grin and bear it - at least your old site is still ranking and being found.
-
Thanks Andy,
What I don't get is why Google would index in this way. I can understand why they would weight the importance of a page based on the number and strength of incoming links, but not why that should decide whether to index it at all when they are led straight to it by a sitemap.
I just get a little frustrated when Google offers you seemingly definitive stats, only for them to turn out so vague and mysterious that they have little to no value. We should have 1,400+ pages indexed, and we clearly have more than 44 indexed... so what on earth does the number 44 relate to?
-
I think that since your sitemap reflects your new URLs, and that is what the sitemap figure is based on, you are likely to have more indexed than it suggests. I would suggest going to Index Status under Health in GWT and clicking "Total indexed" and "Ever crawled" - that may help clear this up.
-
I experienced this issue with sandboxed websites.
Market your products and in a few months every page should be in Google's index.
Cheers.
-
Thanks for the quick responses.
We had a bit of a URL reshuffle recently to make the URLs a little more informative and to stop every page URL terminating in "product.aspx". But that was around a month ago. Prior to that, we were around 40% indexed for pages (from the Sitemaps section of WM Tools), and always zero for images.
So given that we clearly have more than 44 pages indexed by Google, what do you think that figure actually means?
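As a side note for anyone doing a similar reshuffle on an ASP.NET site: the usual advice is to permanently (301) redirect each old URL to its new equivalent so the old indexed pages pass over rather than 404. With the IIS URL Rewrite module that can be done in web.config - a minimal sketch with hypothetical patterns, not our real rules:

    <system.webServer>
      <rewrite>
        <rules>
          <!-- hypothetical example: old /product.aspx?id=123 redirects to the new descriptive URL -->
          <rule name="Old product URL" stopProcessing="true">
            <match url="^product\.aspx$" />
            <conditions>
              <add input="{QUERY_STRING}" pattern="^id=123$" />
            </conditions>
            <action type="Redirect" url="http://www.example.com/widgets/blue-widget"
                    redirectType="Permanent" appendQueryString="false" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>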
-
Dealing with your indexing issue first: how soon those pages may be indexed depends on when you submitted them. I say "may" because a sitemap (yes, answering another of your questions) is just an indicator of "I have these pages"; it does not mean they will be indexed - indeed, unless you have a small website, you will never have 100% indexation in my experience.
Spiders (search robots) visit and index a page by following a link to it - links from around the web, or from within the site itself. The more links from around the web, the quicker you will get indexed (this explains why, if you have 10,000 pages, you will never get a link from other websites to every one of them, and so they won't all get indexed). It also means a page that gets a ton of links will be indexed sooner than one with just a single link - assuming all links are equal (which they aren't).
Spiders are not cyclic in their searching; it's very ad hoc, based on links within your site and on other sites linking to you. A spider won't be sent to crawl every page on your site - it does a small amount at a time, which is likely why 44 pages are indexed and not more at this point.
A sitemap is (as I say) an indicator of the pages in your site, their importance, and when they were updated or created. It's not really a definitive structure - it's more of a reference guide. Think of it as you being the guide on a bus tour of a city and the search engine being your passenger: you point out places of interest, and every so often it sees something it wants to see and gets off to look, but it may take many trips to get off at every stop.
Finally, canonicals are a great way to clear up duplicate content issues. They aren't 100% successful, but they do help - especially if you are using dynamic URLs (such as paginated category pages).
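To make that concrete, the idea with dynamic URLs is that every parameterised variant of a page points its canonical at the single clean URL you want indexed - a rough sketch with made-up URLs:

    <!-- on e.g. /category.aspx?id=12&sort=price-asc -->
    <link rel="canonical" href="http://www.example.com/category/widgets" />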
Hope that helps.
-
I see your frustration. How long ago did you submit these sitemaps? Are we talking a couple of weeks, or just a day or two? As I've seen myself, Google is not that fast at updating the number of pages indexed (definitely not within GWT). In my experience it usually takes a few days to a week before Google shows a big jump in the number of pages indexed.