Indexing/Sitemap - I must be wrong
-
Hi All,
I would guess that a great number of us, new to SEO or not, share some simple beliefs about Google indexing and Sitemaps, and as such get confused by what Webmaster Tools shows us.
It would be great if someone with experience/knowledge could clear this up once and for all.
Common beliefs:
-
Google will crawl your site from the top down, following each link and recursively repeating the process until it bottoms out/becomes cyclic.
-
A Sitemap can be provided that outlines the definitive structure of the site, and is especially useful for links that may not be easily discovered via crawling.
-
In Google Webmaster Tools, the "pages indexed" figure in the Sitemaps section shows the number of pages in your sitemap that Google considers worthwhile indexing.
-
If you place a rel="canonical" tag on every page pointing to the definitive version, you will avoid duplicate content and aid Google in its indexing endeavour.
These preconceptions seem fair, but must be flawed.
Our site has 1,417 pages as listed in our Sitemap. Google's tools tell us there are no issues with this sitemap, yet a mere 44 are indexed! We submit 2,716 images (because we create all our own product images), and a disappointing zero are indexed.
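For anyone wondering how the images are submitted: they go through the standard image sitemap extension, so each <url> entry carries the page plus its images. An entry looks roughly like the following (URLs simplified for illustration, not our literal entries):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <!-- the page itself -->
        <loc>http://www.1010direct.com/some-product</loc>
        <lastmod>2012-09-01</lastmod>
        <!-- one image:image block per image shown on that page -->
        <image:image>
          <image:loc>http://www.1010direct.com/images/some-product.jpg</image:loc>
          <image:title>Some product</image:title>
        </image:image>
      </url>
    </urlset>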
Under Health -> Index Status in Webmaster Tools, we apparently have 4,169 pages indexed. I tend to assume these are old pages that now yield a 404 if they are visited.
It could be that Google's indexed figure of 44 means "pages indexed by virtue of your sitemap, i.e. pages we didn't find by crawling - so thanks for that", but despite trawling through Google's help, I don't really get that impression.
This is basic stuff, but I suspect a great number of us struggle to understand the disparity between our expectations and what WM Tools reports, and we go on to either ignore an important problem or waste time on non-issues.
Can anyone shine a light on this once and for all?
If you are interested, our map looks like this:
http://www.1010direct.com/Sitemap.xml
Many thanks
Paul
-
The 44 relates to the number of indexed pages whose URLs match those in your sitemap - it is not everything that is indexed. Your old site is still indexed and being found; as Google visits those old pages and gets redirected to the new ones, that number will likely increase (from 44) and the number of old indexed pages will decrease.
Google doesn't index sites in a single one-off pass, because then it might take, say, 4 months to come back and index again - and if you had a new, important page that picked up lots of links, you wouldn't be happy if it went unindexed and unranked simply because Google hadn't visited yet. Also, if this were done for every site it would take forever and consume far more resources than even Google has. It is annoying, but you've just got to grin and bear it - at least your old site is still ranking and being found.
-
Thanks Andy,
What I don't get is why Google would index in this way. I can understand why they would weight the importance of a page based on the number/strength of incoming links, but not why they would decide against indexing it at all when led to it by a sitemap.
I just get a little frustrated when Google offers seemingly definitive stats, only for them to turn out so vague and mysterious that they have little to no value. We should have 1,400+ pages indexed; we clearly have more than 44 indexed... so what on earth does the number 44 relate to?
-
I think that as your sitemap reflects your new URLs, and that is what the sitemap figure is based on, you are likely to have more indexed than it suggests. I would suggest going to "Index Status" under Health in GWT and clicking "Total indexed" and "Ever crawled"; this may help clear things up.
-
I experienced this issue with sandboxed websites.
Market your products and in a few months every page should be in Google's index.
Cheers.
-
Thanks for the quick responses.
We had a bit of a URL reshuffle recently to make the URLs a little more informative and to stop every page URL terminating in "product.aspx". But that was around a month ago. Prior to that, we were around 40% indexed for pages (per the Sitemaps section of WM Tools), and always zero for images.
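In case it's relevant: the old URLs do redirect to the new ones, as Andy noted. For anyone facing a similar reshuffle on ASP.NET/IIS, the usual approach is a permanent (301) rule in web.config via the URL Rewrite module - a minimal sketch, with the match pattern invented purely for illustration (the real mapping will differ per site):

    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <!-- 301 old .aspx product URLs to the new friendly URLs -->
            <rule name="OldProductRedirect" stopProcessing="true">
              <match url="^products/([^/]+)/product\.aspx$" />
              <action type="Redirect" url="products/{R:1}"
                      redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>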
So given that we clearly have more than 44 pages indexed by Google, what do you think that figure actually means?
-
Dealing with your indexing issue first: how soon those pages may be indexed depends on when you submitted them. I say "may" because a sitemap (yes, answering another question) is just an indicator of "I have these pages"; it does not mean they will be indexed. Indeed, unless you have a small website, you will never see 100% indexation, in my experience.
Spiders (search robots) visit and index a website or page via links. They follow links to a page from around the web, or from within the site itself. The more links from around the web, the quicker you will get indexed. (This explains why, if you have 10,000 pages, you will never get an external link to every one of them, and so they won't all get indexed.) It also means a page that gets a ton of links will be indexed sooner than one with just a single link - assuming all links are equal (which they aren't).
Spiders are not cyclic in their searching; it's very ad hoc, based on links within your site and from other sites linking to you. A spider won't be sent to crawl every page on your site - it does a small amount at a time, which is likely why 44 pages are indexed and not more at this point.
A sitemap is (as I say) an indicator of the pages in your site, their importance, and when they were updated or created. It's not really a definitive structure - it's more of a reference guide. Think of yourself as the guide on a bus tour of a city, with the search engine as your passenger: you point out places of interest, and every so often it sees something it wants to see and gets off to look, but it may take many trips to get off at every stop.
Finally, canonicals are a great way to clear up duplicate content issues. They aren't 100% successful, but they do help - especially if you are using dynamic URLs (such as paginated category pages); see the example below.
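To illustrate the mechanics (URLs invented for the example): every variant of a page - sort orders, tracking parameters, and so on - carries one tag in its <head> naming the URL you consider definitive, so a page served at http://www.example.com/widgets?sort=price could declare:

    <head>
      <!-- tells search engines which URL is the definitive version of this page -->
      <link rel="canonical" href="http://www.example.com/widgets" />
    </head>

The duplicate variants then consolidate to that one URL in the index instead of competing with each other.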
Hope that helps.
-
I see your frustration. How long ago did you submit these sitemaps - are we talking a couple of weeks, or just a day or two? As I've seen myself, Google is not that fast at updating the number of pages indexed (definitely not within GWT). Mostly it takes a few days to a week before Google largely increases the number of pages indexed.