Clarification on indexation of XML sitemaps within Webmaster Tools
-
Hi Mozzers,
I have a large service-based website, which seems to be losing pages from Google's index. Whilst working on the site, I noticed that there are a number of XML sitemaps, one for each of the services. So I submitted them to Webmaster Tools last Friday (14th), and when I left they were "pending".
On returning to the office today, they all appear to have been successfully processed on either the 15th or 17th and I can see the following data:
13/08 - Submitted=0 Indexed=0
14/08 - Submitted=606,733 Indexed=122,243
15/08 - Submitted=606,733 Indexed=494,651
16/08 - Submitted=606,733 Indexed=517,527
17/08 - Submitted=606,733 Indexed=517,498
Question 1: The indexed count of 122,243 on the 14th - is this how many pages were already indexed, before Google processed the sitemaps? They weren't marked as processed until the 15th and 17th.
Question 2: The indexed count is already slipping. I'm working on fixing the site by reducing pages and improving internal structure and content, which I'm hoping will fix the crawling issue. But how often will Google crawl these XML sitemaps?
Thanks in advance for any help.
-
Hi again
This means that because you have multiple sitemaps, Google may crawl them at different times and at different rates, hence some of your sitemaps taking a day longer to process. I really wouldn't read too much into it; rest assured that Google is crawling your sitemaps and indexing them fine.
If you notice major discrepancies between what you submitted and what's being indexed, then I would refer to this Google resource on how to fix issues or errors you find in your sitemap crawl.
Hope this helps! Good luck!
-
Hi there
You submitted on the 13th, when there were 0 pages indexed. The next day there were 122,243, so in that time period, Google indexed 122,243 of your site's pages.
This is a day-by-day process. Whatever new number appears each day, subtract the previous day's total from the current day's total, and that's how many pages were freshly indexed.
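That day-over-day subtraction can be sketched as follows, using the counts from the question above (the dates and figures are taken from the thread; note a negative delta means pages dropped out of the index):

```python
# Daily "Indexed" totals as reported in Webmaster Tools (from the question above)
indexed_totals = {
    "13/08": 0,
    "14/08": 122_243,
    "15/08": 494_651,
    "16/08": 517_527,
    "17/08": 517_498,
}

# Subtract each day's total from the next day's to get the fresh-index delta
dates = list(indexed_totals)
deltas = {
    curr: indexed_totals[curr] - indexed_totals[prev]
    for prev, curr in zip(dates, dates[1:])
}

for date, delta in deltas.items():
    print(f"{date}: {delta:+,} pages")
```

Run as-is, this shows a large jump on the 15th and a small negative delta on the 17th, which matches the "slipping" the question describes.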
Hope this helps! Good luck!
-
Just checked Webmaster Tools again, and now the sitemaps all say processed on the 17th, with some saying the 18th (today). Does this mean the sitemaps are being processed by Google every couple of days?
-
Hi Patrick,
Thanks for elaborating on question 2.
For question 1, I asked whether that number (122,243) was how many pages were in the index **before** Google processed the sitemaps, as they don't appear to have been processed until the following day.
You answered yes, but then said it's how many pages were processed that day?
Thanks again for your time and clarification.
-
Hi there
Question 1 - Yes, this is how many pages Google indexed from your sitemap on that day.
Question 2 - XML sitemaps allow you to tell Google the change frequency of your URLs - you can learn more about that here. Also, according to Google:
Google's spiders regularly crawl the web to rebuild our index. Crawls are based on many factors such as PageRank, links to a page, and crawling constraints such as the number of parameters in a URL. Any number of factors can affect the crawl frequency of individual sites.
Our crawl process is algorithmic; computer programs determine which sites to crawl, how often, and how many pages to fetch from each site. We don't accept payment to crawl a site more frequently. For tips on maintaining a crawler-friendly website, please visit our Webmaster Guidelines.
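To illustrate the change-frequency hint mentioned above, a minimal sitemap entry might look like the following sketch (the URL and dates are placeholders; Google treats `<changefreq>` as a hint rather than a directive):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Placeholder URL for one service page -->
    <loc>http://www.example.com/services/example-service/</loc>
    <!-- When the page content last changed -->
    <lastmod>2012-08-17</lastmod>
    <!-- Hint to Google about how often this page changes -->
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```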
Please let me know if you have any further questions or comments.
Hope this helps! Good luck!