Google Webmaster Tools: sitemap.xml not processed every day
-
Hi,
We have multiple sites under our Google Webmaster Tools account, each with a sitemap.xml submitted.
Each site's sitemap.xml status (attached below) shows it is processed every day, except for one:

Sitemap: /sitemap.xml
This Sitemap was submitted Jan 10, 2012, and processed Oct 14, 2013.

For that one site (coed.com), the sitemap.xml gets processed only on the day it is submitted, and we have to manually resubmit it every day to get it processed. Any idea why that might be?
thank you
-
My initial reaction was that this is more likely a technical issue than something Google is doing - checking the load time is a good idea. Make sure the sitemap validates and there's nothing odd about it. If you manually re-submit it, does it seem to take?
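For a quick validation pass, something like the following sketch can fetch the sitemap and confirm it is well-formed XML with the expected root element and some `<loc>` entries. The URL is a placeholder, not the real sitemap location.

```python
# Minimal sitemap well-formedness check; the URL below is a placeholder.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

resp = requests.get(SITEMAP_URL, timeout=30)
resp.raise_for_status()

root = ET.fromstring(resp.content)
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# The root should be <urlset> (or <sitemapindex> for an index file).
if root.tag not in (f"{ns}urlset", f"{ns}sitemapindex"):
    print(f"Unexpected root element: {root.tag}")

locs = [loc.text for loc in root.iter(f"{ns}loc")]
print(f"Parsed OK: {len(locs)} <loc> entries")
# Spot-check a few URLs for obvious problems (wrong host, http vs https, etc.)
for url in locs[:5]:
    print(" ", url)
```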
-
Just so I am clear: had you been waiting and Google finally processed it, or was it sitting there until someone took an action that caused Google to process it?
I am surprised that nothing happened for nearly two years. Has the site had traffic, etc.? Any warnings, manual actions, etc.?
Thanks
-
Thanks Paul,
On the custom crawl settings: I just verified them, and they are the same across all our sites.
Yes, the sitemap is dynamic, but it is rendered from a cache that refreshes when new content gets published (a rough sketch of that kind of setup is below). I will check on the load time.
thank you
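Purely for illustration, here is a minimal sketch of the cache-backed rendering described above, written with Flask. The real site may be on an entirely different stack (e.g. WordPress), so `build_sitemap_xml()` and `on_content_published()` are hypothetical stand-ins, not the actual implementation.

```python
# Illustrative sketch only: serve a dynamic sitemap from an in-memory cache
# that is rebuilt when new content is published. All helpers are hypothetical.
from flask import Flask, Response

app = Flask(__name__)
_cached_sitemap = None  # rebuilt on publish, served to crawlers from memory


def build_sitemap_xml() -> str:
    """Hypothetical helper: query the CMS for published URLs and render XML."""
    urls = ["https://www.example.com/"]  # placeholder content
    entries = "".join(f"<url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            + entries + "</urlset>")


def on_content_published() -> None:
    """Call from the publish hook so the cached copy never goes stale."""
    global _cached_sitemap
    _cached_sitemap = build_sitemap_xml()


@app.route("/sitemap.xml")
def sitemap():
    global _cached_sitemap
    if _cached_sitemap is None:  # cold cache: build on first request
        _cached_sitemap = build_sitemap_xml()
    return Response(_cached_sitemap, mimetype="application/xml")
```

If the cache can go cold (server restart, expiry), the first crawler request after that has to pay the full rebuild cost, which could explain a very slow fetch.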
-
Thanks Robert,
All our sites are definitely getting indexed, but for one site (coed.com) GWT says the sitemap.xml was processed only on the day it was submitted, while the other sites' (collegecandy.com and bustedcoverage.com) sitemap.xml was getting processed every day.
The sitemap.xml on all our sites updates automatically when new content gets published, so I believe it should be processed every day.
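As a stopgap while the underlying cause is investigated, the daily resubmission can be automated instead of done by hand: Google's sitemap ping endpoint accepts a plain GET request, so the publish hook can notify Google whenever content goes live. A minimal sketch, assuming the publish pipeline can make outbound HTTP requests; the sitemap URL is a placeholder.

```python
# Sketch: ping Google's sitemap endpoint from the publish hook instead of
# resubmitting manually in Webmaster Tools. The sitemap URL is a placeholder.
from urllib.parse import quote
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder


def ping_google(sitemap_url: str) -> bool:
    """Ask Google to re-fetch the sitemap; True on an HTTP 200 response."""
    ping_url = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
    resp = requests.get(ping_url, timeout=30)
    return resp.status_code == 200


if __name__ == "__main__":
    print("Ping accepted" if ping_google(SITEMAP_URL) else "Ping failed")
```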
-
Any chance there's a custom crawl setting that's been added to your Google Webmaster Tools, Robert? Some devs do this during development, or it can happen by accident.
Also, your sitemap takes a ridiculously long time to load - well over 15 seconds for me, and over 18 seconds using webpagetest.org. It could be that Google simply isn't waiting for the page to load when it tries to visit. If the sitemap is being generated dynamically, you may have a rendering problem; otherwise, something is borked when a 50 KB file takes that long.
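Load time is easy to measure from a script as well as from webpagetest.org; a few repeated fetches help, since a dynamically generated sitemap can be fast from cache and very slow on a cache miss. The URL below is a placeholder.

```python
# Rough load-time check: time the sitemap fetch a few times in a row.
import time
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

for attempt in range(3):
    start = time.monotonic()
    resp = requests.get(SITEMAP_URL, timeout=60)
    elapsed = time.monotonic() - start
    print(f"Attempt {attempt + 1}: HTTP {resp.status_code}, "
          f"{len(resp.content) / 1024:.1f} KB in {elapsed:.1f}s")
```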
You might also want to try submitting it through Bing Webmaster Tools and see, for comparison, whether they are able to process it consistently.
Bit of a head-scratcher. Hope that gives you a starting point.
Paul
-
Robert,
There is no screenshot attached, but I am unaware of sitemaps being processed daily by search engines. What are you trying to achieve by continuously resubmitting the sitemap?
The site is indexed, correct? And when you look at crawl stats, is it showing the site being crawled on a semi-regular basis? Google does not process your sitemap every day.
Hope that helps,
Robert