Google Webmaster Tools: sitemap.xml not processed every day
-
Hi,
We have multiple sites under our Google Webmaster Tools account, each with a sitemap.xml submitted.
Each site's sitemap.xml status (attached below) shows it is processed every day, except for one:
Sitemap: /sitemap.xml. This Sitemap was submitted Jan 10, 2012, and processed Oct 14, 2013.
For that one site (coed.com), the sitemap.xml was processed only on the day it was submitted, and we have to manually resubmit it every day to get it processed. Any idea why that might be?
thank you
-
My initial reaction is that this is more likely a technical issue than something Google is doing; checking the load time is a good idea. Make sure the sitemap validates and there's nothing odd about it. When you manually resubmit it, does it seem to take?
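The "make sure the sitemap validates" check can be done offline with a short script. A minimal sketch using only the Python standard library (the function names here are illustrative, not from any particular tool):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Sitemap protocol namespace, in ElementTree's {uri}tag notation.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def validate_sitemap_bytes(body):
    """Basic sanity checks on raw sitemap XML: well-formed, correct
    namespace, and at least one <loc> entry. Returns the URL count."""
    root = ET.fromstring(body)  # raises ParseError if the XML is malformed
    if root.tag != SITEMAP_NS + "urlset":
        raise ValueError("unexpected root element: %s" % root.tag)
    locs = root.findall(SITEMAP_NS + "url/" + SITEMAP_NS + "loc")
    if not locs:
        raise ValueError("sitemap contains no <loc> entries")
    return len(locs)

def validate_sitemap(url):
    """Fetch a live sitemap and run the same checks on it."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return validate_sitemap_bytes(resp.read())
```

This only catches structural problems; a sitemap can be well-formed XML and still list URLs that redirect or 404, so spot-checking a few entries by hand is still worthwhile.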
-
Just so I am clear: had you been waiting and Google finally processed it, or was it sitting there until someone took an action that caused Google to process it?
I am surprised that nothing happened for nearly two years. Has the site had traffic, etc.? Any warnings, manual actions, etc.?
Thanks
-
Thanks Paul,
On the custom crawl settings, I just verified that they are the same across all our sites.
Yes, the sitemap is dynamic, but it is rendered via a cache that refreshes when new content gets published. I will check on the load time.
thank you
-
Thanks Robert,
All our sites are getting indexed for sure, but for one site (coed.com) GWT says the sitemap.xml was processed only on the day it was submitted, while the other sites' (collegecandy.com and bustedcoverage.com) sitemap.xml files were getting processed every day.
Our sitemap.xml on all sites updates automatically when new content gets published, so I believe it needs to be processed every day.
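At the time of this thread, Google also supported a "ping" endpoint for notifying it that a sitemap had changed, which could replace manual resubmission through the GWT interface. A rough sketch, assuming that endpoint (the helper names are my own):

```python
import urllib.parse
import urllib.request

PING_ENDPOINT = "http://www.google.com/ping?sitemap="

def build_ping_url(sitemap_url):
    """Build the ping URL; the sitemap URL must be percent-encoded."""
    return PING_ENDPOINT + urllib.parse.quote(sitemap_url, safe="")

def ping_sitemap(sitemap_url):
    """Notify Google that the sitemap changed; True on HTTP 200."""
    with urllib.request.urlopen(build_ping_url(sitemap_url), timeout=10) as resp:
        return resp.status == 200
```

Hooking a call like `ping_sitemap("http://coed.com/sitemap.xml")` into the publish step would automate what is currently being done by hand, though it does not explain why only one site's sitemap stalls.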
-
Any chance there's been a custom crawl setting accidentally added to your Google Webmaster Tools, Robert? Some devs do this during development, or it can happen accidentally.
Also, your sitemap takes a ridiculously long time to load: well over 15 seconds for me, and over 18 seconds using webpagetest.org. It could be that Google simply isn't waiting for the page to load when it tries to visit. If the sitemap is being generated dynamically, you may have a rendering problem. Otherwise, something is borked when a 50 KB file takes that long.
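The load-time check is easy to repeat locally. A minimal stand-in using only the Python standard library (a single-threaded fetch, so it only approximates what a crawler experiences):

```python
import time
import urllib.request

def time_fetch(url, timeout=30):
    """Download a URL and return (seconds elapsed, bytes received)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.monotonic() - start, len(body)
```

Anything in double-digit seconds for a roughly 50 KB XML file points at server-side generation time rather than transfer time, which is consistent with a dynamic sitemap being rebuilt on every request instead of served from the cache.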
You might also want to try submitting it through Bing Webmaster Tools to see whether they are better able to index it consistently, for comparison.
Bit of a head-scratcher. Hope that gives you a starting point.
Paul
-
Robert,
There is no screenshot attached, but I am not aware of sitemaps being processed daily by search engines. What are you trying to achieve by continuously resubmitting the sitemap?
The site is indexed, correct? And when you look at crawl stats, does it show the site being crawled on some semi-regular basis? Google does not process your sitemap every day.
Hope that helps,
Robert
Related Questions
-
Getting Recrawled by Google
I have been updating my site a lot and some of the updates are showing up in Google and some are not. Is there a best practice in getting your site fully recrawled by Google?
Technical SEO | ShootTokyo0
-
When do you use 'Fetch as Google' in Google Webmaster Tools?
Hi, I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or the main URL only. I've googled it but got more confused. I'd appreciate it if you could help. Thanks
Technical SEO | Rubix1
-
Will an XML sitemap override a robots.txt?
I have a client with a robots.txt file that is blocking an entire subdomain, entirely by accident. Their original solution, not realizing the robots.txt error, was to submit an XML sitemap to get their pages indexed. I did not think this tactic would work, as the robots.txt should take precedence over the XML sitemap. But it worked... I have no explanation as to how or why. Does anyone have an answer to this, or any experience with a website that has had a clear Disallow: / for months, yet somehow has pages in the index?
Technical SEO | KCBackofen0
-
De-indexed from Google
Hi Search Experts! We are just launching a new site for a client with a completely new URL. The client cannot provide any access details for the existing site. Any ideas on how we can get the existing site de-indexed from Google? Thanks, guys!
Technical SEO | rikmon0
-
Persistent Unnatural Links in Webmaster tools
We were recently notified about unnatural links from two websites (totalling a few thousand links each). We went to the websites and asked them to remove the links, which they apparently did. After this we applied to Google for reconsideration, explaining the situation; however, they came back and said we still have links. We noticed there were still links, though fewer than before, so we once again asked the sites to remove all the links. Now we are sure all the links are gone, as when we click a random link and view the page source there is no reference to our site; however, Webmaster Tools is not updating the link list and claims we still have thousands of links. Do we have to file another reconsideration request to get them to re-crawl the sites and drop the links, or should it happen automatically?
Technical SEO | eXia0
-
Received a Google Webmaster Tools notice of detected unnatural links, but no negative impact on ranking or traffic. What should I do next?
Hello, on May 19, 2012, Google Webmaster Tools sent the notification "Google Webmaster Tools notice of detected unnatural links to" for both sites; neither site has lost any ranking or traffic as yet. I am worried. Should I panic? What should I be doing? Will the sites be going down anytime soon? How do I naturally build links?
Technical SEO | conversiontactics0
-
Sitemap.xml autogenerated by CMS is full of crud
Hi all, hope you can help. The Magento ecommerce system I'm working with autogenerates sitemap.xml; it's well formed, with priority and frequency parameters. However, it has generated lots of URLs pointing to broken pages that return fatal errors, duplicate URLs (not canonicals), 404s, etc. I'm thinking of hand-creating sitemap.xml: the site has around 50 main pages including products and categories, and I can get the main page URLs listed by Screaming Frog or Xenu. Then I'll have to get into hand-editing the crud pages with noindex, and the useful duplicates with canonicals. Is this the way to go, or is there another solution? Thanks in advance for any advice.
Technical SEO | k3nn3dy30
-
Google Webmaster Tools
I have linked Webmaster Tools to my Google Analytics account. My question is: where can I see Webmaster reports in Google Analytics?
Technical SEO | seoug_20050