Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not removing the content entirely - many posts will remain viewable - we have locked both new posts and new replies.
Is "last modified" time in XML Sitemaps important?
-
My tech lead is concerned that the script he uses to generate XML sitemaps for some client sites may be causing problems for those sites.
His concern centres on the fact that the script produces a sitemap reporting that every URL in the site was last modified at the exact same date and time. I have never heard that this could be a problem, but I do know that the generator I use for other client sites lets me choose whether or not to take the last-modified date from the server response.
What is the best way to generate the sitemap: with lastmod taken from the actual time each page was modified, or with every URL set to one date and time?
-
Glad to be of help, Sha.
-
Thanks Alan,
I will continue to use the server response setting when generating other sitemaps, and will recommend that in future our techs ditch the home-grown script that assigns a single date and time to every URL.
I must say also, it is great to have such clear and reliable advice - very glad to have you around!
Thanks again.
-
Sitemap.xml files are one of many "hints" search engines use to evaluate and classify the relevance, importance and freshness of individual pages and, in turn, an entire site.
When the file flags every page with the same date/time, it can have a negative impact purely as a single-point signal. If the pages themselves carry different date/time stamps at the HTML level, those contradict what the sitemap.xml file reports, and the search engine must either resolve the conflict or simply treat the signal as noise.
Any time search engines have a potential conflict to resolve, the potential for less than maximum value exists.
Because of these combined potential problems, SEO best practice is to resolve this issue, so as to ensure it does not in fact lead to problems, however minor they might be on a per-page basis. If resolving it would take an extensive amount of time, it is worth evaluating how important the issue is to overall SEO; at a certain point you cross into the realm of diminishing returns.
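To make the recommendation concrete, here is a minimal sketch of a generator that takes each <lastmod> from the page's actual modification time rather than stamping every URL with the script's run time. It assumes each URL maps to a file on disk; the domain and file paths are hypothetical.

```python
import os
from datetime import datetime, timezone

# Hypothetical mapping of site URLs to the files that back them
PAGES = {
    "https://www.example.com/": "public_html/index.html",
    "https://www.example.com/about/": "public_html/about/index.html",
}

def build_sitemap(pages):
    """Build sitemap XML where each <lastmod> is the file's real mtime."""
    entries = []
    for url, path in pages.items():
        # Actual last-modified time of the page, not the script's run time
        mtime = os.path.getmtime(path)
        lastmod = datetime.fromtimestamp(mtime, tz=timezone.utc).strftime("%Y-%m-%d")
        entries.append(
            f"  <url>\n    <loc>{url}</loc>\n"
            f"    <lastmod>{lastmod}</lastmod>\n  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

print(build_sitemap(PAGES))
```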
Related Questions
-
If I'm using a compressed sitemap (sitemap.xml.gz), that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL submitted to Google, Bing, etc. and the URL used in robots.txt should indicate that it's a compressed file - for example, "sitemap.xml.gz". Thanks!
Technical SEO | jgresalfi0
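For reference, the robots.txt Sitemap directive takes the full URL of the sitemap file, compressed or not. A minimal sketch with a hypothetical hostname:

```
User-agent: *
Disallow:

# Full URL of the exact file being served, including the .gz extension
Sitemap: https://www.example.com/sitemap.xml.gz
```
-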
Which sitemap to keep - HTTP or HTTPS (or both)?
Hi, Just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301 redirected all pages from HTTP to HTTPS. Changed the property in Google Analytics from HTTP to HTTPS and added the HTTPS version in Webmaster Tools. So far, so good. Now the question is: should I add the HTTPS version of the sitemap for the new HTTPS site in Webmaster Tools, or retain the existing HTTP one? Ideally, switching over completely to the HTTPS version by adding a new sitemap would make more sense, as the HTTP version of the sitemap would anyway now be redirected to HTTPS. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department. If I should add the HTTPS sitemap version to the new site, should I delete the old HTTP one, or is there no harm retaining it?
Technical SEO | ashishb010
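As background for the 301 step described above, one common Apache .htaccess pattern for forcing HTTPS site-wide looks like this - a sketch assuming Apache with mod_rewrite; other servers differ:

```apache
# Redirect every HTTP request to its HTTPS equivalent with a single 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```
-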
Schema markup for products is missing "price": Is this bad?
Hey guys, So a current client of mine has an e-commerce shop with a few hundred products. They purposely choose to keep prices off of their website, which is causing errors in Google Webmaster Tools. Basically the error shows:
Error: Structured Data > Product (markup: schema.org)
Error type: missing price
208 items with error
Is this a huge deal? Or are we allowed to have non-numerical prices in schema, e.g. "call for quote"?
Technical SEO | tbinga1
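For context, schema.org's Offer expects a numeric price alongside a priceCurrency; as far as I know, a non-numeric string like "call for quote" in the price field will not validate. A minimal valid sketch in JSON-LD (the product details are hypothetical):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/images/widget.jpg",
  "description": "A hypothetical product used only to illustrate the markup.",
  "offers": {
    "@type": "Offer",
    "price": "149.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```
-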
Is there a maximum sitemap size?
Hi all, Over the last month we've included all images, videos, etc. in our sitemap, and now its loading time is rather high (http://www.troteclaser.com/sitemap.xml). Is there a maximum sitemap size recommended by Google?
Technical SEO | Troteclaser0
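For reference, the sitemap protocol caps each file at 50,000 URLs plus a maximum uncompressed size (historically 10 MB, later raised to 50 MB); sites that exceed this split their sitemaps and list the parts in a sitemap index. A sketch with hypothetical file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap must itself respect the URL-count and size limits -->
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-images.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-videos.xml</loc>
  </sitemap>
</sitemapindex>
```
-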
How can I style long "list posts" in WordPress?
Hi All, I have been working on a list post which spans over 100 items. Each item on the list has a quick blurb to explain it, an image, and a few resource links. I am trying to find an attractive way to present this long list post in WordPress. I have seen several sites with long list posts; however, they place their items one on top of the other, which yields a VERY long page and a lot of scrolling for the end user. Others turn their lists into slideshows, but I have no data on how slides perform against 10-mile-long lists which load in one page. I would like to do something similar to what List25.com does, as they present about 5-10 items per page and have pagination. The pagination part I understand; however, is there a shortcode plugin to format lists attractively, just like List25?
Technical SEO | IvanC0
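One built-in worth knowing about: WordPress core can split a single post into paginated pages with the <!--nextpage--> quicktag (classic themes render the page links via wp_link_pages()), no plugin required. A sketch of the post content, with hypothetical items:

```html
<h2>1. First item</h2>
<p>Quick blurb, image and resource links for item one...</p>

<h2>2. Second item</h2>
<p>...</p>

<!--nextpage-->

<h2>3. Third item</h2>
<p>Everything after the tag above renders on page 2 of the same post.</p>
```
-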
"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Hey Moz, New client has a site that uses: subdomains ("third-level" stuff like location.business.com) and "fourth-level" subdomains (location.parent.business.com). Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when doing a crawl for business.com, except in the External tab - but maybe I'm just configuring the crawls incorrectly. These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
Technical SEO | jamesm5i0
-
How does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in particular:
- Google continuously crawls websites and stores each page it finds (let's call it the "page directory")
- Google's "page directory" is a cache, so it isn't the "live" version of the page
- Google has separate storage called "the index", which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords.
- When someone searches a keyword, that keyword is looked up in the "index" and returns all relevant pages in the "page directory"
- These returned pages are given ranks based on the algorithm

The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by understanding the search process better.
Technical SEO | reidsteven750
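What the question describes is essentially an inverted index: the index maps each keyword to document identifiers (URLs, in this mental model), and the URL is the key used to fetch the cached page from the document store. A toy sketch in Python, with entirely hypothetical data:

```python
from collections import defaultdict

# Toy "page directory": cached page text keyed by URL
page_directory = {
    "https://www.example.com/page1": "xml sitemaps and lastmod dates",
    "https://www.example.com/page2": "product schema markup and prices",
}

# Build the inverted "index": each keyword points to the URLs containing it
index = defaultdict(set)
for url, text in page_directory.items():
    for word in text.split():
        index[word].add(url)

def search(keyword):
    """Look the keyword up in the index, then fetch pages from the directory."""
    return [(url, page_directory[url]) for url in index.get(keyword, set())]

print(search("sitemaps"))  # -> [('https://www.example.com/page1', '...')]
```

On this toy model, changing a page's URL changes its key, so old index entries go stale until the new URL is re-crawled and re-indexed.
-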
Should I import external reviews to my site?
Hi everybody! I manage the website for a financial services company. We have more than 5000 reviews on a user review website, and we have the option to import and display all these reviews on our site. Is this good for SEO? Will Google find it suspicious that our site suddenly displays a lot of new keyword-rich content? What about duplicate content? Please share your thoughts. Thanks!
Technical SEO | Georgios0