Proper sitemap update frequency
-
I have 12 sitemaps submitted to Google. After about a week, Google is about 50% of the way through crawling each one.
In the past week I've created many more pages. Should I wait until Google is 100% complete with my original sitemaps, or can I just go ahead and refresh them? When I refresh, the original files will contain different URLs.
-
You want your sitemap to include all your important URLs. Don't remove URLs from the sitemap just because they have already been crawled.
-
Agreed, I don't see any issue with it. If you have more URLs, submit them.
-
Nah, I don't think so. If they haven't gotten to the URLs yet, it shouldn't affect anything. You could probably change the URLs, change the name of the sitemap, etc., and have it not do anything.
If anything, you would want them to find the new URLs before the first crawl is done, rather than index something that is no longer correct.
-
Thanks David. To clarify, the URLs haven't changed; I've just added more of them.
I am wondering if it will "throw Google off" if I upload all-new sitemaps that have different URLs in them before it's done with the first crawl. I am getting good crawl frequency now and don't want to disrupt it.
Does that make sense or change your answer at all?
Thanks again.
-
If you have URLs that changed, I would resubmit. If Google hasn't found them yet, what difference would it make to submit more that haven't been found yet? When Google does crawl them, it will be crawling the right, updated URL locations.
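For anyone scripting their sitemap refresh, here is a minimal sketch (with hypothetical file names and example URLs, not the asker's actual site) of regenerating sitemap files from a URL list while respecting the 50,000-URLs-per-file limit from the sitemaps.org protocol:

```python
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000  # sitemaps.org limit per sitemap file

def build_sitemaps(urls, prefix="sitemap"):
    """Split a URL list into sitemap XML strings, one per 50k-URL chunk."""
    files = {}
    for i in range(0, len(urls), MAX_URLS_PER_SITEMAP):
        chunk = urls[i:i + MAX_URLS_PER_SITEMAP]
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in chunk
        )
        files[f"{prefix}{i // MAX_URLS_PER_SITEMAP + 1}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n"
        )
    return files

# Hypothetical example: regenerate the files after adding new pages,
# then resubmit them in Webmaster Tools as discussed above.
sitemaps = build_sitemaps([f"https://example.com/page/{n}" for n in range(3)])
```

Rerunning a generator like this on every content change keeps the submitted sitemaps in sync with the site, which is the point of the advice above: what matters is that the files list all important URLs, not where Google is in its current crawl.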
Related Questions
-
410 or 301 after URL update?
Hi there, A site I'm working on at the moment has a thousand "not found" errors in Google Search Console (of course, I'm sure there are thousands more it's not showing us!). The issue is a lot of them seem to come from a URL change. The damage has been done, the URLs have been changed and I can't stop that... but as you can imagine, I'm keen to fix as many as humanly possible. I don't want to go mad with 301s, but for external links in, this seems like the best solution? On the other hand, Google is reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301 it anyway? Or should I 410 and grit my teeth while Google crawls and recrawls it, warning me that this page really doesn't exist? Essentially I guess I'm asking: how many 301s are too many and will they affect our DA? And what's the best solution for dealing with mass 404 errors, many of which aren't linked to from any other pages anymore? Thanks for any insights!
Intermediate & Advanced SEO | Fubra
-
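One common pattern for the 301-vs-410 decision in this question is a redirect map: old URLs with a replacement get a 301 to it, and old URLs that are gone for good get a 410. A framework-agnostic sketch, with hypothetical paths (not the asker's actual URLs):

```python
# Hypothetical old -> new URL map; None means the page is gone for good.
REDIRECT_MAP = {
    "/old/widget-blue": "/products/widget-blue",
    "/old/discontinued-widget": None,
}

def resolve(path):
    """Return (status, location) for a legacy path: 301, 410, or 404."""
    if path not in REDIRECT_MAP:
        return 404, None        # not in the map: an ordinary not-found
    target = REDIRECT_MAP[path]
    if target is None:
        return 410, None        # gone: an explicit signal to drop the URL
    return 301, target          # moved: points crawlers and visitors at the new URL

status, location = resolve("/old/widget-blue")
```

This keeps the policy in one auditable place, so external links in can be 301'd to their new pages while truly dead URLs return 410 instead of lingering as 404s.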
Does anyone know of a Google update in the past few days?
Have seen a fairly substantial drop in Google Search Console. I'm still looking into it and comparing things, but does anyone know if there's been a Google update within the past few days? Or has anyone else noticed anything? Thanks
Intermediate & Advanced SEO | seoman10
-
Is this a good sitemap hierarchy for a big eCommerce site (50k+ pages)?
Hi guys, hope you're all good. I am currently in the process of designing a new sitemap hierarchy to ensure that every page on the site gets indexed and is accessible via Google. It's important that our sitemap file is well structured, divided and organised into relevant sub-categories to improve indexing. I just wanted to make sure that it's all good before forwarding it to the development team for them to consider. At the moment the site has everything thrown into /sitemap.xml and it exceeds the 50k limit. Here is what I have come up with: a primary sitemap.xml referencing other sitemap files, where each of the following areas has its own sitemap referenced by /sitemap.xml. As an example, sitemap.xml will contain 6 links, all of which point to other sitemaps: product pages; blog posts; categories and sub-categories; forum posts, pages, etc.; TV-specific pages (we have a TV show); other pages. Is this format correct? Once it has been implemented I can then go ahead and submit all 6 separate sitemaps to Webmaster Tools and add a sitemap link to the footer of the site. All comments are greatly appreciated. If you know of a site which has a good sitemap architecture, please send the link my way! Brett
Intermediate & Advanced SEO | Brett-S
-
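The structure described in this question, a parent file referencing per-section child sitemaps, is exactly what the sitemap-index format from the sitemaps.org protocol supports. A minimal generator sketch, using hypothetical child file names for the six sections listed:

```python
from xml.sax.saxutils import escape

def build_sitemap_index(base_url, child_files):
    """Build a sitemap index XML string referencing per-section child sitemaps."""
    entries = "\n".join(
        f"  <sitemap><loc>{escape(base_url + f)}</loc></sitemap>"
        for f in child_files
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>\n"
    )

# The six sections from the question, as hypothetical file names:
index = build_sitemap_index(
    "https://example.com/",
    ["sitemap-products.xml", "sitemap-blog.xml", "sitemap-categories.xml",
     "sitemap-forum.xml", "sitemap-tv.xml", "sitemap-other.xml"],
)
```

With an index in place, submitting just the parent file is enough for Google to discover all six children, though the children can also be submitted individually as the question proposes.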
How to properly implement HTTPS?
We are looking at implementing HTTPS for our site. I have done a little research but can't find anything recent; http://moz.com/community/q/duplicate-content-and-http-and-https is the most recent thing I found. Does everything in the answers still apply? Should I just 301 redirect everything to the new HTTPS URLs, or add a canonical tag?
Intermediate & Advanced SEO | EcommerceSite
-
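For the 301 approach in this question, the rule itself is simple: every http:// request should answer with a 301 whose Location points at its exact https:// twin, path and query string included. A framework-agnostic sketch (hypothetical, not tied to any particular server):

```python
def https_redirect(url):
    """Given a requested URL, return (status, headers) for a 301 to https."""
    if url.startswith("http://"):
        target = "https://" + url[len("http://"):]
        return 301, {"Location": target}
    return 200, {}  # already https: serve the page normally

status, headers = https_redirect("http://example.com/page?x=1")
```

In practice this logic usually lives in the web server or load balancer config rather than application code, but the mapping it must implement is the same: preserve everything after the scheme so each old URL redirects to exactly one new one.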
Why extreme drop in number of pages indexed via GWMT sitemaps?
Any tips on why our GWMT sitemap-indexed pages dropped to 27% of total submitted entries (2,290 pages submitted, 622 indexed)? We've already checked the obvious: tested the sitemap, validated URLs, etc. We had typically been at 95% of submitted pages getting indexed.
Intermediate & Advanced SEO | jkinnisch
-
What Wordpress Update Services Should You Be Using on Your Wordpress Blog?
I have been told that pingomatic.com is all that you need; however, yesterday I went to a conference and others were recommending a good list of pinging services to cover all your bases. Here are 4 that have been recommended: pingomatic, technorati, blogsearch.google.com, feedburner. Any others that should be included on this list? My goal is not to spam these ping lists; I just want to make sure my content is getting indexed quickly.
Intermediate & Advanced SEO | webestate
-
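Most of the ping services in this question speak the same weblogUpdates XML-RPC interface, so one loop covers the whole list. A sketch using Python's stdlib client; the endpoint URLs are the commonly cited ones and should be treated as assumptions, and the network call is wrapped so dead endpoints don't abort the run:

```python
import xmlrpc.client

# Commonly cited weblogUpdates ping endpoints (assumed, verify before use).
PING_ENDPOINTS = [
    "http://rpc.pingomatic.com/",
    "http://rpc.technorati.com/rpc/ping",
    "http://blogsearch.google.com/ping/RPC2",
]

def ping_payload(blog_name, blog_url):
    """Arguments for the standard weblogUpdates.ping XML-RPC call."""
    return (blog_name, blog_url)

def ping_all(blog_name, blog_url):
    """Ping every endpoint, collecting each response or the error raised."""
    results = {}
    for endpoint in PING_ENDPOINTS:
        try:
            server = xmlrpc.client.ServerProxy(endpoint)
            results[endpoint] = server.weblogUpdates.ping(
                *ping_payload(blog_name, blog_url)
            )
        except Exception as exc:  # network errors, retired endpoints, etc.
            results[endpoint] = exc
    return results
```

Running `ping_all` on each publish automates the "good list" recommendation from the conference, while keeping the per-endpoint results so retired services can be pruned from the list.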
New server update + wrong robots.txt = lost SERP rankings
Over the weekend, we moved our store to a new server. Before the switch, we had a robots.txt file on the new server that disallowed its contents from being indexed (we didn't want duplicate pages from both old and new servers). When we finally made the switch, we somehow forgot to remove that robots.txt file, so the new pages weren't indexed. We quickly put our good robots.txt in place, and we submitted a request for a re-crawl of the site. The problem is that many of our search rankings have changed. We were ranking #2 for some keywords, and now we're not showing up at all. Is there anything we can do? Google Webmaster Tools says that the next crawl could take up to weeks! Any suggestions will be much appreciated.
Intermediate & Advanced SEO | 9Studios
-
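A cheap safeguard against exactly the mistake in this question is a post-deploy check that the live robots.txt still allows crawling of key pages. Python's stdlib robot parser makes this a few lines; the sketch below parses the rules from a string (with hypothetical URLs) rather than fetching them, so a real check would fetch the deployed file first:

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt, user_agent, urls):
    """Return the subset of urls that this robots.txt blocks for user_agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if not parser.can_fetch(user_agent, u)]

# The staging rules that should never have reached production:
staging_rules = "User-agent: *\nDisallow: /\n"
blocked = blocked_urls(staging_rules, "Googlebot",
                       ["https://example.com/", "https://example.com/products"])
```

Wiring a check like this into the deploy pipeline, and failing the deploy if any key URL shows up as blocked, turns a weeks-long ranking incident into a pre-launch error message.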
Sitemaps. When compressed do you use the .gz file format or the (untidy looking, IMHO) .xml.gz format?
When submitting compressed sitemaps to Google I normally use a file named sitemap.gz. A customer is banging on that his web guy says sitemap.xml.gz is a better format. Google spiders sitemap.gz just fine and in Webmaster Tools everything looks OK... Interested to know other SEOmoz Pros' preferences here, and also to check I haven't made an error that is going to bite me in the ass soon! Over to you.
Intermediate & Advanced SEO | NoisyLittleMonkey
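On the naming point in this question: the .xml.gz convention just makes both layers visible in the filename (XML inside, gzip outside), and as the asker notes, Google spiders a plain sitemap.gz just fine. Producing the compressed file either way is a few lines with the stdlib (hypothetical file name and content):

```python
import gzip

sitemap_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    '  <url><loc>https://example.com/</loc></url>\n'
    '</urlset>\n'
)

# Either name works for the crawler; .xml.gz merely documents the contents.
with gzip.open("sitemap.xml.gz", "wt", encoding="utf-8") as f:
    f.write(sitemap_xml)

# Round-trip check: the compressed file decompresses to the original XML.
with gzip.open("sitemap.xml.gz", "rt", encoding="utf-8") as f:
    restored = f.read()
```

Since the crawler detects gzip from the file contents rather than trusting the extension, the choice between the two names is cosmetic; picking whichever the team finds tidier and using it consistently is the safest answer.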