Proper sitemap update frequency
-
I have 12 sitemaps submitted to Google. After about a week, Google is about 50% of the way through crawling each one.
In the past week I've created many more pages. Should I wait until Google is 100% complete with my original sitemaps, or can I go ahead and refresh them now? When I refresh them, the original files will contain different URLs.
-
You want your sitemap to include all your important URLs. Don't remove them from the sitemap just because they have been crawled.
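For example, a refreshed sitemap would simply carry the already-crawled URLs alongside the new ones; a minimal sketch, with placeholder URLs and dates:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Already crawled, but still important: keep it in the file -->
      <url>
        <loc>http://www.example.com/existing-page/</loc>
        <lastmod>2015-04-01</lastmod>
      </url>
      <!-- New page added this week -->
      <url>
        <loc>http://www.example.com/new-page/</loc>
        <lastmod>2015-05-10</lastmod>
      </url>
    </urlset>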
-
Agreed, I don't see any issue with it. If you have more URLs, submit them.
-
Nah, I don't think so. If they haven't gotten to them yet, it shouldn't affect anything. You could probably change the URLs, change the name of the sitemap, etc., and it wouldn't do anything.
If anything, you would want them to find the new URLs before the first crawl is done, rather than index something that is no longer correct.
-
Thanks, David. To clarify, the URLs haven't changed; I've just added more of them.
I'm wondering if it will "throw Google off" if I upload all-new sitemaps with different URLs in them before it's done with the first crawl. I'm getting good crawl frequency now and don't want to disrupt it.
Does that make sense, or does it change your answer at all?
Thanks again.
-
If you have URLs that changed, I would resubmit. If Google hasn't found them yet, what difference does it make to submit more that haven't been found? When Google does crawl them, it will be crawling the right, updated URL locations.
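If juggling 12 separate files gets unwieldy, one option is a sitemap index: you submit a single index URL once, and from then on only update the lastmod of whichever child sitemap gained URLs. A sketch, with placeholder file names:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-pages-1.xml</loc>
        <!-- Bump this date when the child sitemap gains new URLs -->
        <lastmod>2015-05-10</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-pages-2.xml</loc>
        <lastmod>2015-04-20</lastmod>
      </sitemap>
    </sitemapindex>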
Related Questions
-
410 or 301 after URL update?
Hi there, A site I'm working on at the moment has a thousand "not found" errors in Google Search Console (and I'm sure there are thousands more it's not showing us!). The issue is that a lot of them seem to come from a URL change. The damage has been done, the URLs have been changed, and I can't stop that... but as you can imagine, I'm keen to fix as many as humanly possible. I don't want to go mad with 301s, but for external links in, this seems like the best solution? On the other hand, Google is reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301 it anyway? Or should I 410 and grit my teeth while Google crawls and recrawls it, warning me that this page really doesn't exist? Essentially I guess I'm asking: how many 301s are too many, and will they affect our DA? And what's the best solution for dealing with mass 404 errors, many of which aren't linked to from any other pages anymore? Thanks for any insights 🙂
Intermediate & Advanced SEO | Fubra
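For what it's worth, both responses are cheap to serve once you've decided per URL; a minimal Apache .htaccess sketch (assuming mod_alias, with hypothetical paths):

    # 301 where a replacement page exists (preserves external link equity)
    Redirect 301 /old-page/ http://www.example.com/new-page/

    # 410 where the page is gone for good and has no equivalent
    Redirect gone /retired-page/
-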
In Search Console, why is the XML sitemap "issue" count 5x higher than the URL submission count?
Google Search Console is telling us that there are 5,193 sitemap "issues": URLs that are present in the XML sitemap but blocked by robots.txt. However, there are only 1,222 total URLs submitted in the XML sitemap. I only found 83 instances of URLs that fit their example description. Why is the number of "issues" so high? Does it compound over time as Google re-crawls the sitemap?
Intermediate & Advanced SEO | FPD_NYC
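One way to sanity-check the overlap yourself is to run every submitted URL through a robots.txt parser; a quick Python sketch using only the standard library (URLs here are hypothetical):

    # Report which sitemap URLs robots.txt actually blocks for Googlebot
    from urllib import robotparser
    import urllib.request
    import xml.etree.ElementTree as ET

    rp = robotparser.RobotFileParser("http://www.example.com/robots.txt")
    rp.read()

    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    with urllib.request.urlopen("http://www.example.com/sitemap.xml") as f:
        tree = ET.parse(f)

    blocked = [loc.text for loc in tree.findall(".//sm:loc", ns)
               if not rp.can_fetch("Googlebot", loc.text)]
    print(len(blocked), "of the submitted URLs are blocked by robots.txt")
-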
Site recovery after manual penalty, disavow, SSL, Mobile update - but dropped again in May
I have a site that has had a few problems over the last year. We had a manual penalty in late 2013 for bad links, some from guest blogs and some from spammy sites. Reconsideration requests had me disavow almost all of the incoming links. Later in 2014, the site was hit with link-injection malware and received another manual penalty. That was cleared up and the manual penalty removed in January 2015. During this time the site was moved to SSL, but there were some redirect problems. By February 2015 everything was cleared up and an updated disavow list was uploaded. The site recovered in March and did great. A mobile version was added in April. Around May 1st, rankings dropped again, and traffic is about 40% off its March levels. Recently I read that a new disavow file will supersede an old one, and that if all of the original domains and URLs aren't included in the new disavow file, they will no longer be disavowed. Is this true? If so, is it possible that a smaller disavow file uploaded in February would cause rankings to drop after the May 3 Quality update? Can I correct this by disavowing all the previously disavowed domains and URLs? Any advice for determining why the site is performing poorly again? We have well-written content and regular blogs, nothing that seems like it should violate the Google guidelines.
Intermediate & Advanced SEO | Robertjw
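On the supersede point: yes, each upload fully replaces the previous file rather than adding to it, so the safe habit is to keep one cumulative disavow.txt and only ever append to it. The format, for reference (domains below are made up):

    # Cumulative disavow file: keep every previously disavowed entry,
    # because a new upload replaces the old file entirely
    domain:spammy-links.example.com
    domain:bad-neighborhood.example.net
    http://www.mixed-site.example.org/one-specific-bad-page.html
-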
Should I bother with a Video Sitemap?
Morning all, I've started a pretty aggressive video content push in recent weeks. All our videos are on our YouTube channel. I decided to go with hosting the videos on YouTube based on my research on moz.com, especially considering the potential reach of the content on YouTube. What I'm finding is that the YouTube channel is doing great. We've hit 200 subscribers and 15K views in a little under a month, way more than I could have ever hoped for. But the blog posts on our website are getting minimal traffic and no search visibility. That doesn't necessarily bother me, since the intention of our marketing campaign is to use YouTube to drive traffic to our website. So I guess my question is really more to do with optimizing the site with video sitemaps and best practices for Google Webmaster Tools. Right now we have YouTube videos embedded on blog posts, each at a time-stamped URL. I've also been working on gallery-style pages (no time-stamp) that embed multiple YouTube videos on a single page, which makes it easier for visitors to watch several videos without needing to skip around to multiple blog posts. The challenge I'm running into is that when I go to submit a video sitemap to GWT, I get an error saying that I have duplicate page content within the video sitemap. I've used several WP plugins to do this, and it seems that when a video is embedded on multiple URLs (pages + posts), the plugins will ignore the posts and only add the pages to the video sitemap. I have a regular sitemap alongside the video sitemap, and a screenshot of my current Yoast Video SEO config if that's useful for reference. Does anyone have experience with using multiple sitemaps in GWT? I'm starting to think that maybe I shouldn't even bother with a video sitemap. Maybe those gallery-style pages should just go in the regular sitemap? Any thoughts or advice would be highly appreciated! Thanks
Intermediate & Advanced SEO | TMHoward86
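If you do submit a video sitemap, entries for YouTube-hosted embeds use video:player_loc (the embed URL) rather than video:content_loc; a minimal sketch of one entry, with placeholder URLs and text:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>http://www.example.com/videos/gallery-page/</loc>
        <video:video>
          <video:thumbnail_loc>http://img.youtube.com/vi/VIDEO_ID/0.jpg</video:thumbnail_loc>
          <video:title>Placeholder video title</video:title>
          <video:description>Placeholder description of the video.</video:description>
          <!-- For YouTube embeds, player_loc points at the embed URL -->
          <video:player_loc>http://www.youtube.com/embed/VIDEO_ID</video:player_loc>
        </video:video>
      </url>
    </urlset>
-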
Sitemap Folders in Search Results
Hello! We are managing the SEO campaign of a video website, and we have an issue with sitemap folders. I have sitemaps like /xml/sitemap-name.xml, but Google is indexing my /xml/ folder as well as the sitemaps themselves, and they appear in search results. If I add Disallow: /xml/ to my robots.txt and remove the /xml/ folder from Webmaster Tools, will Google still be able to see my sitemaps, or will it ignore them? Will my site be negatively affected by removing the /xml/ folder completely from search results? What should I do?
Intermediate & Advanced SEO | roipublic
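For reference, the usual fix here is not Disallow (which would stop Google from reading the sitemaps at all) but an X-Robots-Tag noindex header on the XML files, which leaves them crawlable while keeping them out of the results; a sketch assuming Apache with mod_headers:

    # Keep sitemap files fetchable but out of the search results
    <Files ~ "\.xml$">
      Header set X-Robots-Tag "noindex"
    </Files>
-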
What is the practical influence of priority in a sitemap?
I have a directory site with thousands of entries. Is there any benefit to be gained from adjusting the priority values of various entries in the sitemap? I was thinking I might give higher priority to entries that have upgraded their directory listing. Thanks.
Intermediate & Advanced SEO | flow_seo
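Mechanically, priority is just a per-URL tag in the 0.0-1.0 range (0.5 is the default), and search engines treat it as a hint at best; a sketch with placeholder URLs:

    <!-- These <url> entries sit inside the usual <urlset> wrapper -->
    <url>
      <loc>http://www.example.com/directory/upgraded-entry/</loc>
      <priority>0.8</priority>
    </url>
    <url>
      <loc>http://www.example.com/directory/free-entry/</loc>
      <priority>0.3</priority>
    </url>
-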
XML Sitemap instruction in robots.txt - Worth doing?
Hi fellow SEOs, Just a quick one: I was reading a few guides on Bing Webmaster Tools and found that you can use the robots.txt file to point crawlers/bots to your XML sitemap (they don't look for it by default). I was just wondering if it would be worth creating a robots.txt file purely for the purpose of pointing bots to the XML sitemap? I've submitted it manually to Google and Bing Webmaster Tools, but I was thinking more of the other bots (i.e. Mozbot, the SEOmoz bot?). Any thoughts would be appreciated! 🙂 Regards, Ash
Intermediate & Advanced SEO | AshSEO2011
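The directive itself is a single line defined by the sitemaps.org protocol, so any bot that honors the protocol can autodiscover the file; a sketch with a placeholder URL:

    # robots.txt
    User-agent: *
    Disallow:

    Sitemap: http://www.example.com/sitemap.xml
-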
Sitemap in SERPs
What's up guys, Having some trouble with SERP rankings: my sitemap (navigation) is appearing instead of my actual keywords. I have tried a few methods to fix this - setting a preferred domain, using 301 redirects, deleting out-of-date pages via Google Webmaster Tools - but nothing seems to work. My next step was to refresh the cache for my entire site; does anyone know how to do this? I can't see any tools... Any help would be great. Cheers, Jon.
Intermediate & Advanced SEO | jamesjk24