Proper sitemap update frequency
-
I have 12 sitemaps submitted to Google. After about a week, Google is about 50% of the way through crawling each one.
In the past week I've created many more pages. Should I wait until Google is 100% complete with my original sitemaps, or can I just go ahead and refresh them? When I refresh them, the original files will have different URLs.
-
You want your sitemap to include all your important URLs. Don't remove URLs from the sitemap just because they have already been crawled.
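To illustrate the point above, here is a minimal sketch of regenerating a sitemap that keeps every important URL, already-crawled or not. The domain and page list are hypothetical placeholders; a real site would pull them from its CMS or database.

```python
# Minimal sketch: build a sitemap.xml string listing every important URL,
# including pages Google has already crawled. URLs below are illustrative.
import xml.etree.ElementTree as ET
from datetime import date

def build_sitemap(urls):
    """Return sitemap XML for an iterable of (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [
    ("https://www.example.com/", str(date.today())),
    ("https://www.example.com/new-page", str(date.today())),
]
print(build_sitemap(pages))
```

Refreshing the file this way just means rebuilding it from the full, current URL list rather than pruning crawled entries.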
-
Agreed. I don't see any issue with it; if you have more URLs, submit them.
-
Nah, I don't think so. If they haven't gotten to those URLs yet, it shouldn't affect anything. You could probably change the URLs, change the name of the sitemap, etc., and it wouldn't make a difference.
If anything, you would want them to find the new URLs before the first crawl is done, rather than index something that is no longer correct.
-
Thanks, David. To clarify, the URLs haven't changed; I've just added more of them.
I am wondering if it will "throw Google off" if I upload all-new sitemaps with different URLs in them before it's done with the first crawl. I'm getting good crawl frequency now and don't want to disrupt it.
Does that make sense or change your answer at all?
Thanks again.
-
If you have URLs that changed, I would resubmit. If Google hasn't found them yet, what difference would it make to submit more that haven't been found yet? When Google does crawl them, it will be crawling the right, updated URL locations.
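One way to put that advice into practice is to merge the newly added URLs into the existing sitemap file before resubmitting, so crawled and not-yet-crawled pages both stay listed. This is a hedged sketch; the input sitemap string and URL list are assumptions for illustration.

```python
# Hedged sketch: merge new URLs into an existing sitemap so the resubmitted
# file contains both old and newly added pages. Inputs are illustrative.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def merge_urls(sitemap_xml, new_urls):
    """Append any URL not already present; return the updated sitemap XML."""
    ET.register_namespace("", NS)  # serialize without an ns0: prefix
    root = ET.fromstring(sitemap_xml)
    existing = {loc.text for loc in root.iter(f"{{{NS}}}loc")}
    for loc in new_urls:
        if loc not in existing:
            url = ET.SubElement(root, f"{{{NS}}}url")
            ET.SubElement(url, f"{{{NS}}}loc").text = loc
    return ET.tostring(root, encoding="unicode")
```

Deduplicating against the existing `<loc>` entries means resubmitting the same file repeatedly never creates duplicate URL entries.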