How often should I upload a new sitemap in Google Webmaster Tools?
-
So I have a real estate website whose listings, photos, and data change regularly. Every time a new listing is added, it creates a page for that listing.
My question is: how frequently should I be recreating the XML sitemap and uploading it to Google Webmaster Tools?
Thanks in advance.
-
Hi Scott, depending on the volume of new pages being created, you could always use the 'Fetch as Google' facility under the 'Health' tab in your Webmaster Tools account to make sure that your most important new pages get indexed.
Periodically you could add a new sitemap, but submitting the important pages for indexing through your Webmaster Tools account on a daily or perhaps weekly basis could be the way to go.
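Resubmission itself can also be scripted rather than done by hand in the Webmaster Tools interface: Google has supported a sitemap "ping" URL that tells the crawler to re-fetch your sitemap. A minimal sketch (the sitemap location is a hypothetical example, and the endpoint's availability is Google's to change):

```python
import urllib.parse
import urllib.request

GOOGLE_PING = "http://www.google.com/ping?sitemap="

def build_ping_url(sitemap_url):
    # URL-encode the sitemap location so its :/? characters survive
    # intact inside the query string
    return GOOGLE_PING + urllib.parse.quote(sitemap_url, safe="")

def ping_google(sitemap_url):
    # Fires the actual HTTP request; Google re-fetches the sitemap
    # shortly afterwards if the URL is reachable
    with urllib.request.urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status

# Example (hypothetical sitemap location):
# ping_google("http://www.example.com/sitemap.xml")
```

A cron job that regenerates the sitemap and then calls `ping_google` covers the daily-or-weekly cadence Sergio describes without any manual uploading.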
-
Thanks Sergio,
So that leads me to another question: if I upload a new sitemap to Google Webmaster Tools, will it start fresh at zero or pick up where it left off? I noticed that for the last couple of weeks Google has been indexing between 500 and 1,000 pages per day, but over the last few days it seems to have slowed down. Any thoughts?
Scott
-
Hi Scott,
It depends on how fast you want those new pages to be indexed. If you want them indexed as soon as possible, it is a good idea to make a new sitemap, because the next time the bots crawl your site they will find the new pages easily and can index them. If indexing new pages is not urgent for you, then wait.
Obviously, if you create a high number of pages per day, regenerating the sitemap each time you add a page could be hard work.
You have to establish your needs: for example, a new sitemap per day, or a new sitemap per 10 new pages, etc.
If you use a CMS like WordPress, there are useful plugins that let you automate this task.
I hope this answer is helpful.