Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Using Sitemap Generator - Good/Bad?
-
Hi all
I recently purchased the full licence of XML Sitemap Generator (http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html) but have yet to use it.
The idea behind this is that I can deploy the package on each large e-commerce website I build, the sitemap will be generated as often as I set it to be, and the search engines will also be pinged automatically to inform them of the update. No more manual XML sitemap creation for me!
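For context, here is roughly what I picture the scheduled job doing each night. This is only a minimal sketch of the idea, not the actual generator code; the store URL and the build function are placeholders, and the ping endpoints are the ones the engines have documented, so they are worth double-checking rather than taking as gospel:

```python
"""Rough sketch of a nightly regenerate-and-ping job (cron would run it once a day)."""
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://www.example-store.com/sitemap.xml"  # placeholder for the real store URL

# Ping endpoints the search engines have documented for announcing an updated
# sitemap; worth double-checking they are still supported before relying on them.
PING_ENDPOINTS = (
    "https://www.google.com/ping?sitemap={}",
    "https://www.bing.com/ping?sitemap={}",
)


def build_sitemap() -> None:
    """Placeholder for whatever the generator package does (crawl the site / read the product DB)."""


def regenerate_and_ping() -> None:
    build_sitemap()
    for template in PING_ENDPOINTS:
        # A plain GET to the ping URL is all that is needed to notify the engine.
        urlopen(template.format(quote(SITEMAP_URL, safe="")))


if __name__ == "__main__":
    regenerate_and_ping()
```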
Now it sounds great, but I do not know enough about pinging search engines with regular XML sitemap updates to know whether this is a good or a bad thing.
Can it have any detrimental effect when the sitemap is changing (potentially) every day with new URLs for products being added to the site?
Any thoughts or opinions would be greatly appreciated.
Kris
-
It certainly would not have any negative impact on your existing rankings or crawl rate. In fact, the crawl rate would likely improve.
-
Hi Khem
Yes I fully understood your response, thank you.
We always submit a sitemap file for every site we build; however, we never really update the sitemap from then on. We tend to leave the search engines to crawl the site in order to find new pages or detect pages that have been removed. I assume this is normal practice for many other developers out there?
We always set up a 301 redirect for pages that have been unpublished, such as old products and categories.
My main concern was actually whether creating a new sitemap file every day would have any effect on the existing rankings or crawl rate of the site. I guess not!
Kris
-
Well, to answer your question: yes, you should always use an XML sitemap and submit it to the search engines. It helps search engines access all the pages of your website. In fact, you can even tell search engines about your most important and less important pages.
It also enables you to tell search engines about your content update frequency, so that they can re-crawl the pages you update daily or weekly.
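For example, the sitemaps.org protocol carries those hints as <changefreq> and <priority> elements on each URL. Here is a rough sketch using Python's standard library; the URLs and values are purely illustrative:

```python
# Build a tiny sitemap with per-URL change-frequency and priority hints.
import xml.etree.ElementTree as ET

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

pages = [
    # (location, changefreq, priority): product pages change often, policy pages rarely.
    ("https://www.example-store.com/", "daily", "1.0"),
    ("https://www.example-store.com/products/widget-42", "daily", "0.8"),
    ("https://www.example-store.com/terms", "yearly", "0.2"),
]

for loc, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = changefreq
    ET.SubElement(url, "priority").text = priority

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Whether the search engines actually honour these hints is entirely up to them, but including them costs nothing.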
Furthermore, there is no problem with updating the XML file daily as long as you're not removing pages. However, if you do need to remove pages, keep them in the sitemap for at least one week and redirect the old pages to new ones.
Hope I understood your question and answered it properly.
Related Questions
-
Sudden Indexation of "Index of /wp-content/uploads/"
Hi all, I have suddenly noticed a massive jump in indexed pages. After performing a "site:" search, it was revealed that the sudden jump was due to the indexation of many pages with the SERP title "Index of /wp-content/uploads/" for many uploaded pieces of content and plugins. This appeared approximately one month after switching to https. I have also noticed a decline in Bing rankings. Does anyone know what is causing this or how to fix it? To be clear, these pages are **not** normal /wp-content/uploads/ pages, but rather "Index of" directory listings being included in Google. Thank you.
Technical SEO | | Tom3_150 -
Resubmit sitemaps on every change?
Hello Mozers, Our sitemaps were submitted to Google and Bing, and are successfully indexed. Every time pages are added to our store (ecommerce), we re-generate the XML sitemap. My question is: should we be resubmitting the sitemaps every time their content changes, or since they were submitted once, can we assume that the crawlers will re-download the sitemaps by themselves (I don't like to assume)? What are best practices here? Thanks!
Technical SEO | | yacpro131 -
Sitemap indexed pages dropping
About a month ago I noticed that the pages indexed from my sitemap are dropping. There are 134 pages in my sitemap and only 11 are indexed. It used to be 117 pages and just died off quickly. I still seem to be getting consistent search traffic, but I'm just not sure what's causing this. There are no warnings or manual actions required in GWT that I can find.
Technical SEO | | zenstorageunits0 -
Website credits for designers - good or bad
Hi, My core service is web design and development. I often place a credit on my clients' websites pointing back to my web design or web development pages. Is this a wise practice with the Penguin and Panda updates? Would this also pull my rankings down?
Technical SEO | | Cocoonfxmedia0 -
Can you have a /sitemap.xml and /sitemap.html on the same site?
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community! My question: since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain? For example, we've already put the HTML sitemap in place here: https://www.pioneermilitaryloans.com/sitemap Now, we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but am wondering if this will cause conflicts. I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this. What do you think?
Technical SEO | | PioneerServices0 -
Host sitemaps on S3?
Hey guys, I run a dynamic web service and I will start building static sitemaps for it pretty soon. The fact that my app lives on a multitude of servers doesn't make it easy to distribute frequently updated static files across the servers. My idea was to host the files in AWS S3 and point my robots.txt sitemap directive there. I'll use a sitemap index, so every other sitemap will be hosted on S3 as well. I could dynamically mirror the content from the files in S3 through my app, but that would be a little more resource intensive than just serving the static files from a common place. Any ideas? Thanks!
Technical SEO | | tanlup0 -
How does Google find /feed/ at the end of all pages on my site?
Hi! In Google Webmaster Tools I find *.../feed/ as a 404 page in crawl errors. The problem is that none of these pages exist and they have no inbound links (except the start page). FYI, it's a WordPress site. Example: www.mysite.com/subpage1/feed/ www.mysite.com/subpage2/feed/ www.mysite.com/subpage3/feed/ etc. Does Google search for /feed/ by default, or why do I keep getting these 404s every day?
Technical SEO | | Vivamedia0 -
Best XML Sitemap generator
Do you guys have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying. I am using a Mac, so I would prefer an online or Mac version.
Technical SEO | | kevin48030