Using Sitemap Generator - Good/Bad?
-
Hi all
I recently purchased the full licence of XML Sitemap Generator (http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html) but have yet to use it.
The idea behind this is that I can deploy the package on each large e-commerce website I build; the sitemap will be regenerated as often as I set it to be, and the search engines will be pinged automatically to inform them of each update. No more manual XML sitemap creation for me!
It sounds great, but I don't know enough about pinging search engines with XML sitemap updates on a regular basis to judge whether this is a good or a bad thing.
Can it have any detrimental effect when the sitemap is changing (potentially) every day as new product URLs are added to the site?
Any thoughts or opinions would be greatly appreciated.
Kris
-
It would certainly not have any negative impact on the existing rankings or crawl rate. In fact, the crawl rate would likely improve.
-
Hi Khem
Yes, I fully understood your response, thank you.
We do always submit a sitemap file for every site we build; however, we never really update the sitemap from then on. We tend to leave the search engines to crawl the site to find new pages or to detect pages that have been removed. I assume this is normal practice for many other developers out there?
We always set up a 301 redirect for pages that have been unpublished, such as old products and categories.
My main concern was actually about creating a new sitemap file every day and whether this would have any effect on the site's existing rankings or crawl rate. I guess not!
Kris
-
Well, to answer your question: yes, you should always use an XML sitemap and submit it to the search engines. It helps them access all the pages of your website. In fact, you can even tell search engines which of your pages are most important and which are less important.
It also lets you tell search engines how frequently your content is updated, so that they re-crawl the pages you update daily or weekly.
Furthermore, there is no problem with updating the XML file daily as long as you're not removing pages. However, if you do need to remove pages, keep them in the sitemap for at least a week and redirect the old pages to the new ones.
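If you're curious what a tool like this is doing under the hood, here is a rough sketch of a daily regenerate-and-ping job. This is only an illustration, not the actual product's code: the store domain and URLs are made up, the URL list would really come from your product database, and the ping endpoints are the ones Google and Bing have historically exposed for sitemap submission (Google has since retired its ping endpoint, so treat that line as historical).

```java
import java.io.IOException;
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.LocalDate;
import java.util.List;

public class SitemapRefresh {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Hypothetical product URLs; a real job would pull these from the store's database.
        List<String> urls = List.of(
                "https://www.example-store.com/products/widget-a",
                "https://www.example-store.com/products/widget-b");

        // Build the sitemap: one <url> entry per page, stamped with today's date.
        StringBuilder xml = new StringBuilder()
                .append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n")
                .append("<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n");
        String today = LocalDate.now().toString();
        for (String url : urls) {
            xml.append("  <url>\n")
               .append("    <loc>").append(url).append("</loc>\n")
               .append("    <lastmod>").append(today).append("</lastmod>\n")
               .append("    <changefreq>daily</changefreq>\n")
               .append("  </url>\n");
        }
        xml.append("</urlset>\n");
        Files.writeString(Path.of("sitemap.xml"), xml, StandardCharsets.UTF_8);

        // Ping the engines with the public URL of the refreshed sitemap.
        String sitemapUrl = URLEncoder.encode(
                "https://www.example-store.com/sitemap.xml", StandardCharsets.UTF_8);
        HttpClient client = HttpClient.newHttpClient();
        for (String ping : List.of(
                "https://www.google.com/ping?sitemap=" + sitemapUrl,
                "https://www.bing.com/ping?sitemap=" + sitemapUrl)) {
            HttpResponse<Void> resp = client.send(
                    HttpRequest.newBuilder(URI.create(ping)).GET().build(),
                    HttpResponse.BodyHandlers.discarding());
            System.out.println(ping + " -> HTTP " + resp.statusCode());
        }
    }
}
```

Run on a schedule (cron, or the generator's own scheduler), that's the whole loop: rebuild, write, ping. Since you're only ever adding URLs and refreshing lastmod dates, there's nothing here a search engine would object to.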
Hope I understood your question and answered it properly.
Related Questions
-
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. I highly value your expertise in this field and would greatly appreciate your insights.
Technical SEO | williamhuynh
Below are the specific areas I would like to discuss:
a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, I would like to seek your advice on whether it would be advisable to disallow or noindex these pages. My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content. I would greatly appreciate your guidance on this matter.
b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, I would like to understand whether it is still recommended to disallow or noindex them to further conserve crawl budget. My understanding is that by doing so, search engines can prevent the unnecessary expenditure of resources on indexing redundant variations of the same content. I would be grateful for your expert opinion on this matter.
Additionally, I would be delighted if you could provide any suggestions regarding internal linking strategies tailored to my website's structure and content. Any insights or recommendations you can offer would be highly valuable to me.
Thank you in advance for your time and expertise in addressing these concerns. I genuinely appreciate your assistance. If you require any further information or clarification, please let me know. I look forward to hearing from you. Cheers!
-
When writing URLs, is it better to use my country's language or English?
We speak Persian, and everyone here searches Google in Persian. But I have read in some sources that the URL should be in English. Please tell me which language to use when writing URLs.
Technical SEO | ghesta
For example, here are the two versions:
1. https://ghesta.ir/blog/how-to-become-rich/
2. https://ghesta.ir/blog/چگونه-پولدار-شویم/
-
2 sitemaps in my robots.txt?
Hi, I thought I could link just one sitemap from my site's robots.txt, but... I may be wrong. So I need to confirm whether this kind of implementation is right or wrong: robots.txt for Magento Community and Enterprise ...
Technical SEO | Webicultors
Sitemap: http://www.mysite.es/media/sitemap/es.xml
Sitemap: http://www.mysite.pt/media/sitemap/pt.xml
Thanks in advance,
-
<sub> and <sup> tags, any SEO issues?
Hi - the content on our corporate website is pretty technical, and we include chemical element codes in the text that users would search on (like SO2, CO2, etc.). A lot of the time our engineers request that we write the codes correctly, with a <sub> on the last number. Question: does adding this markup to the keyword affect SEO? The code would look like SO<sub>2</sub>. Thanks.
Technical SEO | Jenny1
-
WordPress - How to stop both http:// and https:// pages being indexed?
Just published a static page two days ago on a WordPress site, but noticed that Google has indexed both the http:// and https:// URLs. Usually I only get http:// indexed, though. Could anyone please explain why this may have happened and how I can fix it? Thanks!
Technical SEO | Clicksjim
-
Generating a signature and expires in java
Hello, I am developing a tool for my company to get stats from SeoMoz using your API. During development, I have been using the example signature and expires values, which are auto-generated for me. Now that testing is complete, my code will need to generate these values itself. I have been googling for a resource demonstrating how to do this in Java, but I have not found a good example. I was hoping that someone at SeoMoz would have a resource or an example they could share. The email associated with this account belongs to a non-developer, so if a response is provided via email in addition to the forum, sending it to my email would be much appreciated. Thank you, Anthony aruffino@ignitemedia.com
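In case it helps anyone who lands here: below is a minimal sketch of the usual pattern for this kind of signed request - an expires timestamp a few minutes in the future, plus a base64-encoded HMAC-SHA1 over the access ID and that timestamp. The string-to-sign shown here (access ID and expiry joined by a newline) is an assumption to verify against the API documentation, and the credentials are placeholders.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ApiSignature {
    public static void main(String[] args) throws Exception {
        String accessId = "YOUR_ACCESS_ID";   // placeholder credential
        String secretKey = "YOUR_SECRET_KEY"; // placeholder credential

        // "Expires" is a Unix timestamp a few minutes in the future;
        // the API rejects the request once this time has passed.
        long expires = System.currentTimeMillis() / 1000L + 300;

        // Assumed string-to-sign: access ID and expiry joined by a newline.
        // Check the API docs for the exact format before relying on this.
        String toSign = accessId + "\n" + expires;

        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(secretKey.getBytes(StandardCharsets.UTF_8), "HmacSHA1"));
        byte[] rawHmac = mac.doFinal(toSign.getBytes(StandardCharsets.UTF_8));

        // Base64-encode the HMAC, then URL-encode it for use as a query parameter.
        String signature = URLEncoder.encode(
                Base64.getEncoder().encodeToString(rawHmac), StandardCharsets.UTF_8);

        System.out.println("Expires:   " + expires);
        System.out.println("Signature: " + signature);
    }
}
```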
Technical SEO | TRich50
-
How to generate a visual sitemap using sitemap.xml
Are there any tools (online, preferably) which will take a sitemap.xml file and generate a visual sitemap? Seems like an obvious thing to do, but I can't find any simple tools for this.
Technical SEO | k3nn3dy3
-
How to handle a sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There are not, in fact, this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems inefficient to me to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Canonical links? Should I manually remove the pages from the sitemap? Should I continue as is? Thanks a ton for any input you have!
Technical SEO | 5225Marketing