Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Best Dynamic Sitemap Generator
-
Hello Mozers,
Could you please share the best dynamic sitemap generator you are using? I have found this one: http://www.seotools.kreationstudio.com/xml-sitemap-generator/free_dynamic_xml_sitemap_generator.php
Thanks in advance for your help.
-
I use DYNO Mapper exclusively. It is the best dynamic sitemap generator I have used so far. I have used PowerMapper and Slickplan in the past, but I like DYNO Mapper because it has some useful extra features, like content audit capabilities and an inventory display. It also integrates with Google Analytics and displays that data on each page of the sitemap.
-
Inquiring minds want to know: so how did it go?
-
Thank you very much, I appreciate your help. I'm trying gsitecrawler.com; let's see how it goes.
-
I use the free version of xml-sitemaps.com and it works really well - would recommend it.
-
I use the Microsoft SEO Toolkit. It's not so much dynamic, but you can just click the pages you want included. I think this is better because, as Duane Forrester said on Whiteboard Friday, you should not include every page in your sitemap, just the main ones. If you list every page, it is likely your sitemap will be ignored in future.
I asked Duane about this at Bing, and he suggested that this is only true for large sites; if you have a small site of 10-20 pages, then listing every page is not a problem. All the same, I only list useful pages - I don't list the contact page, policy pages, and the like.
-
A1 Sitemap Generator 3 is good because it can generate web sitemaps, mobile sitemaps, RSS sitemaps, etc.
-
I use http://gsitecrawler.com/, which is awesome and can be downloaded for free. It does the job for me and doesn't seem to have any limitations, such as a cap on the number of crawled pages.
-
I use xml-sitemaps.com if the website has proper navigation and fewer than 500 pages. It works pretty well, and you can download the sitemap afterwards in XML, HTML, or TXT format.
Related Questions
-
If I'm using a compressed sitemap (sitemap.xml.gz) that's the URL that gets submitted to webmaster tools, correct?
I just want to verify: if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc. and the URL used in robots.txt should be the one that indicates it's a compressed file, e.g. "sitemap.xml.gz" - thanks!
Technical SEO | jgresalfi
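For reference, the sitemaps.org protocol expects the sitemap to be referenced by the full URL of the file exactly as it is served, so a compressed sitemap would appear in robots.txt like this (example.com is a placeholder):

    Sitemap: https://www.example.com/sitemap.xml.gz

The same full .gz URL is what you would submit in the search engines' webmaster tools.
-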
Does Google read dynamic canonical tags?
Does Google recognize a rel=canonical tag if it is loaded dynamically via JavaScript? Here's what we're using to load it:

    <script>
    // Inject a canonical link into the page head.
    // Start from the current URL; a link is only appended if a replacement happened.
    var canonicalLink = window.location.href;
    if (window.location.href.indexOf("/kapiolani") != -1) {
      canonicalLink = window.location.href.replace("/kapiolani", "");
    }
    if (window.location.href.indexOf("/straub") != -1) {
      canonicalLink = window.location.href.replace("/straub", "");
    }
    if (window.location.href.indexOf("/pali-momi") != -1) {
      canonicalLink = window.location.href.replace("/pali-momi", "");
    }
    if (window.location.href.indexOf("/wilcox") != -1) {
      canonicalLink = window.location.href.replace("/wilcox", "");
    }
    if (canonicalLink != window.location.href) {
      var link = document.createElement('link');
      link.rel = 'canonical';
      link.href = canonicalLink;
      document.head.appendChild(link);
    }
    </script>
Technical SEO | SoulSurfer8
-
Remove sitemap - effect on ranking?
We are considering removing our sitemap because it doesn't display the right structure. Will it affect current rankings if we remove the sitemap and continue without one? Thanks
Technical SEO | rijwielcashencarry040
-
Redirecting old Sitemaps to a new XML
I've discovered a ton of 404s from Google's WMT crawler looking for mydomain.com/sitemap_archive_MONTH_YEAR. There are tons of these monthly archive XMLs. I've used a plugin that, for some reason, created individual monthly archive XML sitemaps, and now I get 404s. Creating rules for each archive seems like a bad solution. My current sitemap plugin creates a single clean one: mydomain.com/sitemap_index.xml. How can I create a rule in the Redirection WP plugin that will redirect any URL containing both the 'sitemap' and 'xml' strings to my current XML sitemap? I've tried wildcards like mysite.com/sitemap*.*, mysite.com/sitemap., mysite.com/sitemap(.), and mysite.com/sitemap (.), but none of them got the general redirect to work. Is there a way to make this happen with the WP Redirection plugin? If not, is there an htaccess rule, and what would the code be for it? I'm not very fluent with general redirects in htaccess, unfortunately. Thanks!
Technical SEO | IgorMateski
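One possible approach, sketched here rather than tested against this setup: the Redirection plugin supports regex sources, so a source pattern like /sitemap_archive_.*\.xml (with the regex option enabled) pointing to /sitemap_index.xml should catch all the monthly archives. The .htaccess equivalent, assuming the archive URLs live at the domain root, would be:

    # Send any old /sitemap_archive_*.xml request to the current sitemap index
    <IfModule mod_rewrite.c>
      RewriteEngine On
      RewriteRule ^sitemap_archive_.*\.xml$ /sitemap_index.xml [R=301,L]
    </IfModule>

The [R=301,L] flags issue a permanent redirect and stop further rule processing.
-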
Will an XML sitemap override a robots.txt?
I have a client with a robots.txt file that is blocking an entire subdomain, entirely by accident. Their original solution, not realizing the robots.txt error, was to submit an XML sitemap to get their pages indexed. I did not think this tactic would work, as the robots.txt would take precedence over the XML sitemap. But it worked... I have no explanation as to how or why. Does anyone have an answer to this, or any experience with a website that has had a clear Disallow: / for months, that somehow has pages in the index?
Technical SEO | KCBackofen
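For reference, the kind of blanket block being described looks like this in robots.txt:

    User-agent: *
    Disallow: /

Note that robots.txt controls crawling, not indexing: search engines can still index a blocked URL they discover through links or a submitted sitemap, typically as a URL-only entry with no cached content.
-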
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi Guys, I'm working on an ecommerce website with all the product pages using dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are the 2 approaches I was considering:
1. Just include the products folder, http://www.xyz.com/products/, within the same sitemap as the static URLs - this way spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product, OR
2. Create a separate automated sitemap that updates whenever a product is updated, and set the change frequency to hourly - this way spiders always have a close-to-up-to-date sitemap when they crawl it (a sketch of this approach follows below).
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
Technical SEO | seekjobs
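For option 2, here is a minimal sketch of what the automated sitemap generation could look like. This is illustrative only: it assumes a Node.js environment, and the product list and output filename are hypothetical placeholders for whatever your stack provides.

    // Sketch: regenerate a product sitemap from the current product list.
    // Run this on a schedule, or whenever a product is added or removed.
    const fs = require('fs');

    function buildSitemap(urls) {
      const entries = urls.map(function (url) {
        return '  <url>\n' +
               '    <loc>' + url + '</loc>\n' +
               '    <changefreq>hourly</changefreq>\n' +
               '  </url>';
      }).join('\n');
      return '<?xml version="1.0" encoding="UTF-8"?>\n' +
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
             entries + '\n' +
             '</urlset>';
    }

    // Placeholder data - in practice this would come from the product database.
    const productUrls = [
      'http://www.xyz.com/products/product1-is-really-cool',
      'http://www.xyz.com/products/product2-is-even-cooler',
      'http://www.xyz.com/products/product3-is-the-coolest'
    ];
    fs.writeFileSync('sitemap-products.xml', buildSitemap(productUrls));

Expired product URLs simply drop out of the list on the next rebuild, which should cut down the 404 reports over time.
-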
Does it hurt to have a dynamic counter in your page title?
Currently we use page titles that display the number of products we have as a counter. This number is highly volatile and can change every day, so our page title changes all the time. We did this to improve user experience, meet expectations, and improve click-through rates. The question is whether this can hurt our rankings, and whether anyone has experimented with this or has experience with it?
Technical SEO | ElmarReizen
-
Robots.txt Sitemap with Relative Path
Hi Everyone, In robots.txt, can the sitemap be indicated with a relative path? I'm trying to roll out a robots file to ~200 websites, and they all have the same relative path for a sitemap but each is hosted on its own domain. Basically I'm trying to avoid needing to create 200 different robots.txt files just to change the domain. If I do need to do that, though, is there an easier way than just trudging through it?
Technical SEO | MRCSearch
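For what it's worth, the sitemaps.org protocol defines the Sitemap: directive as the full URL of the sitemap, so a relative path isn't reliably supported. One way to avoid maintaining 200 files is to serve robots.txt dynamically and fill in the requesting host. A minimal sketch, assuming a Node.js server is an option on these sites (the handler and port are placeholders):

    // Sketch: one dynamic robots.txt shared across many domains.
    const http = require('http');

    http.createServer(function (req, res) {
      if (req.url === '/robots.txt') {
        // The Host header tells us which of the ~200 domains was requested.
        const host = req.headers.host;
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        res.end('User-agent: *\n' +
                'Allow: /\n\n' +
                'Sitemap: https://' + host + '/sitemap.xml\n');
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);

If the sites share a build pipeline instead, the same idea works as a templating step that stamps each domain into its generated robots.txt.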