Best server-side XML sitemap generator?
-
I have tried xml-sitemaps, which tends to crash when spidering my site(s) and requires multiple manual resumes, which isn't practical for our businesses. Please let me know if there are any other server-side generators that could be a good fit for multiple enterprise-sized websites. Image sitemap support would also be helpful.
A generator that supports multiple starting URLs would also help with spidering/indexing the most important sections of our sites.
Also, has anyone heard of or used Dyno Mapper? It looks like a good solution for us, but I was wondering if anyone has experience with this product.
-
Our company uses a custom CMS for each of our niche classified websites. Ideally, we would like an online/cloud-based platform that would generate our sitemaps for us. Dyno Mapper looks like a good contender, but the $108 monthly fee seems a bit steep.
-
What technology stack is running the websites? Most CMS-based systems have some method of building XML sitemaps from the database, which is vastly superior to building them from a crawl.
P.
Related Questions
-
Best Practice Approaches to Canonicals vs. Indexing in Google Sitemap vs. No Follow Tags
Hi there, I am working on the following website: https://wave.com.au/ I have become aware that there are different pages competing for the same keywords. For example, I just started to update a core category page, Anaesthetics (https://wave.com.au/job-specialties/anaesthetics/), to focus mainly on the keyword 'Anaesthetist Jobs'. But I have recognized that there are ongoing landing pages that contain pretty similar content: https://wave.com.au/anaesthetists/ and https://wave.com.au/asa/

We want to direct organic traffic to our core pages, e.g. https://wave.com.au/job-specialties/anaesthetics/. This leads me to deal with the duplicate pages with either a canonical link (content manageable), or maybe alternatively adding a nofollow tag or updating the robots.txt. Our resident developer also suggested that it might be good to use the sitemap to tell Google that these pages are of less value.

What is the best approach? Should I add a canonical link to the landing pages pointing to the category page? Or alternatively, should I use the sitemap? Or even another approach? Any advice would be greatly appreciated. Thanks!
Intermediate & Advanced SEO | Wavelength_International
What internal link structure is best for a website?
I want to build a better internal link structure for my website. Can you advise me on a model architecture for the menu, navigation, links, and hub content that will work well for both the audience and search engines? Thanks for your advice.
Intermediate & Advanced SEO | dunghv36
URL Rewriting Best Practices
Hey Moz! I'm getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically, I want to:

- Improve our website structure by removing redundant directories.
- Replace underscores with dashes and remove file extensions from our URLs.

Please see my example below:

Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget

I've read several URL rewriting guides online, all of which provide similar but ultimately different methods, so I'm looking for the accepted best practices for implementing these rewrites. From what I understand, the most common method is to add rewrites to our .htaccess file using mod_rewrite (which will match the old URLs and rewrite them according to the rules I implement).

One question I can't seem to find a definitive answer to: when I implement the rewrite to remove file extensions and replace underscores with dashes, do the webpage file names need to be edited to the new format? From what I understand, the file names must remain the same for the .htaccess rewrites to work, while our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this?

Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using these rewrites, are my bases covered in regards to having the proper 301 redirects in place so our rankings aren't affected negatively? Please offer any advice/reliable guides to handle this properly. Thanks in advance!
Intermediate & Advanced SEO | TheDude
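The question above can be sketched with a minimal .htaccess fragment, assuming Apache with mod_rewrite enabled. This is illustrative, not a drop-in config: the directory name and URL shapes come from the widgets example, and rule order matters. It also illustrates the answer to the file-name question: an internal rewrite (the last rule) maps extensionless URLs back to the existing .htm files, so the files on disk keep their names.

```apache
RewriteEngine On

# 1. Replace underscores with dashes. This fires once per underscore,
#    converging over repeated external 301 redirects.
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]

# 2. Drop the redundant /widgets/ directory (hypothetical name from
#    the example above).
RewriteRule ^widgets/(.*)$ /$1 [R=301,L]

# 3. Redirect requests for .htm URLs to the extensionless form...
RewriteRule ^(.*)\.htm$ /$1 [R=301,L]

# 4. ...then internally (no redirect, no [R]) map the extensionless
#    URL back to the real .htm file on disk, so pages need no renaming.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.htm -f
RewriteRule ^(.*)$ $1.htm [L]
```

Since every external rule uses R=301, the old URLs pass equity to the new ones; internal links and canonicals should still be updated to the new format so crawlers aren't bounced through redirects.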
Proper naming convention when updating sitemaps
I have a semi-big site (500K pages) with lots of new pages being created. I also have a process that updates my sitemap with all of these pages automatically. I have 11 sitemap files and a sitemap index file. When I update my sitemaps and submit them to Google, should I keep the same names?
Intermediate & Advanced SEO | jcgoodrich
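On the naming question above, keeping stable sitemap file names and only bumping `<lastmod>` in the index is the usual pattern. A minimal sketch, assuming numbered file names like `sitemap-1.xml` (the names and URL are illustrative, not from the original post):

```python
from datetime import date

def build_sitemap_index(base_url, n_files, lastmod=None):
    """Emit a sitemap index that reuses stable, numbered file names
    (sitemap-1.xml ... sitemap-N.xml are assumed names). Keeping the
    names constant between updates means search engines re-fetch the
    same known URLs instead of discovering new ones each time."""
    lastmod = lastmod or date.today().isoformat()
    entries = [
        f"  <sitemap>\n    <loc>{base_url}sitemap-{i}.xml</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n  </sitemap>"
        for i in range(1, n_files + 1)
    ]
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(entries)
            + "\n</sitemapindex>")
```

With stable names, resubmitting to Google Search Console after each update is unnecessary; updating the index's `<lastmod>` values is enough.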
Best internal linking structure?
We are considering implementing a site-wide contextual linking structure. Does anyone have some good guidelines / blog posts on this topic? Our site is quite large (over 1 million pages), so the contextual linking would be automated, but we need to define a set of rules. Basically, if we have a great page on 'healthy recipes,' should we make every instance of the phrase 'healthy recipes' link back to that page, or should we limit it to a certain number of pages?
Intermediate & Advanced SEO | nicole.healthline
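One common rule set for the automation described above is to link only the first occurrence of each phrase and cap links per page. A hedged sketch (the function name, keyword map, and regex-based replacement are all illustrative; a real implementation should parse the HTML rather than regex-replace raw markup):

```python
import re

def add_contextual_links(text, keyword_map, max_links=3):
    """Link only the FIRST occurrence of each keyword phrase, up to
    max_links per page -- one way to implement the 'limit it' rule
    rather than linking every instance. keyword_map maps phrases to
    target URLs. Operates on plain text for simplicity; running this
    over raw HTML risks mangling existing tags."""
    links_added = 0
    for phrase, url in keyword_map.items():
        if links_added >= max_links:
            break
        pattern = re.compile(re.escape(phrase), re.IGNORECASE)
        replacement = f'<a href="{url}">{phrase}</a>'
        new_text, n = pattern.subn(replacement, text, count=1)
        if n:
            text = new_text
            links_added += 1
    return text
```

Capping repeated anchors avoids the over-optimized look of every instance of a phrase linking to the same hub page.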
How do I stop pages being crawled from an XML feed?
We have a site with an XML feed going out to many other sites. The feed is behind a password-protected page, so we cannot use a canonical link pointing back to the original URL. How do we stop the pages being crawled on all of the sites using the XML feed? With hundreds of sites using it after launch, it will cause instant duplicate content issues. Thanks
Intermediate & Advanced SEO | jazavide
How do I create an XML sitemap?
It appears that the free online tools limit the number of URLs they'll include. What tools have you had success with?
Intermediate & Advanced SEO | NaHoku
Best practice for changing the URLs of all my site pages
Hi, I need to change all of my site's page URLs as a result of moving the site to another CMS platform that has its own URL structure. Some context:

- The site is currently highly ranked for all relevant keywords I am targeting.
- All pages have backlinks.
- Content and meta data should remain exactly the same.
- The domain should stay the same.

The plan is as follows:

- Set up the new site using a temporary domain name.
- Copy over all content and meta data.
- Set up all redirects (301).
- Update the domain name and point the live domain to the new site.
- Watch closely for 404 errors and add any missing redirects.

Questions: Any comments on the plan? Is there a way (the above plan or any other) to make sure rankings will not be hurt? What entries should I add to the sitemap.xml: new pages only, or new pages plus the pages from the old site?

Thanks, Guy.
Intermediate & Advanced SEO | jid1
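The 301 step in the migration plan above is usually driven by an old-to-new URL mapping exported from both CMSs. A minimal sketch of turning such a mapping into Apache `Redirect` rules (the function and the example paths are hypothetical):

```python
def build_redirects(url_map):
    """Generate one 301 rule per changed URL from an old->new path
    mapping. Apache 'Redirect' syntax is shown; the mapping itself
    would come from exporting slugs out of both CMS platforms."""
    lines = []
    for old, new in sorted(url_map.items()):
        if old == new:
            continue  # unchanged URLs need no rule
        lines.append(f"Redirect 301 {old} {new}")
    return "\n".join(lines)
```

Generating the rules from data rather than writing them by hand makes the 404-watching step easier too: any 404 found after launch is just a missing row in the mapping.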