Dynamic XML Sitemap Generator
-
Has anyone used a Dynamic XML Sitemap Generator tool? Looking for recommendations!
-
Are you running a CMS? If so, maybe it has one built in or available as a plugin. For example, the Yoast SEO plugin on WordPress will create a sitemap, has plenty of inclusion and exclusion rules (including a per-page override), and helps you do a bunch of other stuff too. I'm sure most other CMSs have something available by now.
If it's a custom-built dynamic site, maybe your development team (or developer) can add sitemap functionality. If you have a lot of pages backed by a custom database, the best way to make sure your sitemap is up to date is to generate it from your data, not to let some third-party tool follow links around your site - Google can do that already!
If you are not running a CMS or a custom app, how many pages do you have? It might be easiest to write the XML sitemap by hand (although I'm a developer, so that's easy for me to say), or try out Logan's recommendation.
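To illustrate the "generate it from your data" approach, here's a minimal Python sketch. The page rows are hypothetical stand-ins; in a real build they would come from a query against your own pages table:

```python
from xml.sax.saxutils import escape

# Hypothetical rows, standing in for a SELECT over your own database.
pages = [
    {"url": "https://www.example.com/", "lastmod": "2017-03-01"},
    {"url": "https://www.example.com/products/widget-a", "lastmod": "2017-02-14"},
]

def build_sitemap(pages):
    """Render database rows as a sitemaps.org-format urlset."""
    out = ['<?xml version="1.0" encoding="UTF-8"?>',
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for p in pages:
        out.append("  <url>")
        out.append(f"    <loc>{escape(p['url'])}</loc>")
        if p.get("lastmod"):
            out.append(f"    <lastmod>{p['lastmod']}</lastmod>")
        out.append("  </url>")
    out.append("</urlset>")
    return "\n".join(out)

print(build_sitemap(pages))
```

Because the sitemap is rebuilt straight from the database, it stays in sync with your pages without any third-party crawl.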
-
Hi Chris,
Screaming Frog has been my go-to XML sitemap generator for years. Plenty of customization options for exclusions and inclusions.
Related Questions
-
Google News Sitemap creating service
Hi All, I am dealing with a Google News sitemap. My technical guy doesn't know how to create one for Google News. Do you know which service or company can help me with this? Thanks a lot!
Intermediate & Advanced SEO | | binhlai0 -
Multilingual Sitemaps
Hey there, I have a site with many languages, so here are my questions concerning the sitemaps. The correct way of creating a sitemap for a multilingual site, according to the official Google blog, is as follows:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/"/>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/de"/>
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.example.com/fr"/>
  </url>
</urlset>
So here is my first question: my site has over 200,000 pages, and each of them supports around 5-6 languages. Am I supposed to repeat this example 200,000 times? My second question: my root domain is www.example.com, but it redirects with a 301 to www.example.com/en. Should the sitemap be at www.example.com/sitemap.xml or www.example.com/en/sitemap.xml? My third question: on WMT, do I submit my sitemap for all versions of my site? I have all my languages there. Thanks in advance for taking the time to respond to this thread; by creating it I hope many people will solve their own questions.
Intermediate & Advanced SEO | | Angelos_Savvaidis0 -
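On the first question in the thread above: nobody writes 200,000 entries by hand; the per-URL alternate blocks are generated. A rough Python sketch, assuming a hypothetical URL scheme where English lives at the root and other languages under /de, /fr, and so on:

```python
from xml.sax.saxutils import escape

LANGS = ("en", "de", "fr")  # assumed language codes for the example

def localized_url(base_url, lang):
    # Hypothetical scheme: English at the root, other languages in subfolders.
    return base_url if lang == "en" else base_url.rstrip("/") + f"/{lang}"

def url_entry(base_url):
    """Build one <url> block with an xhtml:link alternate per language."""
    lines = ["  <url>", f"    <loc>{escape(base_url)}</loc>"]
    for lang in LANGS:
        href = escape(localized_url(base_url, lang))
        lines.append(
            f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        )
    lines.append("  </url>")
    return "\n".join(lines)

# Loop this over every page in your database to build the full urlset.
print(url_entry("http://www.example.com/"))
```

With a loop like this, 200,000 pages times 5-6 languages is just a longer run of the same generator, not 200,000 hand-written blocks.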
How to avoid keyword stuffing in dynamic pages?
Our new home page, which is in development, has been identified as keyword stuffed for a particular search word. The problem is that the page includes a dynamic feed pulled in from our database. It would be similar to booking.com coming up as keyword stuffed for the word "hotel" - but hotels are their business, so any instance of the word is probably relevant. Our problem is similar. How detrimental would this be for SEO? And does anyone have any ideas how this can be worked around?
Intermediate & Advanced SEO | | striple0 -
Sitemaps
I am working with a site that has sitemaps broken down very specifically: by page type (article, page, etc.) and also by category. Unfortunately, this is not done hierarchically. Category and page type are separate maps; they are not nested. My questions: Is it detrimental to have two separate sitemaps that point to the same pages? Should we eliminate one of these taxonomies, or maybe just make them hierarchical, i.e. item type -> category -> page title? Is there an issue with having a sitemap index that points to a nested sitemap index? (I don't think so, but might as well be sure.) Thanks, Moz Community! Can't delete my question, but it turns out that isn't how they are structured. Food for thought anyway, I suppose.
Intermediate & Advanced SEO | | MarloSchneider0 -
Dynamically change anchor text and URLs remotely
Hey, I'm looking to create a widget in JavaScript where I can dynamically change the URLs and anchor text that link the widget back to my site, remotely (via PHP), once it spreads. I have heard of people doing this before, but I can't seem to find an example. Does anyone know of any examples, widgets, or anything that can do this?
Intermediate & Advanced SEO | | monster990 -
XML Sitemap Index Percentage (Large Sites)
Hi all, I'm wanting to find out from those who have experience dealing with large sites (10s/100s of millions of pages): what's a typical (or highest) percentage of indexed pages vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows you the pages submitted and indexed for each of your sitemaps. I'm trying to figure out:
1. What the average index percentage is out there.
2. Whether there is a ceiling (i.e. it will never reach 100%).
3. Whether it's possible to improve the indexing percentage further.
Just to give you some background, sitemap index files (according to sitemaps.org) have been implemented to improve crawl efficiency, and I'm wanting to find other ways to improve this further. I've been thinking about looking at the URL parameters to exclude, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and utilise the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure yet whether this is the best path to take, or whether I'm just flogging a dead horse if there is such a ceiling or if I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | | danng0 -
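For background on how sitemap index files split up a large site, here's a minimal Python sketch of chunking a URL list at the protocol's 50,000-URLs-per-file limit. The domain and filenames are placeholders:

```python
SITEMAP_URL_LIMIT = 50000  # per-file cap in the sitemaps.org protocol

def plan_sitemaps(urls, base="https://www.example.com"):
    """Split a large URL list into child sitemaps plus one index listing."""
    chunks = [urls[i:i + SITEMAP_URL_LIMIT]
              for i in range(0, len(urls), SITEMAP_URL_LIMIT)]
    index_entries = [f"{base}/sitemap-{n}.xml"
                     for n in range(1, len(chunks) + 1)]
    return chunks, index_entries

# e.g. 120,000 product URLs -> 3 child sitemaps listed in one index file
urls = [f"https://www.example.com/item/{i}" for i in range(120000)]
chunks, index = plan_sitemaps(urls)
print(len(chunks), len(index))
```

Splitting by section or template (rather than arbitrarily) has the side benefit that the per-sitemap indexed counts in Webmaster Tools tell you which parts of the site Google is skipping.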
Generating 404 Errors but the Pages Exist
Hey, I have recently come across an issue with several of a site's URLs being seen as 404s by bots such as Xenu, SEOmoz, and Google Webmaster Tools. The funny thing is, the pages exist and display fine. This happens on many of the pages which use the Modx CMS, but the index is fine. The WordPress blog in /blog/ all works fine. The only thing I can think of is that I have a conflict in the .htaccess, but troubleshooting this is difficult; any tools I have found online seem useless. I have tried to roll back to previous versions, but it still does not work. Anyone had any experience of similar issues? Many thanks, K.
Intermediate & Advanced SEO | | Found0
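One way a page can "display fine" yet read as a 404 to bots: the server returns a full HTML body with a 404 status line, which is exactly what a misfiring rewrite rule can produce. A small self-contained Python demonstration with a toy local server (not Modx itself):

```python
import http.server
import threading
import urllib.error
import urllib.request

class SoftPage(http.server.BaseHTTPRequestHandler):
    # Serves a normal-looking page but with a 404 status line,
    # mimicking a misconfigured .htaccess rewrite.
    def do_GET(self):
        body = b"<html><body>This page renders fine!</body></html>"
        self.send_response(404)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), SoftPage)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
try:
    urllib.request.urlopen(url)
    status, page = 200, ""
except urllib.error.HTTPError as e:
    status, page = e.code, e.read().decode()

# The body looks like a working page in a browser, but the status
# code is what Xenu, Google, and other bots actually trust.
print(status)                      # 404
print("renders fine" in page)      # True
server.shutdown()
```

Checking the raw status codes (e.g. with `curl -I` or the sketch above) rather than the rendered page is usually the quickest way to confirm this kind of .htaccess conflict.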