What is the best program to create an HTML sitemap?
-
I already have an XML sitemap, so I've been researching how to create an HTML sitemap with over 10,000 URLs for an ecommerce website. Any program, paid or unpaid, is fine; it just needs to produce a sitemap that looks good enough to link from the footer of our website.
-
Thanks!
-
Hi Tyler! Did Umar answer your question?
-
Hey Tyler,
I think you should check this thread for detailed information on HTML sitemaps and the tools you can use:
https://moz.com/community/q/what-are-benefits-to-develop-large-html-sitemap
By the way, the tool listed in that thread is good; I have used it twice. It is a Magento extension, though.
Hope this helps!
Umar
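If a packaged tool or extension doesn't fit, an HTML sitemap can also be generated from the XML sitemap you already have with a short script. Below is a minimal sketch in Python; the file names, the 500-links-per-page size, and the bare-bones markup are placeholder assumptions rather than anything recommended in this thread, and with 10,000+ URLs you would likely want to split the output across pages like this and then style the pages to match your site.

# Minimal sketch: turn an existing XML sitemap into paginated HTML sitemap pages.
# Assumptions (placeholders, not from this thread): the XML sitemap is saved locally
# as sitemap.xml, 500 links per HTML page, output files sitemap-1.html, sitemap-2.html, ...
import xml.etree.ElementTree as ET
from pathlib import Path

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
PAGE_SIZE = 500  # links per HTML page (arbitrary choice)

def load_urls(xml_path):
    """Read every <loc> entry from a standard XML sitemap file."""
    tree = ET.parse(xml_path)
    return [loc.text.strip() for loc in tree.getroot().findall("sm:url/sm:loc", NS) if loc.text]

def write_html_pages(urls, out_dir="html-sitemap"):
    """Write simple paginated HTML pages that list the URLs as links."""
    Path(out_dir).mkdir(exist_ok=True)
    pages = [urls[i:i + PAGE_SIZE] for i in range(0, len(urls), PAGE_SIZE)]
    for n, chunk in enumerate(pages, start=1):
        items = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in chunk)
        html = (f"<html><head><title>Sitemap - page {n}</title></head><body>"
                f"<h1>Sitemap (page {n} of {len(pages)})</h1><ul>{items}</ul></body></html>")
        Path(out_dir, f"sitemap-{n}.html").write_text(html, encoding="utf-8")

if __name__ == "__main__":
    write_html_pages(load_urls("sitemap.xml"))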
Related Questions
-
Indexed, not submitted in sitemap
I have this problem on the site's blog. There is no problem when I check the Yoast plugin settings, but some of my blog content is indexed without appearing in the sitemap. Have you had this problem? What is the cause? My website name is missomister1
Technical SEO | | seomozplan196 -
XML Sitemap Generators
I am looking to use a different sitemap generator that can do 5 thousand or more pages at once. Any recommendations? Thanks guys.
Technical SEO | | Chenzo0 -
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core code, so there are no sub-directories per language. The developer has created sitemaps, which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as /sitemap/uk/sitemap.xml and /sitemap/de/sitemap.xml. I want to add the sitemaps to robots.txt but can't figure out how to do it. Also, should they have placed the sitemaps in a single location with the file name identifying each language, e.g. /sitemap/uk-sitemap.xml and /sitemap/de-sitemap.xml? What is the cleanest way of handling these sitemaps, and can/should I get them into robots.txt?
Technical SEO | | MickEdwards0 -
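On the robots.txt part of the question above: sitemaps are declared with a Sitemap: directive, one line per file, and the value has to be the full absolute URL. A sketch using the directory paths from the question, with example.com as a placeholder for the real domain:

Sitemap: https://www.example.com/sitemap/uk/sitemap.xml
Sitemap: https://www.example.com/sitemap/de/sitemap.xml

Either directory scheme can be referenced this way; the main thing is that each Sitemap: line uses the full URL of the file that was actually uploaded and matches what was submitted in the per-language GWT accounts.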
What are the best tools for back links?
I am new to SEO; please help me choose the right tools for backlinks. I am thinking of buying Ultimate Demon. Should I buy it or not? I have a range of YouTube videos to rank.
Technical SEO | | Sajiali0 -
Getting a Video Sitemap Indexed
Hi, A client of mine submitted a video sitemap to Google Webmaster Tools a couple of months ago, but the videos are still not being indexed by Google. All of the videos sit on the one page but have unique URLs in the sitemap. Does anybody know a reason why they are not being indexed? Thanks, David
Technical SEO | | RadicalMedia0 -
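For anyone comparing against their own file, a single entry in a Google video sitemap generally looks like the sketch below (all URLs and text here are placeholders). Google expects a thumbnail location, title, description, and either a content or player location for each video, and the <loc> should be the page the video is actually embedded on, which is worth double-checking when many videos share one page.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/example-video.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>Short placeholder description of the video.</video:description>
      <video:content_loc>https://www.example.com/media/example-video.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>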
Best Joomla SEO Extensions?
My website is a Joomla-based website. My designer is good, but I don't think he knows that much about SEO, so I doubt he added any extensions that can assist with SEO. I assume there are some good ones that can help my site. Does anyone know which Joomla extensions are must-haves?
Technical SEO | | damon12120 -
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again - for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
Add a meta robots noindex,follow tag to each URL you want de-indexed
Use GWT to help remove the pages
Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
DISALLOW: */beerbottles/
or this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question" Thanks! Dave
Technical SEO | | goodnewscowboy0 -
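On the robots.txt syntax in that last question: a plain Disallow value is matched as a prefix of the URL path, so DISALLOW: /beerbottles/ would only block URLs whose path starts with /beerbottles/, which the sample URLs above do not. Googlebot does support the * wildcard, so a sketch of a rule that would catch /beers/brandofbeer/beerbottles/... looks like this:

User-agent: *
Disallow: /*beerbottles/

One caveat worth keeping in mind: once a URL is disallowed, Googlebot can no longer recrawl it to see a meta noindex tag, so the de-indexing steps (noindex and/or GWT removal) should have done their job before the block goes in.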
What are the best tools for site architecture?
Looking for tools that can visualise a site's architecture (ideally automated). Also looking for tools that can visualise internal linking structures.
Technical SEO | | Motionlab0