Sitemap Generator Tool
-
We have developed a very large domain with well over 500 pages that need to be indexed. The tool we usually use to create a sitemap has a limit of 500 pages. Does anyone know of a good tool we can use to generate a sitemap (both text and XML) that doesn't have a page limit?
Thanks!
-
Try the DYNO Mapper sitemap generator. It creates a visual sitemap, provides a content inventory and Google Analytics integration, and lets you collaborate with other users.
-
Hi Reggie,
You should try out http://www.web-site-map.com/, which can generate XML sitemaps of 3,500+ pages.
-
Would that tool happen to be Screaming Frog? The full version of Screaming Frog lets you create a sitemap of up to 49,999 pages. See the bottom of this page.
An individual license costs £99 for the year, but I'll tell you now it's worth every single penny about 5 times over. Couldn't live without the tool - here's a list of its other features if you're unaware.
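If no off-the-shelf tool fits, a sitemap without an artificial page cap is also easy to script once you have a URL list. A minimal sketch in Python (the URLs and counts are hypothetical; note that the sitemaps.org protocol itself caps each file at 50,000 URLs, so larger sets must be split across files and tied together with a sitemap index):

```python
# Build XML sitemaps from a URL list, splitting at the 50,000-URL
# per-file limit defined by the sitemaps.org protocol.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50000  # protocol limit per sitemap file

def build_sitemaps(urls):
    """Return one XML string per sitemap file needed for `urls`."""
    sitemaps = []
    for start in range(0, len(urls), MAX_URLS):
        urlset = ET.Element("urlset", xmlns=NS)
        for url in urls[start:start + MAX_URLS]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
        sitemaps.append(ET.tostring(urlset, encoding="unicode"))
    return sitemaps

# 600 hypothetical pages still fit comfortably in a single file.
pages = [f"https://www.example.com/page-{i}" for i in range(600)]
files = build_sitemaps(pages)
print(len(files))  # → 1
```

A text sitemap is even simpler - one URL per line - so the same URL list can feed both formats.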
Related Questions
-
Getting 'Indexed, not submitted in sitemap' for around a third of my site. But these pages ARE in the sitemap we submitted.
As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we submitted and the URLs are definitely there. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham The sitemap it is listed in: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml
Technical SEO | TTYH0
-
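One quick sanity check for "Indexed, not submitted in sitemap" reports is to parse the submitted sitemap yourself and confirm the flagged URL is listed byte-for-byte - trailing slashes, http vs https, and letter case all make a URL "different" to Google. A rough sketch, run here against an inline sitemap string rather than the live file:

```python
# Extract every <loc> from a sitemap and test exact membership.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_xml):
    """Return the set of <loc> values in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

# Hypothetical inline sitemap; in practice, fetch the submitted file.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.teacherstoyourhome.co.uk/german-tutor/Egham</loc></url>
</urlset>"""

urls = sitemap_urls(sitemap)
print("https://www.teacherstoyourhome.co.uk/german-tutor/Egham" in urls)  # True
print("https://www.teacherstoyourhome.co.uk/german-tutor/egham" in urls)  # False: case differs
```

If the exact string really is present, the report may just be lagging; Search Console's sitemap attribution is often slow to refresh.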
Duplicate Meta Titles and Descriptions Issue in Google Webmaster Tool
Hello All, We have a site, http://www.bargains-online.com.au/, with some categories plus filter options on the left side (filter by price, by brand, etc.). We have already set rel=canonical tags on all filtered pages, but all those pages still show duplicate page title and description warnings in the HTML Improvements section of Google Webmaster Tools. For example: http://www.bargains-online.com.au/pressure-cleaners.html We've set the rel=canonical tag on the pages below. http://www.bargains-online.com.au/pressure-cleaners/l/manufacturer:black-eagle.html http://www.bargains-online.com.au/pressure-cleaners/l/price:2,100.html http://www.bargains-online.com.au/pressure-cleaners/l/price:3,100.html If anybody has a solution for this, please share it with us. Thanks, Akshay
Technical SEO | akshaydesai0
-
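For reference, a canonical link element on one of the filtered pages would look like this - it must sit in the `<head>` of the filtered URL and point at the unfiltered category page:

```html
<!-- In the <head> of /pressure-cleaners/l/manufacturer:black-eagle.html -->
<link rel="canonical" href="http://www.bargains-online.com.au/pressure-cleaners.html" />
```

Worth remembering that rel=canonical is a hint rather than a directive, so HTML Improvements warnings can linger for some time after the tags are correctly in place.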
Why xml generator is not detecting all my urls?
Hi Mozzers, After adding 3 new pages to example.com, when generating the XML sitemap I wasn't able to locate those 3 new URLs. This is the first time this has happened. I have checked the meta tags of these pages and they are fine - no meta robots set up! Any thoughts or ideas on why this is happening and how to fix it? Thanks!
Technical SEO | Ideas-Money-Art0
-
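Besides meta robots, generators commonly skip pages that are blocked in robots.txt, served with an X-Robots-Tag header, or not linked from any crawlable page. It's also worth checking the *rendered* HTML rather than the template, since a plugin can inject a robots tag you never wrote. A small sketch of that check using Python's standard HTML parser (the sample page is made up):

```python
# Detect a meta robots "noindex" directive in an HTML document.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Records whether a <meta name="robots"> tag contains "noindex"."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            if "noindex" in (attrs.get("content") or "").lower():
                self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

# Hypothetical page source; in practice, fetch each missing URL.
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(page))  # True
```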
Duplicate Titles and Sitemap rel=alternate
Hello, Does anyone know why I still have duplicate titles after crawling with Moz (Google Webmaster Tools shows the same), even after implementing (a week or two ago) a new sitemap with the rel=alternate attribute for languages? The duplicates are in titles like http://socialengagement.it/su-di-me and http://socialengagement.it/en/su-di-me. The sitemap is at socialengagement.it/sitemap.xml (please note the formatting somehow does not display correctly; you should view the source code to double-check that it's done properly - it was made by hand by me). Thanks for the help! Eugenio
Technical SEO | socialengaged0
-
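For what it's worth, hreflang annotations in a sitemap are expected in a specific shape: each `<url>` entry lists every language version of itself, including a self-reference, via `xhtml:link`. A fragment along those lines for the two URLs above:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://socialengagement.it/su-di-me</loc>
    <xhtml:link rel="alternate" hreflang="it" href="http://socialengagement.it/su-di-me"/>
    <xhtml:link rel="alternate" hreflang="en" href="http://socialengagement.it/en/su-di-me"/>
  </url>
  <url>
    <loc>http://socialengagement.it/en/su-di-me</loc>
    <xhtml:link rel="alternate" hreflang="it" href="http://socialengagement.it/su-di-me"/>
    <xhtml:link rel="alternate" hreflang="en" href="http://socialengagement.it/en/su-di-me"/>
  </url>
</urlset>
```

Note that hreflang groups alternates for serving purposes but does not deduplicate titles in crawl reports; pages in different languages should carry distinct titles anyway.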
Site (Subdomain) Removal from Webmaster Tools
We have two subdomains that have been verified in Google Webmaster Tools. These subdomains were used by 3rd parties which we no longer have an affiliation with (the subdomains no longer serve a purpose). We have been receiving an error message from Google: "Googlebot can't access your site. Over the last 24 hours, Googlebot encountered 1 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 100.00%". I originally investigated using Webmaster Tools' URL Removal Tool to remove the subdomain, but there are no indexed pages. Is this a case of simply 'deleting' the site from the Manage Site tab in the Webmaster Tools interface?
Technical SEO | Cary_PCC0
-
NoIndex user generated pages?
Hi, I have a site, downorisitjustme (dot) com. It has over 30,000 pages in Google, generated by people searching to check whether a specific site is working or not and then possibly posting a deep link to the results page on a message board or similar, which is why the pages have been picked up. Am I best to noindex the res.php page where all the auto-generated content shows up, and have only the main static pages available to be indexed?
Technical SEO | Wardy0
-
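Noindexing the auto-generated results page is a reasonable approach. One way is a robots meta tag in whatever `<head>` res.php outputs (a sketch, assuming you control that template):

```html
<!-- In the <head> emitted by res.php -->
<meta name="robots" content="noindex, follow">
```

Alternatively, an `X-Robots-Tag: noindex` HTTP response header achieves the same thing without touching the markup. Either way, the existing pages drop out gradually as Google recrawls them.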
Schema Markup and Google's Rich Snippet Tool
Has anyone ever used the snippet tool and gotten the following error: "could not fetch website"? When using the tool with a URL that has no markup present, it will show that error; if part of the markup is wrong, it will diagnose it accordingly. I did a search online and found limited info - in one case someone had this error, but when other users tested the same URL they did not get it.
Technical SEO | andrewv0 -
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and as expected they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The info is that if a URL keeps returning 404, it will automatically be removed. Well, I don't know how many times Google needs to get that 404 in order to drop a URL and link that haven't existed for 18-24 months?!! Thanks.
Technical SEO | RiceMedia0
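As a side note, if you actively want Google to forget URLs sooner, serving 410 Gone rather than 404 Not Found is generally treated as a more definitive removal signal. A sketch using Apache's mod_alias (the paths here are hypothetical):

```apache
# .htaccess - mark removed URLs as permanently gone (HTTP 410)
Redirect gone /old-page.html
# Or retire a whole section by regex:
RedirectMatch gone ^/retired-section/
```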