Best XML Sitemap Generator for Mac?
-
Hi all,
Recently moved from PC to Mac when starting a new job. One of the things I'm missing from my PC is G Site Crawler, and I haven't yet found a decent equivalent for the Mac.
Can anybody recommend something as good as G Site Crawler for the Mac? I.e., I need the flexibility to exclude by URL parameter, etc.
Cheers everyone,
Mark
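For anyone tempted to script this themselves while hunting for a Mac tool, the sort of parameter-based exclusion asked for here can be sketched in a few lines of Python. The URL list and the excluded parameter names below are purely illustrative, not anything G Site Crawler itself uses:

```python
from urllib.parse import urlparse, parse_qs

# Illustrative exclusions - the parameter names you would actually filter
# on depend on your site (session IDs, sort orders, tracking tags, etc.)
EXCLUDED_PARAMS = {"sessionid", "sort", "utm_source"}

def keep_url(url):
    """True if the URL carries none of the excluded query parameters."""
    params = parse_qs(urlparse(url).query)
    return not (EXCLUDED_PARAMS & params.keys())

urls = [
    "http://www.example.com/products/widgets",
    "http://www.example.com/products/widgets?sort=price",
    "http://www.example.com/about?utm_source=newsletter",
]

# Only the first URL survives the filter
sitemap_urls = [u for u in urls if keep_url(u)]
```

The same filter can then feed whatever sitemap writer you end up using.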
-
Thanks Crimson Penguin (really hope that's your real name!)
I have used sitemapdoc.com in the past; the only problem is it limits you to 500 URLs. I really wish G Site would do a Mac version - same with Xenu. There's just nothing out there that does the job as well as those two on the Mac.
Cheers for your feedback - hoping somebody else can come up with something golden?
-
Of course I should also add: if you really want G Site Crawler, you could run Windows in a virtual machine on your Mac and carry on using it that way. I run Windows on my Mac using VMware Fusion.
-
One of my favourites is an online generator called sitemapdoc:
I believe it has all the features you are looking for.
Adam.
Related Questions
-
Best way to fix duplicate content issues
Another question for the Moz community. One of my clients has 4.5k duplicate content issues. For example: http://www.example.co.uk/blog and http://www.example.co.uk/index.php?route=blog/blog/listblog&year=2017. Most of the issues are coming from product pages. My initial thought is to set up 301 redirects in the first instance and, if the issue persists, add canonical tags. Is this the best way of tackling the issue?
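On the 301 side, a minimal sketch of what that redirect might look like in Apache's .htaccess, assuming mod_rewrite is available and that the parameterised blog URLs all carry a query string starting with route=blog (the exact pattern would need checking against the client's real URL set):

```apache
RewriteEngine On
# Send the parameterised blog URL to its clean equivalent with a 301,
# dropping the query string (the trailing "?" clears it)
RewriteCond %{QUERY_STRING} ^route=blog/ [NC]
RewriteRule ^index\.php$ /blog? [R=301,L]
```

Where a 301 isn't practical (e.g. the parameterised version has to keep working for users), a `<link rel="canonical" href="http://www.example.co.uk/blog" />` in the duplicate's `<head>` consolidates the signals instead.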
Technical SEO | Laura-EMC
Dynamic Url best approach
Hi. We are currently making changes to our travel site so that when someone runs a search, that search is stored, and the user can paste its URL back into their browser to find the same search again. The URL will be dynamic for every search, so to avoid duplicate content I wanted to ask the best approach to creating the URLs. An example URL is: package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&sdate=20150812&edate=20150819&adult=2&child=0&infant=0&fsearch=first&directf=false&nights=7&tsdate=&rooms=1&r1a=2&r1c=0&r1i=0&&dest=3&desid=1&rating=&htype=all&btype=all&filter=no&page=1 I wanted to know if people have previous experience with something like this and what would be the best option for SEO. Should we create the URL with a # (as I've read this stops Google crawling after the #)? Should we block the folder in robots.txt? And are there any other areas I should be aware of in order to stop duplicate content and 404 pages once the URL/holiday search is no longer valid? Thanks, E
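If the stored searches all live under the path shown above, one option alongside canonical tags is simply to keep crawlers out of that folder in robots.txt. This sketch assumes nothing you do want indexed shares the hotelFilters path:

```
User-agent: *
Disallow: /package-search/holidays/hotelFilters/
```

On the # question: crawlers generally strip the fragment rather than crawl past it, so fragment-based search URLs would be invisible to Google entirely - which solves the duplicate-content worry but also removes any chance of those pages earning search traffic.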
Technical SEO | Direct_Ram
Will an XML sitemap override a robots.txt
I have a client whose robots.txt file is blocking an entire subdomain, entirely by accident. Their original solution, not realizing the robots.txt error, was to submit an XML sitemap to get their pages indexed. I did not think this tactic would work, as the robots.txt would take precedence over the XML sitemap. But it worked... I have no explanation as to how or why. Does anyone have an answer to this, or any experience with a website that has had a clear Disallow: / for months, yet somehow has pages in the index?
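One possible explanation: robots.txt controls crawling, not indexing. Google can still put a disallowed URL into the index - typically as a bare, title-only listing - if it discovers the URL through an XML sitemap or external links; it just can't fetch the page content. So a file like this blocks the crawler without necessarily keeping URLs out of the results:

```
User-agent: *
Disallow: /
```

If the goal were the opposite - keeping pages out of the index - the usual tool is a meta robots noindex on pages that remain crawlable, since a blocked crawler can never see the noindex.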
Technical SEO | KCBackofen
Noob 101 - Sitemaps
Hi guys, looking for some sitemap help. I'm running two separate systems, so my auto-generated sitemap on the main system has a few holes in it. I'd like to submit it to Webmaster Tools anyway, and then plug the holes by adding the missing pages via 'Fetch as Google'. Does that make sense, or will Google ignore one of them? Many thanks, Idiot
Technical SEO | uSwSEO
Best TLD for china
In China there are two commonly used TLDs: .cn and .com.cn. We own both versions of a new domain. Does anyone know of research into which one is the better TLD "in the eyes" of the search engines Baidu and Google? Or is there a methodology for selecting the best? Thanks!
Technical SEO | Paul-G
What to do with content that performs well in SERPs, but is dynamically generated?
A new client developed an application that generates dynamic content. They were hit hard by Panda, and I believe it is in part due to this application. About 500 of the URLs from this application perform well in SERPs (rank well, drive traffic to the site, low bounce rate, high page views per visit, etc.), and there are an additional 9,000 URLs (and growing) in the index that don't drive any organic traffic. We are thinking of making the 500 URLs that perform well into static pages and de-indexing the rest. What are your thoughts on this?
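For the de-indexing half of that plan, the usual sketch is a meta robots tag on the low-value dynamic pages - noting that those pages must stay crawlable (not robots.txt-blocked) for Google to ever see the tag:

```html
<!-- On each of the ~9,000 low-value dynamic pages -->
<meta name="robots" content="noindex, follow">
```

The `follow` keeps any link equity flowing while the pages drop out of the index.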
Technical SEO | nicole.healthline
How do i Organize an XML Sitemap for Google Webmaster Tools?
OK, so I used an XML sitemap generator tool, xml-sitemaps.com, for Google Webmaster Tools submission. The problem is that the priorities are all out of whack. How on earth do I organize it with thousands of pages? Should I be spending hours organizing it?
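Rather than hand-setting thousands of priorities, one illustrative approach is to derive them from URL depth and generate the file with a short script. The depth rule here (1.0 for the homepage, minus 0.2 per path segment, floored at 0.2) is just an example, not a standard:

```python
from urllib.parse import urlparse
from xml.sax.saxutils import escape

def priority(url):
    # Count non-empty path segments: homepage -> 0, /blog -> 1, etc.
    segments = [s for s in urlparse(url).path.split("/") if s]
    return max(1.0 - 0.2 * len(segments), 0.2)

def sitemap(urls):
    entries = "\n".join(
        "  <url><loc>%s</loc><priority>%.1f</priority></url>"
        % (escape(u), priority(u))
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries
        + "\n</urlset>"
    )
```

Bear in mind a single sitemap file tops out at 50,000 URLs, so very large sites need a sitemap index file pointing at several of these.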
Technical SEO | schmeetz
Best Joomla SEO Extensions?
My website is a Joomla-based website. My designer is good, but I don't think he knows that much about SEO, so I doubt he added any extensions that can assist with SEO. I assume there are some good ones that could help my site. Does anyone know which Joomla extensions are must-haves?
Technical SEO | damon1212