Tool to Generate All the URLs on a Domain
-
Hi all,
I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, the free tool only works for websites with under 500 URLs, and the paid version doesn't offer what we're looking for either. I'm hoping someone can help with a recommendation.
We're looking for a tool that can:
- Crawl and list all the indexed URLs on a domain, including .pdf and .doc files, ideally exporting the list to an .xls or .txt file
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find anything that isn't tailored to managing a single domain and that can also crawl a large volume of content.
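To make the requirement concrete, here is a minimal sketch of that kind of crawl in Python (assuming the `requests` and `beautifulsoup4` packages; the start URL is hypothetical). It is an illustration of the idea, not a production crawler: it ignores robots.txt, crawl-rate limits and JavaScript-rendered links, and simply writes every internal URL it finds, including links to .pdf and .doc files, to a text file.

```python
# Minimal breadth-first crawl of one domain, writing every internal URL
# (including .pdf / .doc links) to urls.txt. Illustrative sketch only:
# no robots.txt handling, no rate limiting, no JavaScript rendering.
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # hypothetical starting point
DOMAIN = urlparse(START_URL).netloc
MAX_PAGES = 5000                          # safety cap for the sketch

seen = {START_URL}
queue = deque([START_URL])
found = []

while queue and len(found) < MAX_PAGES:
    url = queue.popleft()
    found.append(url)

    # Don't try to parse binary documents; just record their URLs.
    if url.lower().endswith((".pdf", ".doc", ".docx")):
        continue

    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue

    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link, _ = urldefrag(urljoin(url, a["href"]))  # resolve and strip #fragments
        if urlparse(link).netloc == DOMAIN and link not in seen:
            seen.add(link)
            queue.append(link)

with open("urls.txt", "w") as f:
    f.write("\n".join(found))

print(f"Wrote {len(found)} URLs to urls.txt")
```

Running a loop like this once per site would cover the multi-domain requirement, although a dedicated crawler handles redirects, canonical tags and export formats far more robustly.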
-
@PatrickDelehanty Screaming Frog not only covers the two requirements you listed but also offers a wide range of additional capabilities. I recommend exploring it for yourself! Best of luck!
-
It seems to crawl all the WordPress folders and media files.
Is there a tool that will list just your live website URLs? I'm creating a sitemap and doing a mass content re-organising exercise, so I want a list of the URLs in Excel. Any tips welcome
Thanks
Sarah
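If a crawl has already produced a raw URL list (such as the urls.txt file sketched above), stripping out WordPress system paths and media files and saving what's left as a CSV that opens in Excel takes only a few lines. A rough sketch, where the excluded paths and extensions are just common examples rather than an exhaustive list:

```python
# Filter a raw URL list (urls.txt) down to "live" page URLs and save them as a
# CSV that opens directly in Excel. The exclusion patterns are illustrative only.
import csv

EXCLUDED_PATHS = ("/wp-content/", "/wp-includes/", "/wp-admin/", "/feed/")
EXCLUDED_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".svg", ".css", ".js")

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

pages = [
    u for u in urls
    if not any(path in u for path in EXCLUDED_PATHS)
    and not u.lower().endswith(EXCLUDED_EXTENSIONS)
]

with open("live-pages.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["URL"])
    writer.writerows([u] for u in pages)

print(f"Kept {len(pages)} of {len(urls)} URLs")
```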
-
2nd vote for Screaming Frog. I've tried a lot of tools to pull info on all the URLs, and this one is by far the best for the job.
-
Hi Felicia
Try Screaming Frog - it crawls the entire site (you can configure how you want it to crawl your site) and can generate an XML sitemap for you.
The tool goes well beyond those two requirements and can do a lot more. I suggest you check it out! Good luck!
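As a side note, once you have a clean URL list, writing a basic XML sitemap yourself is also straightforward. A minimal sketch, assuming a urls.txt file with one URL per line (the sitemap protocol caps each file at 50,000 URLs, so very large sites need a sitemap index on top of this):

```python
# Write a minimal sitemap.xml from a plain list of URLs (one per line).
# Sketch only: no <lastmod>/<priority> tags and no sitemap-index handling.
from xml.sax.saxutils import escape

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```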
-
Related Questions
-
International URL Structures
Hi everyone! I've read a bunch of articles on the topic, but I can't seem to figure out a solution that works for this specific case. We are creating a site for a service agency with offices around the world - the site has a global version (in English, French and Spanish) and some country-specific versions. Here is where it gets tricky: in some countries each office has its own version of the site, so for Canada, for example, we have both a French and an English version. For cost and maintenance reasons, we want a single domain: www.example.com. We want to be able to indicate via Search Console that each subdirectory is targeted at a different country, but how should we go about it? I've seen some examples with subfolders like this:
- Global FR: www.example.com/fr-GL
- Canada FR: www.example.com/fr-ca
- France: www.example.com/fr-fr
Does this work? It seems to make more sense to use subdirectories with a gTLD, but I'm not sure how that would distinguish my global French version from the France site:
- Global FR: www.example.com/fr
- France: www.example.com/fr/fr
Am I going about this the right way? The more I dig into the issue, the less it seems there is a good solution for indicating to Google which version of my site is geo-targeted to each country. Thanks in advance!
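As a rough illustration of how the subdirectory approach above is usually signalled, here is a short Python sketch that prints hreflang alternate tags for one set of equivalent pages; every variant should carry the full set, including a self-reference. The locale-to-URL mapping is loosely based on the example URLs in the question and is illustrative only. The same alternates can also be declared in the XML sitemap rather than on-page.

```python
# Print hreflang <link rel="alternate"> tags for one set of equivalent pages.
# The mapping below is illustrative only, loosely based on the question's examples.
ALTERNATES = {
    "x-default": "https://www.example.com/",     # global fallback version
    "fr": "https://www.example.com/fr/",         # global French
    "fr-ca": "https://www.example.com/fr-ca/",   # French - Canada
    "fr-fr": "https://www.example.com/fr-fr/",   # French - France
    "es": "https://www.example.com/es/",         # global Spanish
}

for hreflang, href in ALTERNATES.items():
    print(f'<link rel="alternate" hreflang="{hreflang}" href="{href}" />')
```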
Technical SEO | | sarahcoutu150 -
Will doing a 301 redirect for one domain to another give the latter domain the formers links?
I have some websites that I built a few years ago that are still in existence, but I no longer have access to them as they weren't hosted by me. These sites all carry a "Designed by Me" credit in the footer with a link to my (now old) website. I have since done 301 redirects on the domain names used in those footers so they point directly to my new site. However, will these websites now show up in Google Webmaster Tools, for example, as external links to my site?
Technical SEO | | mickburkesnr0 -
Generating an XML sitemap?
Hi - what is everyone's preferred method of generating an XML sitemap? Just wondering if one piece of software is better than the others.
Technical SEO | | TheZenAgency1 -
URL redirecting domains
Hi - is there anything wrong or dangerous about forwarding a clutch of domains to a sub-page (landing page) on a different domain? Say Brand X buys Brand Z and wants to close down the Brand Z site, but have the Brand Z domain forward to a landing page (explaining the acquisition) on the Brand X site. In addition, Brand Z had a few related but unused domains forwarding to the Brand Z domain, and now also wants those forwarded to the new landing page on Brand X. Since the reasons for this forwarding are legitimate company reasons relating to an acquisition, I would have thought it should be OK, but can anyone think of a reason it could be bad? I remember that in the old days people used to redirect domains for SEO reasons, so I'm worried that forwarding a load of domains could raise some sort of negative flag with big G. Also, do domain redirects transfer the authority/juice from the old site/domain to the new destination page (the new landing page on the Brand X site), similar to how a 301 redirect works? Many thanks, Dan
Technical SEO | | Dan-Lawrence0 -
Domain Forwarding / Multiple Domain Names / or Rebuild Blogs on them
I am considering forwarding three very aged and valuable domain names to my main site. There were once over 100 blog posts on each blog, and each one has a page authority of 45 and a domain authority of 37. My question is: should I put up three blogs on the domains and link them to my site, or should I just forward the domains to my main site? Which will provide more value? I have the capability to have someone blog on them every day. However, I do not have access to any of the old blog posts; I guess I could scrape them off archive.org. Any advice would be appreciated. Scott
Technical SEO | | WindshieldGuy-2762210 -
Transfer a Main Domain to a Sub-Domain
My IT department tells me they want to transfer my main site domain, which has been in existence since 1999 as an e-commerce site (maindomain.com) to a sub-domain (www2.maindomain.com) or a completely new domain (newdomain.net). This is because we are launching a new website and B2C e-commerce engine, but we still have to maintain the legacy B2B e-commerce engine which contains hard-coded URLs, and both systems can't use the same domain. I've been researching the issue across SEOmoz, but I haven't come across this exact type of scenario (mostly I've seen a sub-domain to new domain). I see major problems with their proposal, including negative SEO impact, loss of domain authority/ranking and issues with branding. Does anyone know the exact type of impact I can expect to see in this scenario and specific steps I should go about to minimize the impact? Btw, I will be using Danny Dover's guide on properly moving domains where appropriate. Thanks!
Technical SEO | | AscendLearning0 -
Is there actual risk to having multiple URLs that frame in the main URL? Or is it just bad form and a waste of money?
The client has many URLs that just frame in the main site. It seems like a total waste of money, but if they are frames, is there an actual risk?
Technical SEO | | gravityseo0 -
Redirecting root domains to sub domains
Mozzers: We have an instance where a client is looking to 301 www.example.com to www.example.com/shop. I know of several issues with this, but wondered if anyone could chip in with any previous experience of doing so, and what outcomes, positive and negative, came out of it. Issues I'm aware of:
- The root domain URL is the most-linked page, and an HTTP 301 redirect only passes about 90% of the value, so you'll lose 10-15% of the link value of those links.
- Navigational queries (i.e. the "domain part" of "domain.tld") are less likely to produce Google sitelinks.
- Less deep crawling: Google crawls top-down, starting with the most-linked page, which will most likely be your domain URL. As this does not exist, you waste this zero level of crawling depth.
- robots.txt is only allowed on the root of the domain.
Your help, as always, is greatly appreciated. Sean
Technical SEO | | Yozzer0