How to extract URLs from a site (without bringing the server down!)
-
Hi everybody.
One of my clients is migrating to a new ecommerce platform, and we need to get a list of URLs from the existing site to start mapping out the 301 redirects. Usually, I'd use a tool like Xenu or Integrity to crawl the site and output a list.
However, the database and server setup is so bad that it can't handle the requests from these tools, and they take the site down. This, unsurprisingly, is one of the reasons for the migration.
Does anybody know of a way to get a full list of URLs without having to make a bunch of HTTP requests that will kill the site? Any advice would be much appreciated!
-
Just a follow-up to my endorsement. It looks like Screaming Frog will let you control the number of pages crawled per second, but to do a full crawl you'll need to get the paid version (the free version only crawls 500 URLs):
http://www.screamingfrog.co.uk/seo-spider/
It's a good tool, and nice to have around, IMO.
-
Copy the site, set it up on a staging server and run http://www.xml-sitemaps.com/ on it?
-
Why not find the links pointing to the site? You'll only need to 301 the URLs that have external links; let the rest 404. I use Bing Webmaster Tools, as it has the most complete collection of inbound links, IMO, and it also exports to CSV.
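For what it's worth, here's a rough sketch of turning that kind of export into Apache-style redirect rules, in Python. The CSV column name ("Target URL"), the filename, and the old-to-new URL mapping are all assumptions to swap for your real export and your redirect spreadsheet:

import csv
from urllib.parse import urlparse

# Hypothetical old-to-new mapping; in practice this comes from the
# redirect-mapping spreadsheet for the new platform.
NEW_URLS = {
    "http://www.example.com/old-category/old-page": "/new-page",
}

def redirect_lines(csv_path, url_column="Target URL"):
    # Yields "Redirect 301 /old-path /new-path" lines for each externally
    # linked URL in the export. The column name is a guess -- check the
    # header row of the CSV you actually download.
    seen = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            old = row.get(url_column, "").strip()
            if not old or old in seen:
                continue
            seen.add(old)
            new = NEW_URLS.get(old)
            if new:
                yield "Redirect 301 %s %s" % (urlparse(old).path, new)

for line in redirect_lines("bing_inbound_links.csv"):
    print(line)

The same loop could just as easily emit nginx rewrite lines; the point is that a deduplicated CSV of externally linked URLs is already most of your redirect map.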
-
Thanks Yannick, I don't know why I didn't think of using a scraper! Can you recommend any good code (PHP perhaps)?
-
Scrape Google?
-
Make your own scraper and keep the requests per second really low?
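To sketch the idea (Python here, since the OP asked about code, though the same approach works in PHP): a breadth-first crawler that sleeps between requests. The start URL, user-agent string, and one-second delay are placeholders to tune for the server in question:

import time
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

START_URL = "http://www.example.com/"  # placeholder: the site being migrated
DELAY_SECONDS = 1.0  # keep this generous so a fragile server survives

class LinkExtractor(HTMLParser):
    # Collects href values from every <a> tag on a page.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, delay=DELAY_SECONDS):
    # Breadth-first crawl of a single host, one request at a time,
    # pausing between requests so the server is never hammered.
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue:
        url = queue.pop(0)
        time.sleep(delay)  # the whole point: throttle every request
        try:
            req = Request(url, headers={"User-Agent": "redirect-mapping-crawler"})
            with urlopen(req, timeout=30) as resp:
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # dead link or server hiccup; skip it
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return sorted(seen)

for found in crawl(START_URL):
    print(found)

At one request per second, even a 10,000-page site finishes in under three hours, which is usually an acceptable trade for not taking the site down.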
-
Maybe the site has an automated sitemap somewhere?
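Worth checking, since pulling a sitemap is a single request. Here's a quick sketch of listing every URL from a standard sitemap.xml; the sitemap path below is a guess, and the site's robots.txt often points at the real one:

import xml.etree.ElementTree as ET
from urllib.request import urlopen

# Placeholder path -- check robots.txt for the real sitemap location.
SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL, timeout=30) as resp:
    tree = ET.parse(resp)

# This handles a plain <urlset>; a <sitemapindex> instead lists further
# sitemap files, each of which can be fetched and parsed the same way.
for loc in tree.findall(".//sm:loc", NS):
    print(loc.text.strip())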
-
Google Webmaster Tools -> download the "Internal Links" table
-
Related Questions
-
Our client's Magento 2 site has lots of obsolete categories. Advice on SEO best practice for setting server-level redirects so we can delete them?
Our client's Magento website has been running for at least a decade, so it has a lot of legacy categories for brands they no longer carry. We're looking to trim down the number of unnecessary URL redirects in Magento, so my question is: is there an SEO-efficient way to set up permanent redirects at the server level (nginx) that Google will crawl, allowing us at some point to delete the categories and the Magento URL redirects? And if this is good practice, can you then delete the server redirects once Google has registered them as permanent?
Technical SEO | WillyGx
-
Messy older site
I am taking over a website that doesn't have any canonical tags and has spotty redirects. It looks like they have http://, https://, www and non-www pages indexed, but GA is only set up for the http:// non-www home page. Should all versions of the site be set up in GA and Search Console? I think so, but wanted to confirm. Thanks in advance.
Technical SEO | SpodekandCo
-
Exclude URL Parameters?
Hello, I am new to SEO and I am trying to understand the basics of URL parameters. Let's assume that I have an ecommerce site with categories (Category1, Category2), views (listview=1, listview=2), sort orders (OrderBy=1, OrderBy=2), and pages (pg=1, pg=2). Why should I ask Google to index pages with different views and listing orders? What benefit does the site get from having the same content in a different order? I am not sure, but maybe I only need the paginated pages so that Google can "travel" between them? For example: www.mydomain.com/books/pg=1, www.mydomain.com/books/?order=date, www.mydomain.com/books/?listview=1. The paginated (pg) pages will always include the same products regardless of sort order and list view, so why should Google index the content again? Furthermore, since the last time Google indexed pg=1, the products have changed. Thank you in advance.
Technical SEO | ArisGast
-
How to link: site.com/blog or site.com/blog/
Hello friends, I have a very basic question but I cannot find the right answer... I have been doing link building for my blog using the address "mysite.com/blog", but now I'm not sure if it is better to do the link building to "mysite.com/blog/". Is there any difference? Thanks...
Technical SEO | lans2787
-
What is wrong with my site?
I have been working hard for over two months on my sites in SEOmoz and have seen some nice results on some of them (www.etraxc.com/ and www-my-etraxc.com, for instance). Still, I am really frustrated by www.classroomconnection.us/. I can't even get on the first page for the search term "classroom connection." I would love some help on this one. On a related note, does it help to have links to YouTube videos about the content? If so, how do I ensure that this piece is working well for me? Thanks a ton!
Technical SEO | bobbabuoy
-
Site Crawl
I was wondering if there is a way to use SEOmoz's tool to quickly and easily find all the URLs on your site, not just the ones with errors. The site that I am working on does not have a sitemap. What I am trying to do is find all the URLs along with their titles and description tags. Thank you very much for your help.
Technical SEO | pakevin
-
Dynamic Parameters in URL
I have received lots of warnings because of long URLs. Most of them arise because my website has many attributes to filter products by, and each time the user clicks on one, it is added to the URL. Please see my site here: www.theprinterdepo.com. The warning reads: "Although search engines can crawl dynamic URLs, search engine representatives have warned against using over 2 parameters in any given URL." My question to the community is: what should I do? These attributes really help the user find products more easily. I could remove some of the attributes, but I am not sure if my ecommerce solution (Magento) allows changing this behavior so that it doesn't use query-string parameters.
Technical SEO | levalencia1
-
Site Architecture Trade Off
Hi all. I'm looking for some feedback regarding a site architecture issue I'm having with a client. They are about to go through a redesign, and as part of that we're restructuring the site URLs and amending/adding pages. At the moment they rank well off the back of the original PPC landing pages that were added to the site, such as www.company.com/service1, www.company.com/service2, etc. The developer, from a developer's point of view, wished to create a logical site architecture with multiple levels of directories. I've suggested this probably isn't the best way to go, especially as the site isn't that large (200-300 pages); the key pages we're looking to rank should be as high up the architecture as we can make them, and this amendment could hurt their current high rankings. It looks like the trade-off may be that the client is willing to let some pages be restructured, so, for example, www.company.com/category/sub-category/service would become www.company.com/service. However, although this might work on a page-by-page basis, is there a drawback to having it in place for only a few pages rather than sitewide? I'm just wondering if these pages might stick out like a sore thumb to Google.
Technical SEO | PerchDigital