What is the best tool to crawl a site with millions of pages?
-
I want to crawl a site that has so many pages that Xenu and Screaming Frog keep crashing at some point after 200,000 pages.
What tools will allow me to crawl a site with millions of pages without crashing?
-
Don't forget to exclude pages that don't contain the information you are looking for: query parameters that just result in duplicate content, system files, and so on. That may help to bring the page count down.
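To make that concrete, here is a small sketch of that kind of filtering, assuming Python; the parameter and extension lists are illustrative placeholders, not something from this thread.

```python
# Sketch: drop query parameters that only create duplicate content,
# and skip non-HTML system files before they enter the crawl queue.
from urllib.parse import urlparse, urlencode, parse_qsl, urlunparse

# Hypothetical lists - tune these to the site being crawled
DUPLICATE_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}
SKIP_EXTENSIONS = (".pdf", ".jpg", ".png", ".css", ".js", ".zip")

def normalize(url):
    """Strip query parameters that only produce duplicate versions of a page."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in DUPLICATE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

def should_crawl(url):
    """Skip obvious non-content URLs such as images, scripts, and downloads."""
    return not urlparse(url).path.lower().endswith(SKIP_EXTENSIONS)

print(normalize("https://www.example.com/shoes?sort=price&page=2"))
# -> https://www.example.com/shoes?page=2
print(should_crawl("https://www.example.com/files/brochure.pdf"))
# -> False
```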
-
Only basic stuff: URL, Title, Description, and a few HTML elements.
I am aware that building a crawler would be fairly easy, but is there one out there that already does it without consuming too many resources?
-
For what purpose do you want to crawl the site?
A web crawler isn't really hard to write; you can probably put one together in about 100 lines of code. The question, of course, is: what do you want out of the crawl?
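In that spirit, here is a minimal sketch of such a crawler, assuming Python 3 with the `requests` and `beautifulsoup4` packages; the seed URL, output file name, and page limit are placeholders. The main idea is that each row is written straight to disk and only URL hashes are held in memory, which is roughly where in-memory desktop crawlers run out of headroom.

```python
# A minimal sketch, not a production crawler: grabs URL, title, and meta
# description, streams rows to a CSV, and keeps only URL hashes in RAM.
import csv
import hashlib
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # hypothetical seed URL

def crawl(start_url, out_path="crawl.csv", max_pages=2_000_000):
    allowed_host = urlparse(start_url).netloc           # stay on the one site
    frontier = deque([start_url])                       # simple FIFO queue
    seen = {hashlib.md5(start_url.encode()).hexdigest()}
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "title", "description"])
        while frontier and max_pages > 0:
            url = frontier.popleft()
            try:
                # A polite crawler would also throttle and honour robots.txt
                resp = requests.get(url, timeout=10)
            except requests.RequestException:
                continue
            if "text/html" not in resp.headers.get("Content-Type", ""):
                continue
            soup = BeautifulSoup(resp.text, "html.parser")
            title = soup.title.get_text(strip=True) if soup.title else ""
            meta = soup.find("meta", attrs={"name": "description"})
            description = meta.get("content", "") if meta else ""
            writer.writerow([url, title, description])  # written straight to disk
            max_pages -= 1
            for link in soup.find_all("a", href=True):
                # Resolve relative links and drop #fragments
                nxt, _ = urldefrag(urljoin(url, link["href"]))
                if urlparse(nxt).netloc != allowed_host:
                    continue
                key = hashlib.md5(nxt.encode()).hexdigest()
                if key not in seen:
                    seen.add(key)
                    frontier.append(nxt)

if __name__ == "__main__":
    crawl(START_URL)
```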
Related Questions
-
Redirecting an Entire Site to a Page on Another Site?
So I have a site that I want to shut down, http://vowrenewalsmaui.com, and redirect to a dedicated Vow Renewals page I am making on this site: https://simplemauiwedding.net. My main question: I don't want to lose all the authority of the pages, and to my knowledge, if I just redirect the site using my domain registrar's 301 redirect, it will only redirect the main URL, not all of the supporting pages. How do I avoid losing all the authority of the supporting pages while still shutting down the site and closing down my site builder? I know that if I leave the site up I can redirect all of the individual pages to corresponding pages on the other site, but I want to be done with it. Just trying to figure out if there is a better way than I know of. The domain is hosted through GoDaddy.
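For what it's worth, a host-level rule (rather than the registrar's forwarding) is the usual way to catch every path. A minimal sketch, assuming the old domain still points at an Apache server with mod_rewrite, and with /vow-renewals/ standing in as a hypothetical target URL on the new site:

```apache
# .htaccess on vowrenewalsmaui.com (assumes Apache with mod_rewrite enabled)
RewriteEngine On
# Optional: map a key old URL to its closest equivalent (hypothetical paths)
RewriteRule ^gallery/?$ https://simplemauiwedding.net/vow-renewals/gallery/ [R=301,L]
# Catch-all: every other URL on the old domain 301s to the Vow Renewals page
RewriteRule ^ https://simplemauiwedding.net/vow-renewals/ [R=301,L]
```

This only works while the old domain resolves to a server you control; once the site builder is gone, pointing the domain's DNS at a small host that serves just these rules is a common workaround.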
Intermediate & Advanced SEO -
Old site penalised, we moved: Shall we cut loose from the old site? It's currently 301'd to the new site.
Hi, we had a site with many bad links pointing to it (.co.uk). It was knocked from the SERPs. We tried manually asking webmasters to remove links, then submitted a disavow file and a reconsideration request. We moved the site to a new URL (.com) about a year ago, as the company still needed its customers to find them, and we 301 redirected the .co.uk to the .com. There are still lots of bad links pointing to the .co.uk. The questions are: #1 Do we stop the 301 redirect from .co.uk to .com now? The .co.uk is not showing in the rankings. We could have a basic holding page on the .co.uk saying 'we have moved' (no link), or just switch it off. #2 If we keep the .co.uk 301 to the .com, should we upload the disavow to the .com's Webmaster Tools or the .co.uk's? I ask this because someone else uploaded the .co.uk's disavow list of spam links to the .com's Webmaster Tools. Is this bad? Thanks in advance for any advice or insight!
Intermediate & Advanced SEO -
Robots.txt - Do I block bots from crawling the non-www version if I use www.site.com?
My site is set up at http://www.site.com and I have it redirected from non-www to www in the .htaccess file. My question is: what should my robots.txt file look like for the non-www site? Do you block robots from crawling the site like this, or do you leave it blank?
User-agent: *
Disallow: /
Sitemap: http://www.morganlindsayphotography.com/sitemap.xml
Sitemap: http://www.morganlindsayphotography.com/video-sitemap.xml
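For reference, a sketch of the redirect this question describes, assuming Apache with mod_rewrite and using the domain from the sitemap URLs above. Because the rule covers every path, a request for the non-www /robots.txt is forwarded too, so crawlers end up reading the www file:

```apache
# .htaccess - send every request on the non-www host to www with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^morganlindsayphotography\.com$ [NC]
RewriteRule ^(.*)$ http://www.morganlindsayphotography.com/$1 [R=301,L]
```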
Intermediate & Advanced SEO -
Adding hreflang tags - better on each page, or the site map?
Hello, I am wondering if there is a preference for where to add hreflang tags (from this article). My client just changed their site from gTLDs to ccTLDs, and a few sites have taken a pretty big traffic hit. One issue is definitely the number of redirects to the page, but I am also going to work with the developer to add hreflang tags. My question is: is it better to add them to the head of each page, or the sitemap, or both, or something else? Any other thoughts are appreciated. Our Australia site, which was at least findable on Google Australia before this relaunch, is not showing up, even when you search the company name directly. Thanks! Lauryn
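For reference, hreflang annotations can live either in each page's head or in the XML sitemap; a sketch of both, using hypothetical example.com (gTLD) and example.com.au (ccTLD) URLs:

```html
<!-- Option 1: <link> elements in the <head> of each page; every page lists
     itself plus all of its language/region alternates -->
<link rel="alternate" hreflang="en-us" href="https://example.com/weddings/" />
<link rel="alternate" hreflang="en-au" href="https://example.com.au/weddings/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/weddings/" />
```

```xml
<!-- Option 2: the same annotations carried in the XML sitemap instead -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com.au/weddings/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/weddings/" />
    <xhtml:link rel="alternate" hreflang="en-au" href="https://example.com.au/weddings/" />
  </url>
</urlset>
```

Whichever placement is used, the annotations need to be reciprocal: each URL should reference all of its alternates, including itself.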
Intermediate & Advanced SEO -
Google Webmaster Tools showing "no data available" for links to site - why?
In my Google Webmaster Tools account I'm seeing all the data in the other categories except links to my site. When I click "links to my site" I get a "no data available" message. Does anyone know why this is happening, and if so, what to do to fix it? Thanks.
Intermediate & Advanced SEO -
How To Best Close An eCommerce Site?
We're closing down one of our eCommerce sites. What is the best approach to do this? The site has a modest link profile (it's a young site). It does have a run-of-site link to the parent site. It also has a couple hundred email subscribers and established accounts. Is there a gradual way to do this? How do I treat the subscribers and account holders? The impact won't be great, but I want to minimize collateral damage as much as possible. Thanks.
Intermediate & Advanced SEO -
How to link back to our main site from landing pages without getting penalized
I work for a small family insurance agency in CA and I am trying to learn how to compete in this extremely competitive industry. One of the ideas we had was to purchase all the long-tail keyword URLs we could and use them as landing pages to direct traffic back to our primary site (e.g. autoinsurancecity.com). Our thought was that we could put landing pages on each that looked almost identical to the main page and use the navigation on the landing pages as links to direct traffic to the applicable category pages on the main site (e.g. autoinsurancecity.com -> mainpage.com/auto-insurance). My concern is that I want to make sure we don't tick off Google. Implementing this strategy would result in each of the category pages getting lots of links from the landing page navigation very quickly. I don't think the links will be worth much from an SEO perspective, but I don't want them to look like spam either. Any suggestions on whether this sort of tactic would put us at risk of being penalized? If so, does anyone have any suggestions on a better way to implement a strategy like this? Thank you in advance for the help! I'm totally new to this and any advice goes a long way!
Intermediate & Advanced SEO -
How Fast Is Too Fast to Increase Page Volume of Your Site
I am working on a project that is basically a site that lists apartments for rent (similar to apartments.com or rent.com). We want to add a bunch of amenity pages, price pages, etc., basically increasing the page count on the site and helping users have more pages relevant to their searches and long-tail phrases. So, for example, "Denver apartments with a pool" would be one page and "Seattle apartments under 900" would be another page, etc. By doing this we will take the site from about 14,000 pages or so to over 2 million by the time we add a list of amenities for every city in the US. My question is: should I worry about releasing them over time? Meaning, do you think we would get penalized for launching that many pages overnight, or over the course of a week? How fast is too fast to increase the content on your site? The site is about a year old and we are not trying to scam anything, just looking at site functionality and page volume. Any advice?
Intermediate & Advanced SEO