Chase Lochmiller, CEO and Co-founder of Crusoe: Live Special at MCJ Summit

Chase Lochmiller is the CEO and co-founder of Crusoe. If you’re a regular listener, Crusoe isn’t new to the pod. This summer, Cody sat down with Chase’s co-founder and COO, Cully Cavness, during our live event in Austin.

This latest episode was recorded live at the inaugural MCJ Summit in San Francisco at the beautiful Autodesk Gallery. Cody and Chase dive into how Crusoe is building data centers at the intersection of AI and energy. Chase traces his path from MIT soccer captain and mountaineer to climate-focused entrepreneur, and how those experiences shaped Crusoe’s core values of preparation, curiosity, and speed.

He shares the story behind the company’s 1.2-gigawatt Abilene, TX project, its energy-first approach to powering AI infrastructure, and his vision for an era of abundant energy and intelligence. The discussion also explores the future of AI labor, grid integration, and what digital abundance could mean for society at large.

Special thanks to our MCJ Summit attendees and our kind sponsors: Autodesk Foundation, Borusan, Cedar Grove, CSC Leasing, Mitsui O.S.K. Lines, Obayashi, Palantir, and Safire Partners.

Episode recorded on Oct 15, 2025 (Published on Oct 29, 2025)


In this episode, we cover:

  • [01:14] Chase’s early love of math, science, and soccer

  • [02:42] Realizing academia moved too slow for his energy

  • [04:32] How his entrepreneurial father shaped his path

  • [05:05] Climbing Everest and the origins of “Think Like a Mountaineer”

  • [09:32] Defining Crusoe as a clean AI infrastructure company

  • [10:47] Building vertically integrated “AI factories”

  • [16:24] Inside the 1.2 GW Abilene project for OpenAI and Oracle

  • [20:52] Crusoe’s energy-first approach to compute build-outs

  • [25:36] Using AI demand to accelerate next-gen energy solutions

  • [30:24] When AI becomes a power orchestrator

  • [33:31] Digital labor and AI’s impact on GDP and society

  • [38:41] How Chase hopes Crusoe is remembered in 30 years


    Cody Simms (00:00):

    From MCJ, I'm Cody Simms, and this is Inevitable. Climate change is inevitable. It's already here, but so are the solutions shaping our future. Join us every week to learn from experts and entrepreneurs about the transition of energy and industry.

    (00:28):

    Today on Inevitable, we are live at the inaugural MCJ Summit at the beautiful Autodesk Gallery in Downtown San Francisco, and our guest is Chase Lochmiller, CEO and co-founder of Crusoe. Crusoe is a vertically integrated clean AI infrastructure company, and we have a ton to talk about. Chase, welcome.

    Chase Lochmiller  (00:49):

    Thank you.

    Cody Simms (00:51):

    So I wanted to start by really understanding you, Chase, and digging into how you ultimately built this business at Crusoe, but going in the wayback machine and maybe sharing a little bit about where you grew up and what kinds of things you were interested in as a kid. Because AI infrastructure businesses were not a thing when you were a kid. It's a new category.

    Chase Lochmiller  (01:14):

    That's right. Data centers were scarcely a thing, honestly, when I was growing up. But I grew up in Denver, Colorado. Cully, my co-founder, and I actually went to high school together in Denver at a school called Kent Denver. I think growing up I was always very interested in math and science. I was very drawn to math and math competitions. I was a very competitive kid. I played a lot of sports, loved soccer. We've got the World Cup coming here next year. I have very fond memories of the '94 World Cup and just following that closely, and that helped inspire me to just fall in love with the game of soccer. I ended up playing soccer through college, played at MIT. And actually, there are people from every chapter of my life that have been part of Crusoe, and there are a number of kids who were on the MIT soccer team who actually work at Crusoe, including our CTO Nitin.

    Cody Simms (01:57):

    Oh, no way. I didn't know that. That's neat.

    Chase Lochmiller  (01:59):

    Yeah, Nitin was our sweeper. He's a great-

    Cody Simms (02:01):

    What was your role?

    Chase Lochmiller  (02:02):

    I was a center midfielder.

    Cody Simms (02:03):

    Of course you were.

    Chase Lochmiller  (02:05):

    CEO of the soccer team. I originally went to MIT. I was convinced I was going to become a theoretical physicist and help unlock the mysteries of the universe and expand the human knowledge set. I spent a lot of time doing various research things and did some research at MIT, did some research at Los Alamos National Lab. And I think one of my realizations during that era was that basic science research just moves pretty slowly, and I just had a ton of energy. Maybe today that's changing with AI, but certainly at that point it just felt like a very slow, drawn out path. I felt like I wanted to do something faster paced.

    Cody Simms (02:42):

    Did you think originally you wanted to be in academia? Was that the path you were on?

    Chase Lochmiller  (02:46):

    Yeah. I think that was the path, academia or applied-

    Cody Simms (02:49):

    Like a lab or something.

    Chase Lochmiller  (02:49):

    Exactly. I mean, ITER had just been announced, so this big international collaboration to build this giant tokamak to hopefully have a big breakthrough in fusion. I remember my first day working at Los Alamos National Lab, and the principal investigator I was working under told me, "Chase, this is a really exciting time for fusion and plasma physics, and hopefully in 30 years, we're actually going to have fusion." And I was like, "Wow, that's pretty cool." And he's like, "Yeah. You know, they told me that when I started 30 years ago. Here we are and we're still 30 years away." That was always the joke in fusion: that it was 30 years away.

    Cody Simms (03:23):

    Hopefully we're getting closer.

    Chase Lochmiller  (03:23):

    It feels a lot closer today.

    Cody Simms (03:23):

    Hopefully we're getting closer, right?

    Chase Lochmiller  (03:27):

    Exactly, exactly. When I realized I wanted to do something faster paced, I felt like in certain ways I had lost a purpose. I was like, oh man, what am I going to do? I had my whole life set on being a theoretical physicist or [inaudible 00:03:40].

    Cody Simms (03:40):

    Was entrepreneurship a thing that was around you as a kid? Was there any influence there for you?

    Chase Lochmiller  (03:46):

    I mean, my dad was kind of an entrepreneur in certain ways. [inaudible 00:03:49].

    Cody Simms (03:49):

    I rode the bus with your dad going to the Redwood site in Reno and it was amazing. He's a character.

    Chase Lochmiller  (03:56):

    Yeah, he's a total character. He's a one-of-a-kind. He's certainly had a big influence on my life. Entrepreneur in a very different sense. And what's funny actually, I think there's this aspect where sometimes if you have a parent that has a lot of success in one vertical, you tend to be like, look, I want to make my own path, my own sense of success in something completely different.

    (04:16):

    And I felt like I was doing that for many years, building things in technology and working in quant finance for a long time, just doing things that my dad never would've done. It was just never his career path. He's been building in real estate; he's a multifamily and commercial real estate landlord, owner, and operator.

    (04:32):

    The funny part is this whole big boom in AI and all these big AI data centers that are being built, it feels like I'm coming full circle. Where it's like, okay, all of these things that I've learned in technology that have inspired how we're building out all of this AI infrastructure to power intelligence and the future economy, and yet I own a building, and I'm a landlord just like my dad.

    Cody Simms (04:53):

    Just like your dad. It's amazing. And you personally have done some pretty crazy stuff too, like back to your Colorado roots, your mountain climbing adventures. Maybe share a little bit about that and what inspired you there.

    Chase Lochmiller  (05:05):

    Sure. In middle school, I had this teacher, and her husband was this blind guy named Erik Weihenmayer. He was attempting to climb the Seven Summits, the highest peak on each continent, and he was blind. At the time, I was in seventh grade, and he was training to climb Mount Everest. David Breashears' Everest documentary had come out. Into Thin Air, the Jon Krakauer book, had just come out with the account of the disaster that had happened.

    (05:30):

    And I was like, man, that's so cool. It's the first time I'd heard of this concept of the Seven Summits. We have a lot of 14,000-foot peaks in Colorado. It's a very Colorado thing to talk about climbing 14-ers. But I think I was just kind of inspired to climb big mountains. I spent a lot of time outside, grew up going skiing a lot and spending a lot of time outdoors.

    (05:50):

    So after I graduated college and stopped playing sports very competitively, I was like, all right, what else can I do? And I became sort of an alpine mountaineering kind of hobbyist, for lack of a better word. And so I did a bunch of different expeditions all over the world. At this point, I've climbed five of the Seven Summits, including Mount Everest. I had a couple of different expeditions to Mount Everest, one in 2014, when there was a very large avalanche in the Khumbu Icefall; tragically, 16 Sherpas were killed in that accident. But for some reason, I felt compelled to go back, and ended up summiting in 2018. So I had a very successful summit, having learned a lot from the first time. Maybe eventually I'll get to the final two summits, but it's just kind of-

    Cody Simms (06:31):

    You're a little busy right now?

    Chase Lochmiller  (06:33):

    ... been de-prioritized.

    Cody Simms (06:34):

    Any of them that really stood out as the hardest challenge of any of the climbs you did?

    Chase Lochmiller  (06:39):

    Everest was this challenge because it was this crazy expedition, and there was so much preparation that went into it, and there's so much emotion as a result of just shifts and things. And it's just kind of like the world is so focused on what's happening in Everest. There's media coverage for every little thing that's going on in Mount Everest every climbing season.

    (06:59):

    So coming out of that, when we started Crusoe, one of the core company values that we created is to think like a mountaineer. Cully spent a lot of time mountaineering as well. Actually, we were climbing a mountain when we were ideating on the origins of Crusoe. The sense there is really about applying a lot of the same principles that you have in mountaineering to your life and your work at Crusoe. And it's not like everybody has to go climb mountains and be a big alpine enthusiast.

    (07:26):

    It's more about that sense of an incredible amount of preparation. Thinking through, okay, here's my plan in terms of how I'm going to get to the summit, and then I'm going to get down safely. And here are the different things that could go wrong. And if those things go wrong, here's my plan for them. Here's my plan B. Here's my plan C. Here's what I'm going to do if the weather changes. Here's what I'm going to do if I have this piece of gear fail. Here's what I'm going to do if my partner gets sick. Here's what I'm going to do if-

    Cody Simms (07:49):

    Hope for the best, plan for the worst, right?

    Chase Lochmiller  (07:50):

    Exactly. Exactly. And there's this notion in mountaineering that getting up is optional, getting down is mandatory. And we actually try to instill a lot of the core safety practices of mountaineering, really thinking through safety as a critical aspect. And in our business, where we're dealing with both hardware and software, there are millions of man-hours that have been worked in Abilene. We have to be thoughtful about safety culture so that we can build these things at scale and at speed. But also, [inaudible 00:08:16].

    Cody Simms (08:16):

    I was going to say, how do you do it and still be the fast cowboys of the space?

    Chase Lochmiller  (08:20):

    There's an element of that. Mountaineering is actually an interesting thing to study. You have these legacy styles of mountaineering that are these big expeditions where you bring everything. You almost build a whole city, and you are very, very intensely prepared. And over time, what people found to be the most successful way of climbing big peaks and breaking records is this alpine style of light and fast. Where you're basically moving quickly, you have everything you need and nothing more. And you have this robustness in your planning and you're thinking about things safely. But oftentimes, speed is actually a mechanism to be safe in the mountains, especially when you're-

    Cody Simms (08:57):

    What is it? Fast is slow. Slow is fast, right? Is that a-

    Chase Lochmiller  (09:00):

    Yeah.

    Cody Simms (09:01):

    Maybe let's fast forward to today and where Crusoe is. Crusoe's had, from where I sit, one of the most unbelievable evolution stories. And for folks who really want to dive into that, we recorded an episode with Chase's co-founder, Cully, a couple months ago in Austin. And we really dive into the path that the company has taken to get to where you are today. But with you, I really want to focus on where are you today and where are we going? So maybe describe what Crusoe is now, and the business that you are running today.

    Chase Lochmiller  (09:32):

    So Crusoe is a vertically integrated AI infrastructure business. Our goal is to help make energy and intelligence more abundant, just really accelerate this abundance of both energy and intelligence, which we think are going to be critical to uplifting humanity in the future and driving progress for everybody around the world. Now, how we got here is pretty interesting. I think we have a culture of curiosity. We just sort of are constantly asking questions about how we would do X.

    Cody Simms (09:59):

    And just quickly, sense of scale? What, 800-ish employees?

    Chase Lochmiller  (10:03):

    We actually just celebrated surpassing 1,000.

    Cody Simms (10:05):

    1,000 employees. Amazing.

    Chase Lochmiller  (10:07):

    1,000 full-time-

    Cody Simms (10:07):

    Congratulations.

    Chase Lochmiller  (10:07):

    ... employees. And then we have numerous sites where we have more than 1,000 contractors. In Abilene, we have 7,000 contractors that are working there every day. 7,000 people on site every day. But 1,000 full-time employees across a couple of offices.

    Cody Simms (10:23):

    So you were saying this culture of curiosity.

    Chase Lochmiller  (10:24):

    We've taken this very vertically integrated approach to AI infrastructure. So what that means is we're doing the hardware aspects of building the infrastructure needed to power AI and run these large-scale intelligent workloads. So this is everything from land and power development, data center design, construction, cooling, mechanical, electrical. The whole stack of the physical infrastructure to stand up AI workloads.

    (10:47):

    And then we're also in the business of deploying these large clusters of GPUs. This is our Crusoe Cloud business where we build, operate, and manage these large-scale clusters of GPUs, and integrate them with critical services like storage and high-performance networking. And then we also have these managed services to extract intelligence from that infrastructure. So things like managed inference product, managed auto clusters product. Things that basically help people create intelligent results from these massive investments that they're making in infrastructure.

    (11:19):

    We cover a lot of stuff from hardware to software. I like to talk about our business as creating these AI factories, these factories that produce intelligence: doing the hardware of building the AI factories, and then the software of how you actually utilize the AI factory to produce intelligence. And the reason we believe that it's the right moment to build a vertically integrated business is that what we're seeing unfold is this massive new category of infrastructure. And whenever you have this kind of paradigm shift, being vertically integrated enables you to move quickly into a new vertical.

    (11:56):

    And this infrastructure for AI, what I call the infrastructure of intelligence, is dramatically different than anything that we've ever seen before. It's different than traditional data center infrastructure that serves the internet and traditional cloud computing. It's different than the traditional power systems that have served our scaling to date. And a lot of that stems from the actual workload: the design of the systems, ranging from the way the clusters are built out, the way they're utilized, and the way users interface with the services, just merits a completely different design, and it's at a massive scale. So it warrants this investment in vertical integration.

    Cody Simms (12:34):

    Can you describe a little bit, just for folks who maybe aren't as familiar with AI infrastructure, the difference between training workloads and inference workloads? And are those setups substantially different for you both on the infrastructure side and on the cloud business side?

    Chase Lochmiller  (12:48):

    We are seeing a convergence, and I'll get to that in a second. So training is basically where you're setting up the model that you're going to be utilizing. You're making the investment and you're amassing all of this training data, this historical data, into your data center, and then you're setting up this large-scale model. It's going to be a tens-of-billions, hundreds-of-billions, or trillion-parameter large language model or other foundation model, depending on your application. And then the training process is basically trying to fit a large-scale non-linear statistical model to that data so that, with new inputs, it would be able to produce results that are intelligent, so to speak, and learn from that data. And so-

    Cody Simms (13:29):

    And then most of the big giant projects we're seeing today are building these training centers, for the most part?

    Chase Lochmiller  (13:35):

    And this is where I was saying it's changing and it's converging. So there's this notion of pre-training, which is basically trying to create structure from all of this unstructured data. That's what historically produced the original GPT models, so GPT-1, GPT-2, GPT-3. Where we're seeing massive compute scaling is actually in post-training and in test-time compute scaling. You have this foundational model that has been pre-trained, and then when you're serving a workload... OpenAI released the o1 model about a year ago, and there's since been a lot of innovation on these chain-of-thought reasoning models that produce results, and then take those results and feed them back into the model. And what it ends up doing is a bit like training on new data; it's sort of training at inference time. It's called test-time compute scaling, and it's where you're basically running a lot of passes through your neural network when you're actually trying to produce an inference result.

    Cody Simms (14:34):

    Could that in theory start happening at the edge more then?

    Chase Lochmiller  (14:37):

    It can, but it also needs a ton of compute. And the benefit of doing it at the edge is you actually have lower latency to the end consumer. But if you're doing a significant amount of test-time compute scaling, you're thinking about the problem for a long period of time. If you're thinking about it for even a second, oftentimes you're thinking about it for many seconds or minutes, or if you've tried deep research or some of these other things, they just require a lot of time to produce a result because they're thinking, they're thinking about what the result's going to be.

    Cody Simms (15:07):

    So they're training themselves in real time as they're inferring their answers.

    Chase Lochmiller  (15:11):

    Exactly. But if you're at the edge, it actually doesn't matter. It doesn't benefit you. So you might as well be in these big centralized locations. Now, that's for one category of results. There are definitely things that can be done very quickly at the edge, and we believe in this bimodal distribution of infrastructure that's going to require both these massive-scale centralized facilities, like what we're doing in Abilene and some of these other campuses that we're building, as well as widely dispersed inference infrastructure everywhere in the world.

    Cody Simms (15:45):

    I think of inference as basically being the edge serving layer of the queries. Is that a very basic way of thinking about it?

    Chase Lochmiller  (15:51):

    I think one way of thinking about a large language model is that it's kind of like a database. It's like a statistical database in certain ways. Instead of looking up a row and a column in a database, you're looking up information that maps to a row that may not exist, so you're statistically inferring from that database. So in a lot of ways, inference is looking up something in a database, but it's a statistical database.

    Cody Simms (16:16):

    So maybe just to clarify what you have today, describe the Lancium project now, where it sits, and ultimately what it is going to be.

    Chase Lochmiller  (16:24):

    We've always sort of taken this energy first approach to developing compute infrastructure, from when we had a Bitcoin mining business to the early days of scaling our AI infrastructure business, to today where we're building and operating gigawatts of capacity to support these large scale global workloads.

    (16:40):

    A bit of the history of Lancium is that I've known the CEO of Lancium, this guy, Michael McNamara, for five, six years. He initially started building this project as a Bitcoin mining site. And it's funny to track the arc of Bitcoin mining infrastructure and compare that to the arc of AI infrastructure because I think they're following kind of similar paths. They're going to end up in different places, but it is an interesting comparison to make.

    (17:04):

    But Bitcoin has been notorious for consuming a tremendous amount of energy, which led a lot of Bitcoin miners to go to areas where they could access abundant, low-cost energy. So Michael had started working on the Abilene site partially because there was an abundant amount of energy there. And when he started telling me about it, it was something that had always been on my radar as a very interesting market to pursue to access large-scale clean energy solutions.

    (17:32):

    And what had happened there is, on the back of production tax credits, a lot of wind developers had built out these large-scale wind projects in that region. It's a naturally very windy region, so you get very, very high capacity factors for a lot of the wind farms there. Now, there didn't exist the transmission infrastructure or the load to consume all of the power generation that's been installed there. And so what's resulted is that power prices are frequently negative. There's just too much power being generated. And people are getting these incentives, these production tax credits, to the point where they're willing to sell power at a negative price, because the production tax credit is bigger than the negative price they're paying for power. Once the production tax credit incentives roll off after 10 years, they're a merchant generator, and they're in a position where they're actually having to curtail. They could be producing power, but they're not, because there's no marginal buyer for the power.
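
    A rough back-of-the-envelope on the economics Chase describes here, using an illustrative, assumed PTC value rather than a figure from the episode:

        net revenue per MWh ≈ market price + PTC
        during the PTC window (assume PTC ≈ $27/MWh):  −$10 + $27 = +$17/MWh → keep generating even at negative prices
        after the 10-year PTC rolls off:               −$10 + $0  = −$10/MWh → curtail instead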

    Cody Simms (18:24):

    So when you see the windmills out in the field just standing still.

    Chase Lochmiller  (18:26):

    Yeah. And you're like, wow, it's windy today, why isn't this thing spinning? It's because it's-

    Cody Simms (18:30):

    There's no one on the other end.

    Chase Lochmiller  (18:31):

    ... [inaudible 00:18:32] curtailment. Exactly. And this seemed like an incredible opportunity for us, at the early, early end of this energy-hungry demand coming from AI. And due to the fact that AI, even on the inference side, is more tolerant of latency than traditional web applications, we could really build it anywhere. That led us to Abilene as this great market that had low-cost, abundant clean energy. And today we're building a 1.2-gigawatt campus. And that's supporting some of the largest, most important workloads in the world for Oracle and OpenAI.

    (19:03):

    And we've been energizing that incrementally, and been standing it up at sort of a record pace. We first broke ground on the facility in June of 2024. Quick story. There was originally like an RFP that went out for the site. And the fastest anybody had committed to build 100 megawatts of capacity was two and a half years. I was talking to someone, and they were like, "Could you do this in 12 months?" And I was like, "Yeah, for sure. Could definitely do that." Having done no research on it. But was thinking about it with my team and we brought together a-

    Cody Simms (19:33):

    Going up is optional, going down is required, right? There we go.

    Chase Lochmiller  (19:37):

    Exactly. But we ended up delivering the first 200 megawatt buildings in 11 months. And this comes back to the sense of curiosity that we built at Crusoe where we just challenged every single way of traditionally building data centers. And said, wait, why do we have to do that? Why is it designed this way? Why aren't we designing it this way? It's a giant cluster of GPUs. It should be designed this way. Why do we have to power it that way? Why can't we energize it this way?

    (19:58):

    And just asking those questions of why, why, why, why, why? And then when you get to the answers like, oh, well, that's sort of the way the industry's always done it, you're like, oh, okay, well that's stupid. Let's do it this other way. This other way makes more sense. And we were able to save an incredible amount of time and really able to accelerate this infrastructure at a record pace in Abilene.

    Cody Simms (20:16):

    Not that we shouldn't dwell on the size and scale and amazingness of that particular project, because it's been relatively all-consuming for you all for a while now, but what's next? It seems like there are many more of these being contemplated or signed with ink and starting to be developed.

    Chase Lochmiller  (20:33):

    We have a couple of other projects that are under development, some that we can talk about more publicly, others that we're just less public about. They all have a similar philosophy, though, which is taking this energy-first approach to developing the compute infrastructure. Going to places where we know we can access this low-cost, abundant energy.

    (20:52):

    So there are a couple of projects where we're working directly with IPPs and we're actually a behind-the-meter offtake for them. In the case of, say, a wind farm, where we actually have a behind-the-meter power purchase agreement with the on-site wind generation, we'll use the same interconnection, the same substation that they're using to sell power into the grid. We use that to buy power from them, as well as to buy power from the grid when the wind's not blowing and the sun's not shining.

    (21:17):

    Our perspective in all this is that if you look at energy infrastructure today, if you look at data center infrastructure today, they're largely saturated in terms of production and demand. And if you look at the ambitions of AI, and you look at the ambitions of what people want to build and how they want to scale it and how they want to utilize it, it's going to require all net new infrastructure. And as a result, that requires us to think about how we actually bring online the new power infrastructure, both on the generation, storage, and distribution side, and then how we bring online the net new data center infrastructure to support these mega clusters of advanced AI accelerators and GPUs.

    (21:56):

    We have a campus where we're planning to build wind, solar, batteries, gas, and have a grid interconnection. And all of those things combined are actually more generation capacity than what we're using in the data center on a day-to-day basis, but we have to engineer it for overall peak demand. This is where I get back to this notion of creating this abundance of energy. We can create an abundance of energy by virtue of the fact that we need to almost oversize our infrastructure relative to our demand, and that extra can lead to lower-cost power for everyone.

    Yin (22:27):

    Hey everyone. I'm Yin, a partner at MCJ, here to take a quick minute to tell you about the MCJ Collective Membership. Globally, startups are rewriting industries to be cleaner, more profitable, and more secure. And at MCJ, we recognize that a rapidly changing business landscape requires a workforce that can adapt. MCJ Collective is a vetted member network for tech and industry leaders who are building, working for, or advising on solutions that can address the transition of energy and industry.

    (22:58):

    MCJ Collective connects members with one another with MCJ's portfolio and our broader network. We do this through a powerful member hub, timely introductions, curated events, and a unique talent matchmaking system and opportunities to learn from peers and podcast guests. We started in 2019, and have grown to thousands of members globally. If you want to learn more, head over to mcj.vc, and click the membership tab at the top. Thanks and enjoy the rest of the show.

    Cody Simms (23:28):

    Do you see a world, and you were talking about behind the meter and off-grid, where data center power demand, and this may be decades from now, becomes the primary power consumption source, whether in the United States or in the world, and you go from a world where utilities are the gatekeepers of power to the data centers actually being the drivers of power production and distribution?

    Chase Lochmiller  (23:53):

    Yeah. I mean the silver lining in all this is the people making these investments are the companies with the biggest balance sheets and the best balance sheets and greatest cash flows in the history of business. They're very well capitalized to make these investments. I think you're already seeing it today. I think Northern Virginia, I think data centers, I don't know, I heard some statistic, you'd have to fact check me on this, but I think it's like 40 some percent of power in that region is being consumed by data centers. Not unreasonable to think that will manifest in many other markets where data centers are going to be built to support large scale intelligent workloads. And I think it's very natural.

    Cody Simms (24:27):

    I mean, does it become at the beginning almost a shadow grid of these off-grid power sources, but eventually becomes the feeder to how everybody else accesses power? Do you think it's that macro?

    Chase Lochmiller  (24:38):

    Maybe one way of thinking about it is that data centers can be co-located with large generation resources and be at, I don't know, the origin of the power production. It's like, in order to bring all this infrastructure online, we need more power, so we're just going to do it. And then whatever excess is available, everybody's commercial about it: how are we going to monetize that? How are we going to bring down the overall cost of the data center infrastructure? We're going to sell it in some downstream capacity. We're going to get it to some other market where we can do something useful with it. So I think that's definitely a trickle-down effect of these huge investments that are being made in the infrastructure.

    (25:13):

    One other thing I would just point out is that, because of the scale of these projects, both in terms of megawatts, gigawatts, as well as capital investment and the types of capital being invested, I think it is one of the greatest opportunities ever for advancing next gen energy solutions, climate technology, new battery solutions, new energy production solutions.

    (25:36):

    Another project I'll talk about is in Wyoming, where we've tried to think very thoughtfully and from first principles about our energy production, and we're looking at pioneering what would be, I think, the largest post-combustion carbon capture and sequestration system in the world. It's a carbon hub that's been developed by our partner Tallgrass. We're planning on bringing a lot of gas infrastructure online to support initially a 1.8-gigawatt facility for AI workloads.

    (25:59):

    But one of the benefits is there are four Class VI wells that have already been permitted. And there's existing carbon infrastructure where you can actually capture the post-combustion CO2 and permanently sequester it underground, and benefit from the incentives in 45Q. But I think a lot of these things have been discussed or done at a pilot scale, but never at a massive scale. And part of it is because there hasn't necessarily been the capital invested to make it a reality. What AI infrastructure presents-

    Cody Simms (26:24):

    And it maybe hasn't penciled for power that's just flowing into the grid, but you have potentially energy buyers who are willing to pay a different price for fast access to power and fast access to clean power, right?

    Chase Lochmiller  (26:37):

    Exactly. And I think a number of these customers are looking at this saying, well, okay, if I produce power from gas, that's great. I can get my power quickly. But our sustainability goals are over here, and if we do that, we're going to increase our carbon emissions. We're going to have to go buy carbon removal credits. What's the price of those? Okay, it's pretty expensive to buy that amount of carbon removal credits. What if we just paid for post-combustion carbon capture and sequestration and got the 45Q credits? Is that cheaper? And I think in a lot of cases, the answer is yes.

    Cody Simms (27:06):

    Talking about the speed at which all of this needs to get built, there are a lot of companies, not just infrastructure companies, but the biggest AI companies in the world who have now raised so much money at these skyscraper valuations, and there's going to be pressure for them to start delivering from a revenue perspective, both on their own valuations, but also they're ultimately the patron of these data centers that you're building out. And so to some extent, the economics there need to start penciling for them too, in terms of the cost and the amount of infrastructure they're building out to support their businesses. If that doesn't happen or happens more slowly than they're expecting or their investors are expecting, do things start to cascade quickly in terms of the public markets?

    Chase Lochmiller  (27:48):

    Definitely a possibility. I don't want to rule anything out. But I think a lot of the big AI labs are seeing just incredible revenue growth and adoption of these services that are being built out. The optimist in me believes that we are going to hit these growth trajectories because the models are getting so good. If anybody hasn't played with them, I encourage them to use a lot of these new services that are being built. I mean, what they can accomplish is incredible. And if you compare it to what you would be paying a software engineer, or something like a call center, what you're spending on a lot of these human labor aspects, it's a massive cost saver. So that's the optimist lens in all this.

    (28:25):

    If people don't hit their growth targets on revenue, then how are we going to pay for all this infrastructure? In any major bubble, there's always the question of what leads to the unwind or what leads to the crash. Oftentimes it is irresponsible amounts of leverage. And we are seeing signs of some of that today. It's not everywhere, and not all leverage is bad. But we are seeing a lot of debt capital pouring into the space, because how else are you going to pay-

    Cody Simms (28:54):

    Pay for it.

    Chase Lochmiller  (28:54):

    ... for a lot of this? When you're talking about spending hundreds of billions or trillions of dollars to build out this infrastructure. This is something that we are very focused on at Crusoe in terms of understanding counterparty risk and understanding what real risks we're taking and the real downside scenarios. We try to think through really bad things happening. That's part of thinking like a mountaineer, right? Thinking about the robustness in the business.

    (29:15):

    In a lot of the bigger, higher-leverage situations that we're undertaking, we typically have investment-grade counterparty risk for a very long duration. Even in an economic downturn, or even if AI doesn't turn out to be everything that it's cracked up to be, is Oracle still going to be around? I believe they will be. The financial markets still believe they will be. They have a great enterprise software business and database business. Still a great moneymaker for them. The same could be said for Microsoft, for Amazon, for Google, for Meta, for all these companies that are still producing incredible free cash flows, independent of anything that's happening in AI. So we try to underwrite the risk through that lens.

    Cody Simms (29:53):

    Shifting a little bit into where you see AI going, let's start with the energy markets, and then go bigger than that. How far away are we from AI as an actual power orchestrator, in terms of knowing when to buy curtailed wind and power a data center with it, or not? Knowing, if a data center doesn't have to have 99.999% uptime, when it shouldn't need to be up, and what you can do with that excess power in terms of sending it back to the grid, for example? How far away is that from reality?

    Chase Lochmiller  (30:24):

    Depending on who you ask, both closer and further than one might think. The tricky part in adoption here is trying to get all of the players to coordinate on solutions. We sort of know the answer to a lot of these problems. I think we have the technology to do it. I mean, there are great things happening across the space. You look at companies like Emerald AI that are trying to build demand response, essentially, for large-scale data center loads, and working with the utilities to help curtail during some of the most challenging peak hours. You look at the infrastructure, and how we build smart infrastructure that can help with the transmission and distribution of power. I look at a company like Heron Power, started by Drew Baglino, that's reinventing a lot of the critical electrical stack from high voltage to low voltage, and doing that with power electronics.

    (31:10):

    There's a lot of stagnation in the space, where not much innovation has been done for a century plus. There are a lot of interesting new ways that we can use power electronics to actually do this smarter, faster, cheaper. I think we have the tools, and I think also thinking through things like two-way power systems. Where people have generation, people have load, and people have storage in their homes. You have cars driving around everywhere that have giant batteries on them. They can be these giant sources of power in moments of peak demand. But it requires this massive orchestration effort, and it requires buy-in from utilities and a lot of the other counterparties throughout the entire ecosystem to overall make it happen. So I think the bottleneck to adoption is really getting utility and regulatory buy-in, not like-

    Cody Simms (31:58):

    So it sounds like it's much more-

    Chase Lochmiller  (32:00):

    ... do we know how to do it?

    Cody Simms (32:00):

    Yeah. And it sounds like it's much more likely to be a gradual move toward it until we're ready for all at once, as opposed to, oh, the software's ready, so it's an all-at-once switch.

    Chase Lochmiller  (32:10):

    But I mean, you're seeing signs of progress here. I mean, look at things like VPP adoption with Tesla Powerwalls; they've built out this whole infrastructure of distributed Powerwalls in people's homes. And it's been helpful in terms of serving peak events here in California. You wouldn't think of California when you think fast, nimble government agencies. But you're seeing similar things with what Base Power's doing in Texas. I love all the work that they're doing in terms of trying to create distributed storage, distributed generation, distributed mechanisms to basically support grid resilience and utility.

    Cody Simms (32:44):

    If we take a big step back and look at everything that Crusoe is building and helping to enable, this is more than just a technology revolution. This is ultimately societal and cultural, as we think about a world where we actually are moving toward artificial general intelligence. Your business is very much at the forefront of enabling that to happen. And so I'm sure you've thought a lot about what the world looks like in 20 years, 30 years. What does the job market look like? What do misinformation, trust, and media look like? What kind of life are kids going to have in 20 years? How are you thinking about the future of AI, both the pros and the potential negatives of it, as it starts to become pervasive in our lives?

    Chase Lochmiller  (33:31):

    Maybe I'll talk about my perspective on a lot of the growth and impact of AI. And then I'm a dad too, I have three young kids, so I'll get to how I'm thinking about setting them up for success for this next chapter of abundant intelligence.

    (33:43):

    I think, number one, people call things AGI, ASI, artificial superintelligence; there are all these acronyms. And ultimately the way I view it is, okay, can we do something with a computer that exceeds, or has some cost optimization compared to, what we would do with a human? And in a lot of ways, I think about that as digital labor. And so the key thing that I think about, in terms of whether this boom is going to have real legs in transforming human prosperity and the human experience, whether this is going to be good for people, really comes down to: is this going to lead to significant GDP growth per capita? That's the main thing that I think about.

    (34:24):

    And if you go back to your Macroeconomics 101 class, there are three main things that lead to growth in GDP. It's the change in capital in the economy, all sorts of capital, the change in labor in the economy, and then the change in technology. Those are the three main things that cause increases in GDP. Historically, technology's been a hard thing to measure. So you measure these other two things, and then, since technology is unobserved, you measure it as the difference.

    (34:52):

    But what's fascinating about this boom is that, with AI, we've created a new way to create more labor. That change in labor can be way bigger than it's ever been in the history of humanity, because we can make digital labor, we can make labor in silicon, and that's going to lead to incredible outcomes. And so my belief is that this is going to lead to an era of increased leisure time for people. People are going to be able to do far more with less than they've ever been able to do. And for those that are incredibly high-agency, people are going to be able to do more with their fingertips. You're going to have access to a workforce of thousands of people if you have the agency to just go engage with it and be curious about how you actually use this technology to will your dreams into existence.
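
    In growth-accounting terms, measuring technology as "the difference" is the standard Solow residual; this is a textbook identity, not a formula quoted in the episode. With output Y, capital K, labor L, technology level A, and capital share α:

        ΔY/Y ≈ ΔA/A + α·(ΔK/K) + (1 − α)·(ΔL/L)

    Capital and labor growth are measured directly, and the unobserved technology term ΔA/A is backed out as whatever output growth the other two terms don't explain. On Chase's framing, digital labor shows up in the ΔL/L term, at a scale that hiring humans alone couldn't reach.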

    Cody Simms (35:40):

    How does it impact those who don't have as much access to it? Do you think they get more left behind in that economy?

    Chase Lochmiller  (35:45):

    I think it's the greatest equalizer. It's like, how easy is it to access? If you have a phone, if you have access to an internet connection, the cost of-

    Cody Simms (35:53):

    I guess just like anybody can become a star on YouTube today from anywhere in the world, right?

    Chase Lochmiller  (35:58):

    Yeah. It's like the cost of intelligence is going to converge to the cost of energy. The cost of compute is going to come down over the course of time, and that compute is the equivalent of intelligence. And people are going to be able to access these things very freely, not just in developing-world economies, but all over the world. I think having that access at your fingertips, certainly there are going to be places that adopt it faster, but I think it's the greatest equalizer ever in history. You don't need to have an incredible teacher who happens to be born in your town to educate you on some specific topic or inspire you to do something. AI tutoring can be literally unbelievably effective. I mean, you're seeing this with [inaudible 00:36:41].

    Cody Simms (36:40):

    I mean, it also means anyone in the world could become good at driving deepfakes and videos and propaganda and controlling the narrative too. And so do you think an infrastructure company like Crusoe should have some say in what guardrails we put around AI? How does that-

    Chase Lochmiller  (36:56):

    We don't really view our role as being the hall monitors of what should happen in the space. We really view it as: more intelligence in the hands of more people, we think, is just going to benefit humanity. And we're really just rooting for the whole space. We're not trying to say you should do this, you shouldn't do this. We think most of the applications that we're seeing are just generally positive things for people. I know there have been some criticisms around some of the AI-generated video content. It's like, okay, this seems crazy.

    Cody Simms (37:26):

    It's insane. Has anyone played with Sora 2? It's insane. It's amazing.

    Chase Lochmiller  (37:31):

    It is amazing. From my perspective, I think it's going to completely transform storytelling in terms of what it takes to build a feature-length film, just from a budget standpoint, from a creativity standpoint. You're going to end up with more ideas out there that get built, because the barrier to entry for those big ideas is much smaller. I'm very optimistic about the overall video-generation content platforms. With my own kids, I'm really trying to hone in on developing a sense of high agency and high autonomy, high curiosity. We send our kids to a Montessori school, so it's very much about creating this sense of independent thought. And the reason for that is, I sort of talked about this earlier, in a future where every person has access to a 100,000-person workforce at their fingertips, that's great if you have the agency to-

    Cody Simms (38:22):

    Can we all be generalists? Can the world work if we're all generalists, do you think?

    Chase Lochmiller  (38:26):

    You're a general of your own domain. And I think there's going to be different layers to this and different ways that humans are curious in terms of using that to their benefit. And it creates unbelievable opportunity for people to build incredible things that benefit others.

    Cody Simms (38:41):

    I think we're probably wrapping up here. I guess the last question I would ask is how do you want Crusoe to be remembered 30 years from now? You are absolutely at the core of building out this wave of massive change that is hitting humanity right now. Whether humanity wants it or not, it is inevitable. And how do you want your company and yourself to be remembered as part of being there for it?

    Chase Lochmiller  (39:05):

    What I'd like Crusoe to be remembered for is really being one of the platforms that help orchestrate intelligence for the economy, and really sort of standing up the infrastructure, the energy solutions, the data center solutions, the computing solutions that really enabled intelligence to scale.

    Cody Simms (39:22):

    Chase, thank you so much. Amazing.

    Chase Lochmiller  (39:24):

    Thank you.

    Cody Simms (39:24):

    Appreciate you.

    Chase Lochmiller  (39:25):

    Thanks for the time.

    Cody Simms (39:31):

    Inevitable is an MCJ podcast. At MCJ, we back founders driving the transition of energy and industry, and solving the inevitable impacts of climate change. If you'd like to learn more about MCJ, visit us at mcj.vc, and subscribe to our weekly newsletter at newsletter.mcj.vc. Thanks and see you next episode.
