About the episode
This year, American tech companies will spend $300 billion to $400 billion on artificial intelligence, which, in nominal dollars, is more than any group of companies has ever spent to do anything. Notably, these companies are not remotely close to earning $400 billion on artificial intelligence.
That’s why you’re starting to hear some people wonder whether the AI build-out is turning into the mother of all economic bubbles.
The prospect of an AI bubble should scare us. Roughly half of last quarter’s GDP growth came from infrastructure spending on AI, and more than half of stock market appreciation in the last few years has come from companies associated with AI. If the AI spending project blows up in the next few years, as our next guest says it might, the implications for technology, the economy, and politics would be immense.
Paul Kedrosky is an investor and writer. Today we talk about the AI capex boom: how it works, who's financing it, and how that financing is structured. We put the AI build-out in historical context. And then we spend a great deal of time walking through what could go wrong and when it might go wrong.
If you have questions, observations, or ideas for future episodes, email us at PlainEnglish@Spotify.com.
In the following excerpt, Derek and Paul Kedrosky break down the huge amount of money the U.S. is currently investing in artificial intelligence and how that money is being divided.
Derek Thompson: Before we start, who are you? What do you do?
Paul Kedrosky: Yeah, that’s a good question. So I have a couple of day jobs. One day job is I’m a partner with a venture capital firm called SK Ventures, where we’re mostly doing early-stage investing, which is to say high failure rate, low capital, most things break. I’m also a fellow at the MIT Center for the Digital Economy, which is closer to the spirit of some of the things we’re talking about here. And then I have a newsletter that goes out to a bunch of hedge funds and buy-side firms and things like that. Way back when, my background was on the sell side; I worked for a brokerage firm, and I’ve just never been able to shake that. So I can’t help myself. Sometimes I want to give them advice whether they like it or not, and so I still do a lot of work with a bunch of hedge funds and buy-side firms, which takes us back to data centers and AI and blah, blah, blah.
Thompson: Well, you should know your newsletter doesn’t just go out to hedge funds. It also goes out to podcast hosts, which is one reason—
Kedrosky: I’ve heard.
Thompson: —why you’re on this show.
Kedrosky: I’ve heard. Yes.
Thompson: One thing I find so interesting about your analysis is that artificial intelligence is sometimes talked about as being the technology of the future. And I’m trying to ring the bell very loudly that AI is the most important economic phenomenon of the present. It is here, it’s happening right now, and you’ve been sounding the alarm maybe more than just about anybody, or more effectively than just about anybody, on just how massive U.S. investment in artificial intelligence is by historical standards. So why don’t you just start with your thesis statement? How big is this?
Kedrosky: So yeah, maybe can I go back and tell a quick backstory here first? Just because what got me interested was what you’re describing, which is there’s a huge amount of money being deployed. It’s going to a very narrow set of recipients, some of these chip firms and others, and it’s going to some really small geographies like Northern Virginia. So it’s an incredibly concentrated pool of capital, and yet it’s so large that when you do the aggregating and do the math, it seems to be large enough to affect GDP. So I was saying, “OK, fine. This is crazy. I should do the math.” So I did the math, and I found out that in the first half of this year, the data center–related spending— so spending on these giant buildings full of GPUs and racks and servers and what have you that are then used in turn by the large AI firms to generate responses and train models—that probably accounted for something like half of GDP growth in the first half of the year, which was absolutely bananas.
And I was like, I did the math four or five different ways, trying to prove myself wrong. And then I said, “OK, fine. This feels like something I should mention.” And so I said it, and I think it’s a startling figure for a whole bunch of reasons, one of which you alluded to, which is that even compared to historical spending, whether you pick the telecom bubble or railroads or whatever else, and we can dive into those, it’s unprecedented. It’s also unprecedented because of the nature of the spending, which I think is incredibly important because railroads are very different from GPUs, not just in the trivial sense, but in some very deep and important ways. And all of this gets missed, but the upshot is spending is huge, it’s driving the economy, people are very confused about this, and as a result, you end up making bad policy decisions because you think policy decision A is driving the economy when it’s this wacky stuff over here on the left.
Thompson: So we’re talking about an infrastructure boom that is on par with the broadband build-out of the 1990s and early 2000s, though it seems still behind the railroad boom of the 19th century. But we’re talking essentially about an amount of spending on one emerging technology that is without precedent in at least 60 to 100 years. How does AI capex break down? We’re talking about capital expenditures, so money that’s being spent on essentially machines rather than people. How much of this is chips versus energy versus building the actual data centers themselves? Is there a good way to think about where all this money is going?
Kedrosky: So a little more than half the cost of a data center is the chips that are going in. Say 60 percent. It varies depending on the model of the data center, because there are a whole bunch of different styles of data centers, if you will. Some are built almost on spec. Think of companies like CoreWeave, where it’s almost like they’re hoping to tenant a building. Think of it as commercial real estate: I’m building a shell, I’m hoping to get tenants to move in, and then the tenants pay rent, right? So think of it in those terms. And then there are the Metas and the Googles and the Amazons, where they’re using a huge amount of what they’re building, which again, roughly 50 to 60 percent of it is the GPU cost.
The rest is a combination of cooling and energy, and then a relatively small component is the actual construction: the frame of the building, the concrete pad, and purchasing the real estate. So you can break it out that way. It depends a little bit on what you’re planning to use it for. If I’m trying to build something that’s for training, well, I’m going to buy more expensive GPUs, right? I need the latest products from NVIDIA. If I’m building something that’s more for inference, meaning I’m largely using it for people who are trying to generate responses, then I don’t need the latest GPUs, and I can cut costs and cut some corners in there. So you can think about it as a continuum.
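The cost split Kedrosky describes can be sketched with some quick arithmetic. Note that only the roughly 60 percent chips share comes from the conversation; the remaining split between cooling/energy and construction here is an illustrative assumption, as is the total budget.

```python
# Illustrative sketch of a data center budget split, per the rough figures
# in the episode (~60% chips). The other two shares are assumptions.

def breakdown(total_usd_bn, chips=0.60, cooling_energy=0.30, construction=0.10):
    """Split a hypothetical data center budget (in $bn) into cost buckets."""
    # Shares must account for the whole budget.
    assert abs(chips + cooling_energy + construction - 1.0) < 1e-9
    return {
        "chips (GPUs)": total_usd_bn * chips,
        "cooling & energy": total_usd_bn * cooling_energy,
        "construction & land": total_usd_bn * construction,
    }

# A hypothetical $10 billion build: ~$6bn of it is GPUs alone.
for bucket, cost in breakdown(10).items():
    print(f"{bucket}: ${cost:.1f}bn")
```

Shifting the `chips` share up or down is one way to see why a training-focused build (newest NVIDIA parts) costs meaningfully more than an inference-focused one on the same footprint.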
This excerpt has been edited and condensed.
Host: Derek Thompson
Guest: Paul Kedrosky
Producer: Devon Baroldi