It’s easy to interpret CoreWeave’s lackluster IPO and muted first day of trading on Friday as bad news for the entire AI boom. But, as I’ll explain in a moment, that’s likely a mistake: many of CoreWeave’s problems are unique to CoreWeave.
First, you can see why some people might view CoreWeave as a proxy for AI as a whole. The New Jersey-based company’s entire business is running data centers full of the specialized Nvidia chips that customers need to run their AI workloads. CoreWeave is the first new “pure play” AI company to go public since OpenAI’s ChatGPT chatbot debuted in 2022, sparking the AI boom. Nvidia’s been public for decades, and arguably its valuation was bid to such lofty heights late last year that it had nowhere to go but down (Nvidia’s shares are down 20% year to date, but remain about 20% higher than they were at this time in 2024). Other AI pure plays, such as OpenAI and Anthropic, are still private, while the Big Tech titans that have seen a big boost from AI have much more diversified revenue streams. So, if investors can’t get enthusiastic about a company like CoreWeave, then maybe the whole AI sector is in trouble. For those bearish on the technology, it seems like an open-and-shut case.
And CoreWeave certainly didn’t have the IPO it had initially hoped for. The company first talked about raising about $4 billion in equity at a share price that would have valued the company at $35 billion. But due to lackluster investor interest, it wound up scaling that back to a $1.5 billion equity offering that valued the company at about $19 billion. On its first day of trading, the stock initially dropped from its IPO price of $40 per share, before recovering and bouncing around just above the IPO price. It was definitely not the blockbuster stock market debut that one might have expected for a company selling a key infrastructure component for tech’s newest new thing.
A business model built on debt
Yet CoreWeave is also a bad gauge for the AI boom more broadly because the company itself has so many problems. Chief among them is its business model, which is based largely on using debt to build data center capacity well ahead of actual demand. The company has borrowed $8 billion to lease data centers and equip them with Nvidia graphics processing units (GPUs), the expensive computer chips that many developers favor for training and running AI applications. That debt carries high interest payments—most analysts estimate that CoreWeave will need to make at least $1 billion in debt service payments in 2025, and that, depending on what happens with core interest rates and several aspects of CoreWeave’s business, the figure could be as high as $1.7 billion. That would wipe out between two-thirds and all of the money the company just raised in its IPO. And CoreWeave’s debt is likely to increase: the company still has about $4.4 billion in debt that it has secured but not yet drawn down.
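To make that arithmetic concrete, here is a back-of-envelope sketch in Python. The only inputs are the figures cited above (the roughly $1.5 billion in IPO proceeds and the analysts’ $1 billion to $1.7 billion range for 2025 debt service), so the output is only as precise as those estimates.

```python
# Back-of-envelope comparison of estimated 2025 debt service with the
# roughly $1.5 billion in equity CoreWeave raised in its IPO.
# All amounts are in billions of dollars; the debt-service range is the
# analyst estimate cited above, not a company disclosure.
ipo_proceeds = 1.5
debt_service_estimates = {"low": 1.0, "high": 1.7}

for label, service in debt_service_estimates.items():
    share = service / ipo_proceeds
    print(f"{label} estimate: ${service:.1f}B debt service "
          f"consumes {share:.0%} of the ~${ipo_proceeds:.1f}B raised")

# Output: roughly 67% at the low end and about 113% at the high end,
# i.e. two-thirds of the IPO proceeds up to slightly more than all of them.
```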
CoreWeave has also been a pioneer in debt instruments for which the collateral is the GPUs the borrower plans to buy, or in some cases lease. The problem here is that GPUs are a rapidly depreciating asset. Nvidia is rolling out new, more capable chips each year. Most of the GPUs in CoreWeave’s current fleet are Nvidia’s 2022 Hopper GPUs, which are being replaced by Nvidia’s newer Blackwell chips that offer significantly better performance. As a result, prices for renting time on Hoppers have fallen dramatically over the past year, making it harder for CoreWeave to earn revenue. More than that, though, CoreWeave’s debt instruments have covenants under which the payments CoreWeave owes increase as its depreciation expense rises. CoreWeave has said it plans to depreciate its chips over a six-year period—but some analysts think a more honest depreciation schedule would see the value of those chips written off within half that time.
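A minimal sketch of why the schedule matters, using simple straight-line accounting: the fleet value below is a purely hypothetical placeholder, not a CoreWeave figure, and only the six-year schedule and the roughly three-year alternative come from the paragraph above.

```python
# Straight-line depreciation: spread an asset's cost evenly over its
# assumed useful life. Halving the assumed life doubles the annual expense.
def annual_depreciation(asset_value_bn: float, useful_life_years: float) -> float:
    """Annual depreciation expense, in billions of dollars."""
    return asset_value_bn / useful_life_years

fleet_value_bn = 6.0  # hypothetical GPU fleet value, for illustration only

print(f"6-year schedule: ${annual_depreciation(fleet_value_bn, 6):.1f}B per year")
print(f"3-year schedule: ${annual_depreciation(fleet_value_bn, 3):.1f}B per year")

# Under the covenants described above, the faster write-off would also
# push up the payments CoreWeave owes its lenders.
```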
Long-term leases and a five-alarm cash burn rate
Wait, it gets worse. CoreWeave leases its data centers on long, fixed terms, and has $2.6 billion in operating lease payments it must make, with a weighted average lease term of nine years. At the end of 2024, $213 million of those payments were due within the next 12 months. CoreWeave makes enough money to cover them, but if its revenue doesn’t grow as anticipated, those fixed obligations could leave the company dangerously overextended, paying for data center capacity it can’t sell.
What’s more, the company is burning through cash at a prodigious rate. Last year, it brought in $2.75 billion in cash from its business, but sent $8.7 billion out the door leasing data centers and buying and leasing GPUs. The company has also committed to almost $10 billion worth of additional spending on new data center projects in the U.S., U.K., and Europe. The only way it can afford those projects is to take on more debt. But one of its major existing debt instruments bars it from taking on additional debt until that loan is repaid. To get around this, CoreWeave created a special purpose vehicle (a separate company that it controls) so that it can borrow more to meet its obligations under a deal with OpenAI, announced earlier this month, that would see OpenAI spend $11.9 billion with CoreWeave over five years.
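The cash-flow gap itself is simple arithmetic, using only the figures cited above; here is the same calculation spelled out in Python.

```python
# Cash shortfall implied by the figures above (in billions of dollars):
# operating cash coming in versus cash going out on data center leases
# and GPU purchases/leases last year.
cash_in = 2.75   # cash the business brought in
cash_out = 8.7   # spent leasing data centers and buying/leasing GPUs

shortfall = cash_out - cash_in
print(f"Cash shortfall last year: ~${shortfall:.2f}B")  # about $5.95B

# Add the ~$10B of committed future data center projects and it is clear
# why the gap has to be plugged with new debt or equity.
```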
75% of CoreWeave’s revenue comes from Microsoft and Nvidia
And I haven’t even mentioned yet the extent to which CoreWeave is dependent on just two customers, Microsoft and Nvidia, which collectively accounted for more than 75% of its 2024 revenue. Or that Microsoft, which alone was about 60% of CoreWeave’s revenue in 2024, has already started pulling back from spending with the company, canceling a number of contracts because of what Microsoft has said are service delivery issues. Or that Nvidia is not only a major customer of CoreWeave, but also its primary supplier of GPUs—oh, and an investor, owning 6% of its stock. That kind of ouroboros-like circulation of money tends to raise investors’ eyebrows, since it suggests Nvidia may be artificially propping up CoreWeave’s business.
All in all, you can see why investors might be wary of CoreWeave. But what does this have to do with the AI boom as a whole? Well, some analysts have looked at the revenue CoreWeave makes from customers aside from Microsoft and Nvidia, which amounted to about $440 million last year, and used it as an indictment of AI as a technology. Companies are finding today’s AI models too unreliable to adopt at scale, these critics say; if AI applications really were transforming business, CoreWeave’s revenues outside Microsoft and Nvidia ought to be a lot higher.
Businesses are adopting AI, but they don’t need to use CoreWeave
This argument misses some key points, however. CoreWeave rents raw GPU capacity. But it is not a normal cloud service provider. It doesn’t offer traditional central processing units (CPUs) that are the workhorses of almost all non-AI applications. And it doesn’t offer an extensive suite of cybersecurity, data security, and traditional application hosting services like all the big cloud providers do. Pretty much the only businesses that want raw GPU capacity without all of the other stuff are either cloud providers themselves, like Microsoft, or the companies building the most cutting-edge, general-purpose AI models, such as OpenAI, Anthropic, and Cohere. And even most of those AI companies already have deals with one of the major cloud providers to be their exclusive computing partner. Anthropic is partnered with AWS. Cohere is running on Oracle’s cloud.
A better tell of how AI adoption is going is to look at the growth in the cloud revenues of the hyperscalers. Sure, such figures quickly get messy. The cloud providers often lump different types of services together in big buckets, making it very difficult for analysts to figure out exactly what portion of revenue growth is coming from AI applications. The pricing of some AI software also makes it difficult to determine whether the revenue bump is coming from widespread adoption—i.e. selling more licenses—or simply from the cloud providers jacking up the prices of existing enterprise software licenses. But cloud revenues at Microsoft and Google Cloud both climbed 30% last year. AWS—which has not played as big a role in the AI boom so far—saw revenues increase about 19%. And executives at these companies have been attributing a good portion of that growth to enterprise adoption of AI. So far, there’s no reason to doubt them.
Meanwhile, even though the hyperscalers might need spare capacity from a company like CoreWeave, they are all building out their own data centers at a prodigious rate, and increasingly stocking those data centers with AI chips of their own design—partly so they don’t have to be so dependent on Nvidia’s GPUs. Microsoft has pulled back on spending with CoreWeave, but says it is still on track to spend $80 billion expanding data center capacity worldwide this year. Estimates are that the Big Tech companies collectively plan to spend at least $300 billion on new data centers in the U.S. this year. None of this suggests that AI as a technology is failing.
The DeepSeek question weighs on vendors, not the technology
Now, that is not to say that all of that data center capacity will definitely be needed. One of the question marks that no doubt weighed on CoreWeave’s IPO, and that has weighed on Nvidia’s stock too, is the DeepSeek factor. In January, the Chinese AI company shocked industry watchers when it showed that it is possible to build AI models that match some of the most cutting-edge “reasoning” capabilities of leading models from OpenAI and Anthropic, but in ways that don’t require nearly as much computing hardware. That might mean companies won’t need access to nearly as many GPUs. On the other hand, Nvidia CEO Jensen Huang has argued that these new “reasoning” models use far more computing capacity for inference, the term AI researchers use for running a model after it has been trained.
Another issue hanging over the entire AI industry is whether any of the companies building foundation models can ever be profitable. Creating cutting-edge models remains fiendishly expensive in terms of overall computing cost and the salaries of AI researchers, and yet model capabilities are being duplicated by competitors within weeks of debuting. This commoditization has driven prices per token, the basic unit of data an AI model handles, down dramatically, which in turn makes it difficult for the AI model companies to earn a profit. Increasingly, it looks like the companies with more defensible business models and pricing power are those building applications for specific industry verticals, such as law, accounting, or pharmaceutical manufacturing, on top of the foundation models.
Again, none of this means the technology itself won’t be transformative. History is full of examples of technologies that radically changed society but where the profits for the companies building out that technology were uncertain at best. The railroads are one example. Civil aviation is another. And the internet boom of the late 1990s is yet another. All those technologies were real and transformative. But in each case, a large number of the companies driving the technology forward were lousy investments, with many going belly-up. AI may be no different. But investors shouldn’t write off the impact of AI just because CoreWeave’s IPO flopped.