M.G. Siegler

The Mysterious "New Ideas" for AI Data Center Build Outs...

Perhaps not that mysterious, maybe more circular...

At just 182 words, the latest Sam Altman blog post sort of reads like “well, we’re doing a $100B deal with NVIDIA, I guess someone should say something.” And, to be fair, it’s more substantive than the 49-word press releases that OpenAI and Microsoft jointly issued around their intent to reach a new deal. Still, by far the most interesting bit seemed to be at the very end when Altman wrote the following:

Over the next couple of months, we’ll be talking about some of our plans and the partners we are working with to make this a reality. Later this year, we’ll talk about how we are financing it; given how increasing compute is the literal key to increasing revenue, we have some interesting new ideas.

Not since the Jeffrey Epstein birthday card has a message been so vaguely intriguing. Has ChatGPT brought another wonderful secret to the world of finance? A mysterious new model for paying for expenditures? It’s possible! But actually, based on various other statements and reporting on the topic, it seems like we may be able to triangulate what they’re talking about already. Plus, as with all these AI data center dealings, which are always convoluted and sometimes circular, I feel the need to write them down just to wrap my own head around them.

First and foremost, the newly appointed co-CEO of Oracle, Clay Magouyrk, seemed to echo Altman’s statements about this mysterious new form of corporate finance when talking about data center dealings, as Cade Metz reports for The New York Times:

Oracle will pay for and oversee the construction of three of the new data centers. OpenAI will then purchase computing power from Oracle. Oracle’s co-chief executive Clay Magouyrk said that the cloud computing giant would pay for the construction of these facilities partly by exploring new kinds of financial deals with various partners, technology providers and other investors.

“It is a combination of working with all the right partners and providers to bring all of their capital bear as well as interesting new corporate structures and interesting new ways of doing financing,” he said.

That's now two statements around the general topic with two mentions of "interesting" – how interesting!

Oracle, of course, is a major OpenAI partner in various ways — most recently to the tune of an, um, $300B contract between the two for OpenAI to purchase server capacity in Oracle data centers from 2027 onward. This is a part of the previously announced Stargate project, which seems to be morphing in real time. The latest news on that front says that while Oracle will be in charge of three new massive data centers set to come online in the coming years, SoftBank will be in charge of another two. OpenAI will be a partner in all of those, obviously. As will NVIDIA, since their chips will be powering these data centers, naturally.

But the OpenAI/NVIDIA deal is a bit different, it seems. Most notably, it sounds like OpenAI will be the company building, owning, and operating a new data center (or possibly data centers) with upwards of 10GW of capacity.

In order to do this, OpenAI will need money, obviously. And part of that money will come from NVIDIA with their (up to) $100B commitment as capacity comes online, starting with a $10B commitment to kick off the project. For that capital, NVIDIA will get equity in OpenAI, but the arrangement also seemingly unlocks a new business model for both companies. As reported by Anissa Gardizy and Sri Muppidi of The Information:

OpenAI and Nvidia are discussing an unusual way to structure their new artificial intelligence data center partnership, under which OpenAI would lease Nvidia’s AI chips rather than buying them, according to two people who spoke to executives at the companies about it.

This reporting has since been backed up by Tim Bradshaw, George Hammond, and Stephen Morris at The Financial Times. And while it’s not exactly a new concept for OpenAI, since they already lease/rent server capacity from many clouds, the difference here is the ability to lease the chips directly from NVIDIA for their own data centers, as mentioned above. And for NVIDIA, the difference is leasing these chips to a data center rather than selling them outright, as it does to the likes of Microsoft, Google, Amazon, and Oracle, as well as to its big “neocloud” partners like CoreWeave.1

Such an arrangement should allow OpenAI to save money — 10% to 15% by some estimates — which, with spending measured in the hundreds of billions of dollars, could work out to tens of billions saved. A lot of money! It also means OpenAI won’t be left holding the inventory bag as these chips depreciate. And because OpenAI will always want to be on the cutting edge with the latest NVIDIA chips, this model seemingly makes more sense for them. And the assumed promise of ongoing contracts for cutting-edge chips to OpenAI means a constant and consistent revenue stream for NVIDIA, who presumably can also repurpose the older chips better than OpenAI would be able to. (Oh yes, and getting that nice equity slug in OpenAI helps grease all these wheels too.)

NVIDIA, in turn, can use these contracts with OpenAI to take on debt to buy servers from other partners like Dell, building out the full systems for OpenAI to lease. One (major) problem with OpenAI trying to build their own data center is that any debt provider is going to be awfully wary of lending them money that may not get paid back — because OpenAI, of course, has no actual profits. And doesn’t project having any such profits until at least 2030 now.

And yet debt is how you do these deals. Even a company like Meta, which gushes profits, uses debt for these build-outs. And so NVIDIA is, in a sense, becoming OpenAI’s vehicle for building a massive data center without OpenAI needing to raise their own debt. NVIDIA is raising it, backed by… OpenAI’s contracts!

But wait, that contract also isn’t being paid for with profits because, again, OpenAI doesn’t have any. But they do have a lot of capital thanks in part to… NVIDIA! And that’s obviously part of why some are dubious of such deals. But it’s actually more tangled with this debt in the mix. Again: NVIDIA is giving OpenAI money, in part so that OpenAI can lease chips from NVIDIA, and NVIDIA will then use those lease contracts to raise debt to finance the build-out of a data center that OpenAI will own.

I mean...

At this point I have to assume there will be another wonderful secret of finance in all this. Because a way to lease GPUs directly from NVIDIA in order to finance data center build-outs is somewhat interesting in its circular nature, but not that exciting. Perhaps that excitement comes from how these players think it can scale? I’m guessing we’ll hear something about how this will create a virtuous cycle where AI demand funds AI build-out, which fuels AI demand. But if any part of that cycle were to slow — let alone break… Hopefully an NVIDIA always pays its debts.


1 Though the deal with CoreWeave is even more complicated in that NVIDIA has agreed to “lease back” capacity from them if it goes unsold — yes, NVIDIA leases back capacity on NVIDIA chips owned by CoreWeave — that’s how circular this all is!