M.G. Siegler

OpenAI Sells Its Stake to Win AI

They need NVIDIA to beat Google, and Microsoft may live to regret it...

Keep your cloud partners close, and your GPU provider closer...

Following the breaking news yesterday about NVIDIA making a $100B "investment" into OpenAI, it hasn't taken long for many more details to trickle out. I wrote about some of them in the updates area of my post yesterday. But a report today by MacKenzie Sigalos for CNBC has enough details to dive into separately.

First and foremost, while the two sides obviously have a long-standing relationship – including a smaller investment by NVIDIA into OpenAI a year ago – this expansion of their ties seemed to happen quickly. Per the report, it largely stemmed from a series of one-on-one meetings between Sam Altman and Jensen Huang – sorry bankers, no fees for you! – some of which took place while they were both on the latest tech trophy tour with President Trump. That clearly also allowed them to give him a heads-up on the deal, which will absolutely help ease the risk of the regulatory hammer falling here.

And that matters because, well, the most valuable company in the world is buying a large stake in the most valuable startup in the world. And, of course, these are arguably the two most important players in AI. Someone check to see if Lina Khan and Tim Wu are okay today...

Just how big of a stake will NVIDIA get here?

The financing structure for the OpenAI deal is designed to avoid hefty dilution. The initial $10 billion tranche is locked in at a $500 billion valuation and expected to close within a month or so once the transaction has been finalized, people familiar with the matter said. Nine successive $10 billion rounds are planned, each to be priced at the company’s then-current valuation as new capacity comes online, they said.

This backs up earlier reports by Sri Muppidi for The Information and by Kif Leswing & Ashley Capoot of CNBC, which had the first investment coming in at the $500B valuation (seemingly validating that number with a primary capital round, versus simply the secondary sale that is happening right now) and future amounts being priced at the company's valuation at the time each new investment comes in, respectively. That first $10B at $500B is a nice, clean 2%.

And this continues the trend of Big Tech buying into Big AI. While 2% is not the ~30% that Microsoft is likely to own in OpenAI (upon conversion), or even the 12% that SoftBank is likely to own (after they complete their promised tranches), it's clearly going up from there. Just how high depends on OpenAI's valuation each time a capacity milestone is hit. And because there are nine (!) more of those tranches, it will obviously take years – which was my guess and point yesterday – for all the money to come in, if it ever does.

Just for fun, I asked ChatGPT to estimate how long it would take for the full investment to be met. It gave me this:

A reasonable guess: 5–7 years to fully deploy the $100B.

Assuming the first 10‑GW site goes live in the back half of next year and each successive $10B tranche aligns with new capacity coming online, a phased rollout of multiple mega‑sites, grid connections, permitting, and manufacturing cadence likely spreads across mid‑2030 to early‑2032. Front‑loaded years might see 2–3 tranches, with later years slowing as projects scale and diversify locations. Delays from energy, supply chain, or financing could push it closer to the upper bound.

Sounds reasonable. Though obviously a lot could change between now and then. Think about how much has changed in just the past year since NVIDIA's first investment! For even more fun, I asked what OpenAI's valuation might be at each stage. There, it gave me a roughly $100B - $150B step-up in price every year – all the way up to $1.5T in the final year (2032). Honestly, that's probably too conservative, at least in the next couple of years given what we've seen the past few years. Then again, if the economy does shift – which it obviously will at some point in that span – it could end up looking too rosy. Especially since by that point, OpenAI will undoubtedly be a public company – unless something goes horribly awry... Again, it just depends on too many external factors to know for sure. So let's just go with the assumption.

I then asked what NVIDIA's ownership in OpenAI may look like at that point. And guess what came back? Around 12%. Yes, the same as SoftBank's targeted ownership after their current round is complete. That's either an interesting coincidence or the math OpenAI/NVIDIA were using as well when structuring this deal...
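For what it's worth, that arithmetic is easy enough to sanity-check. Here's a quick back-of-the-envelope sketch (mine, not from the CNBC report): ten $10B tranches, the first at the $500B valuation and the rest priced along an assumed valuation path stepping up roughly $100B a year toward that $1.5T endpoint, with all other dilution ignored entirely.

```python
# Back-of-the-envelope only: cumulative NVIDIA stake from ten $10B tranches,
# each priced at an *assumed* then-current valuation. The valuation path is a
# guess in line with the ~$100B-$150B annual step-up discussed above; dilution
# from other raises (or an eventual IPO) is ignored.
tranche = 10  # $B per tranche
valuations = [500, 600, 700, 800, 900, 1000, 1100, 1200, 1350, 1500]  # $B, assumed

stake = sum(tranche / v for v in valuations)
print(f"Cumulative stake: {stake:.1%}")  # ~11.6% with this path
```

Nudge the assumed valuations down a bit, or front-load more tranches into the cheaper early years (as the ChatGPT timeline above suggests), and you land right around that ~12%.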

It won't be exactly this because of dilution, if nothing else, but that would leave an OpenAI in the early 2030s that is nearly one-third owned by Microsoft, with SoftBank and NVIDIA taking up another roughly 25% combined. That would be over half owned by Big Tech – if you want to consider SoftBank "Big Tech". But even without them, just Microsoft and NVIDIA would own almost half of OpenAI between them in this scenario. Again, not including dilution – most notably during any IPO process. And whatever those companies do with such holdings if/when OpenAI is actually public.

And assuming OpenAI is public by then, NVIDIA would have quite the portfolio of public companies – on top of the (theoretical) OpenAI stake, they own around 7% of CoreWeave, and now almost 5% of Intel. And some of their other private holdings – xAI, Mistral, Perplexity, Cohere, Scale AI, etc – could be public companies by then too...

Anyway, back out of that rabbit hole... this also feels important for this investment:

In Monday’s announcement, OpenAI described Nvidia as a “preferred” partner. But executives told CNBC that it’s not an exclusive relationship, and the company is continuing to work with large cloud companies and other chipmakers to avoid being locked in to a single vendor.

Such statements also feel like they're meant to ease regulatory concerns as much as anything. Especially since a tie-up of this nature would naturally lead OpenAI to be at least a little less focused on any work on their own AI chips. Yes, they'll still want to hedge, but the need to hedge is less acute now.

The $100 billion commitment from Nvidia represents only part of what’s required for the planned 10-gigawatt buildout. OpenAI will lease Nvidia’s chips for deployment, but financing the broader effort will require other avenues. Executives have called equity the most expensive way to fund data centers, and they say the startup is preparing to take on debt to cover the remainder of the expansion.

Yes, even $100B isn't going to be enough for OpenAI's needs here, as wild as that sounds. Again, only 10% is coming at first anyway, but still, they're going to need more than $100B to do everything they want to do. And so they're going to be playing with debt (and SoftBank's debt), which makes sense right now in this environment, but could certainly come back to bite them, as it so often does when things turn... Again, perhaps another reason to want to be public at that point...

As OpenAI’s compute necessities increase, a big question is where the company will host its workloads, which have to date been largely housed in Microsoft Azure. Taking the work in-house would push OpenAI closer to operating as a first-party cloud provider, a market led by Amazon Web Services, followed by Azure, Google and Oracle.

Executives have openly floated the idea, suggesting it may not be far off. Some even indicated to CNBC that a commercial cloud offering could emerge within a year or two, once OpenAI has secured enough compute to cover its own needs. For now, demand for training frontier models leaves little capacity to spare, but OpenAI isn’t done looking for new opportunities.

This was the entire point of my piece a couple weeks ago entitled: OpenAI Needs to Build Google Cloud Before Google Can Build ChatGPT...

This is happening, and as much as OpenAI might have liked to control the entire stack, as it were, of this build-out, it would have been impossible with, say, all of their own chips. Even Google doesn't use all of their own chips, obviously. Still, Google has a massive and perhaps underappreciated long-term advantage right now in AI thanks to both their cloud and their own in-house TPUs.1 And so OpenAI more closely bringing in the partner they're perhaps the most reliant on right now – with Microsoft on the outs! – makes sense in that regard. They can't reasonably combat NVIDIA right now in chips. But they need to combat Google in AI, and specifically in build-out and cost...2

I'll jump to the ending of that piece about OpenAI building out their own cloud...

If they can execute on their crazy spend plan and get not just one, but many Stargates up and running at scale, it will all be worth it. But the issue right now is that Google is already building Gemini with their own cloud and their own chips at scale. That would seem to be a problem – certainly if any sort of market turn happens and it becomes a challenge to raise, say, $100B.

$100B you say? Interesting number...

One more thing: speaking of Microsoft, back to Sigalos...

While OpenAI gets more intimate with Nvidia, it has to maneuver through a number of high-stakes relationships with other key partners.

OpenAI only informed Microsoft, its principal shareholder and primary cloud provider, a day before the deal was signed, the people familiar with the matter said. Earlier this year, Microsoft lost its status as OpenAI’s exclusive provider of computing capacity.

On one hand, at least Microsoft got some heads-up this time, I guess. On the other, do you get the sense that Microsoft is going to regret letting this relationship go so far south? Yes, they still have their ownership stake (or will, presumably) and yes, they still get some access to the OpenAI technology, but they're now watching SoftBank and Oracle and NVIDIA and many others (even Apple!) give OpenAI these big bear hugs. So they find themselves in this weird position where they'd be, perhaps not happy, but a bit relieved if it all goes south for OpenAI, in that they'd have made the right "bet" by stepping back and letting others shoulder OpenAI's needs, including now the cloud. Because if the opposite happens, OpenAI could be bigger than Microsoft...

You laugh now, but this guy isn't...

👇
Previously, on Spyglass...
NVIDIA (Intends to) Invest (Up to) $100B in OpenAI (Over Time)
Look, it’s a massive deal. But right now, it’s all about the optics…
OpenAI & Microsoft Agree to Agree, Tentatively
An important step – but one of many – as they clearly try to race towards PBC conversion…
OpenAI Needs to Build Google Cloud Before Google Can Build ChatGPT...
Or: how to burn $100B+ in real time…

1 TPUs which, mind you, Google may be exploring selling to other clouds so as to better compete with you-know-who...

2 And yes, to a lesser extent, Amazon – especially since they're so tightly partnered with Anthropic, including increasingly on their own in-house chips. And yes, xAI to some extent too, as Elon Musk can also seemingly raise endless capital and may soon be bringing Tesla into that fold, after already doing so with SpaceX...