Nvidia is going all in on OpenAI, pledging to invest up to $100 billion and ship a mountain of data-center silicon in a pact that binds the world’s most valuable chipmaker to the company behind ChatGPT. The two signed a letter of intent for a strategic partnership that would stand up at least 10 gigawatts of Nvidia systems for OpenAI’s next-gen AI infrastructure, with the first phase slated to go live in the second half of 2026 on Nvidia’s upcoming Vera Rubin platform.
Under the structure taking shape, money and hardware flow in both directions. OpenAI will pay cash for Nvidia’s servers and accelerators. Nvidia, in turn, will take a non-controlling equity stake in OpenAI, beginning with a $10 billion tranche once a definitive chip purchase agreement is inked. Deliveries are targeted to start as soon as late 2026, and each incremental gigawatt deployed would be matched by additional Nvidia investment, up to the $100 billion cap.
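For a rough sense of how that pacing could add up, here is a minimal sketch in Python. It assumes, purely for illustration, that the remaining investment scales linearly with deployed gigawatts after the initial tranche and counts toward the $100 billion cap; the letter of intent does not spell out the actual schedule.

```python
# Toy model of the investment pacing described above.
# The linear schedule and the treatment of the first tranche are assumptions
# for illustration only; the letter of intent does not specify exact terms.

INITIAL_TRANCHE_B = 10   # first $10B tranche, in billions
TOTAL_CAP_B = 100        # stated cap on Nvidia's investment, in billions
TOTAL_GIGAWATTS = 10     # planned deployment

def cumulative_investment_b(gigawatts_deployed: float) -> float:
    """Cumulative Nvidia investment (in $B) under a hypothetical linear schedule."""
    per_gw = (TOTAL_CAP_B - INITIAL_TRANCHE_B) / TOTAL_GIGAWATTS  # ~$9B per GW here
    return min(TOTAL_CAP_B, INITIAL_TRANCHE_B + per_gw * gigawatts_deployed)

for gw in (0, 1, 5, 10):
    print(f"{gw:>2} GW deployed -> ~${cumulative_investment_b(gw):.0f}B cumulative")
```

Under those assumptions, the cap is reached only once the full 10 gigawatts are online.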
For OpenAI, this is about locking in compute at unprecedented scale. Sam Altman summed it up in characteristically blunt fashion:
“Everything starts with compute.”
The company wants to train and run the next wave of models — and keep the app that popularized generative AI out in front — without being throttled by the global GPU shortage. Nvidia frames it as the next chapter in a decade-long partnership, from the first DGX boxes to ChatGPT’s breakout, now scaled to “deploying 10 gigawatts to power the next era of intelligence,” as CEO Jensen Huang put it.
The market liked what it heard. Nvidia shares popped more than 4% after the announcement, while Oracle — already tied to OpenAI, SoftBank and Microsoft through the $500 billion “Stargate” mega–data center effort — also gained. The new deal doesn’t replace OpenAI’s Microsoft relationship, nor does it shut the door on custom silicon projects the startup has explored with Broadcom and TSMC. Think of it as belt-and-suspenders capacity planning in a world where frontier AI workloads collectively soak up millions of GPUs.
Technically and commercially, the plan is ambitious. Ten gigawatts translates into millions of accelerators plus matching networking, storage, and power — sited, permitted, and cooled — at a cadence that’s more like a utility buildout than a typical cloud upgrade. Nvidia says the two companies will “co-optimize” their roadmaps so hardware, system software, and model-training pipelines evolve in lockstep. Oracle will remain a key compute and networking partner as OpenAI grows its “AI factory” footprint.
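To see why 10 gigawatts implies accelerators in the millions, here is a back-of-envelope sketch. The all-in power figures per accelerator (including cooling, networking, and other facility overhead) are assumptions for illustration, not numbers disclosed by either company.

```python
# Back-of-envelope sketch: mapping a facility power budget to accelerator counts.
# The per-accelerator wattages below are illustrative assumptions, not figures
# from the Nvidia/OpenAI announcement.

def accelerators_for_capacity(total_gw: float, watts_per_gpu_all_in: float) -> float:
    """Estimate accelerator count for a given facility power budget.

    watts_per_gpu_all_in should include cooling, networking, and other
    overhead attributed to each GPU, not just the chip's own draw.
    """
    total_watts = total_gw * 1e9
    return total_watts / watts_per_gpu_all_in

# Assumed all-in power per accelerator (hypothetical range): 1.5 to 2.5 kW.
for watts in (1_500, 2_000, 2_500):
    count = accelerators_for_capacity(10, watts)
    print(f"At {watts} W per accelerator: ~{count / 1e6:.1f} million accelerators")
```

Even at the high end of those assumed power draws, the math lands in the millions of accelerators, which is why the buildout looks more like utility infrastructure than a routine data-center refresh.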
The tie-up lands amid a broader realignment. Nvidia just agreed to collaborate with Intel on AI chips and committed $5 billion to the struggling rival. Microsoft has poured billions into OpenAI over the years and recently reached a non-binding deal to restructure OpenAI into a for-profit. And across Big Tech, from Google to Amazon to Microsoft, in-house accelerators are meant to curb reliance on Nvidia even as demand for its GPUs keeps outrunning supply. In that context, this agreement both secures OpenAI’s runway and reinforces Nvidia’s position at the heart of frontier AI.
Regulators will take notice. The Justice Department and FTC carved up jurisdiction in 2024 to examine the dominance of Microsoft, OpenAI, and Nvidia in AI markets. The Trump administration has generally taken a lighter antitrust touch than its predecessor, but a fresh $100 billion equity-and-supply pact between two sector kingpins will invite questions about competition and ecosystem lock-in. Nvidia already backed OpenAI’s $6.6 billion round in 2024; this is an order of magnitude bigger and explicitly tied to hardware scale.
If the timelines hold, the first Vera Rubin systems for OpenAI will light up in late 2026, with capacity ramping in stages after that. Between now and then, both sides have to finalize the definitive agreements, line up grid power and sites, and ride out at least one more product cycle in a market moving at breakneck speed. The message, though, is unmistakable: compute is the currency of AI, and Nvidia and OpenAI just wrote themselves a very large check.
Reuters, CNN, Axios, and OpenAI contributed to this report.