NVIDIA to Invest US $100 Billion in OpenAI Infrastructure

OpenAI and NVIDIA have announced a landmark strategic partnership to build massive new compute infrastructure that will power the next generation of artificial intelligence.

Under a new letter of intent, the two companies intend to deploy at least 10 gigawatts of NVIDIA systems for OpenAI’s future AI models.

As part of the agreement, NVIDIA plans to invest up to US$100 billion in OpenAI. The funding will be rolled out progressively as each gigawatt of AI data center capacity comes online. The first gigawatt deployment is expected in the second half of 2026, using the upcoming NVIDIA Vera Rubin platform.

NVIDIA founder and CEO Jensen Huang said the effort marks a leap forward in AI infrastructure deployment, recalling the companies’ shared history from their supercomputing beginnings to breakthroughs like DGX and ChatGPT. OpenAI co-founder and CEO Sam Altman emphasized that compute infrastructure will be the foundation of future economic growth and innovation.

This partnership makes NVIDIA the preferred strategic partner for OpenAI’s compute and networking needs as OpenAI ramps up its “AI factory” growth plans. The roadmap will involve close coordination of both companies’ hardware and software development, ensuring that OpenAI’s models and infrastructure software and NVIDIA’s systems evolve in tandem.

Experts believe this agreement is one of the largest infrastructure investments in the AI field to date. It reflects accelerating competition among AI and chip companies to provide the scale of computing power required for advanced models and potential artificial general intelligence.

OpenAI and NVIDIA have not yet disclosed where these data centers will be located, or how the full 10 gigawatts will be distributed globally. The deal is expected to be finalized in the coming weeks once all terms are settled.


1 Comment

Lèon · 7 months ago

It’s bonkers that AI infrastructure capacity is measured by electricity consumption instead of computational power.
