On Monday, OpenAI announced it has signed a seven-year, $38 billion deal to purchase cloud services from Amazon Web Services to power products like ChatGPT and Sora. It’s the company’s first big computing deal after a fundamental restructuring last week that gave OpenAI more operational and financial freedom from Microsoft.
The agreement gives OpenAI access to hundreds of thousands of Nvidia graphics processors to train and run its AI models. “Scaling frontier AI requires massive, reliable compute,” OpenAI CEO Sam Altman said in a statement. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
OpenAI will reportedly begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. Amazon plans to roll out hundreds of thousands of chips, including Nvidia’s GB200 and GB300 AI accelerators, in data clusters built to power ChatGPT’s responses, generate AI videos, and train OpenAI’s next wave of models.
Wall Street apparently liked the deal: Amazon shares hit an all-time high on Monday morning, while shares of long-time OpenAI investor and partner Microsoft briefly dipped following the announcement.
Massive AI compute requirements
It’s no secret that running generative AI models for hundreds of millions of people currently requires a great deal of computing power. Amid chip shortages over the past few years, finding sources of that computing muscle has been tricky. OpenAI is reportedly working on its own GPU hardware to help alleviate the strain.
But for now, the company needs to find new sources of Nvidia chips, which accelerate AI computations. Altman has previously said that the company plans to spend $1.4 trillion to develop 30 gigawatts of computing resources, an amount of power sufficient for roughly 25 million US homes, according to Reuters.