Amazon has struck a landmark strategic partnership with OpenAI that could see the e-commerce and cloud giant invest up to $50 billion and cement deeper technical ties between the two firms. Under the deal, OpenAI will expand its use of Amazon Web Services and commit to deploying 2 gigawatts' worth of Amazon's Trainium AI chips for its new enterprise "Frontier" platform, a substantial hardware commitment to AWS.
The investment will be staged: an initial $15 billion from Amazon, followed by a conditional $35 billion tranche that depends on undisclosed milestones and on OpenAI completing an IPO or direct listing in the United States. Nvidia and SoftBank are separately expected to invest $30 billion each, valuing OpenAI at roughly $730 billion pre-money, according to regulatory filings accompanying the arrangement.
The pact marks a strategic shift for Amazon. Until now, AWS has been closely allied with Anthropic, a leading OpenAI competitor that has received billions in Amazon funding and whose workloads anchor a major $11 billion Project Rainier data-centre campus in Indiana. Amazon's consumer-facing AI features, such as the Rufus shopping assistant and upgrades to Alexa, have relied on Anthropic's Claude models, a relationship Amazon's chief executive, Andy Jassy, says will continue unchanged even as the company forges a long-term relationship with OpenAI.
For AWS this is a clear commercial win. The cloud provider is in fierce competition with Microsoft, Google and Oracle for high-margin AI cloud business, and an endorsement from OpenAI, whose workloads have long been tied to Microsoft's Azure, helps close an important competitive gap. It also strengthens Amazon's argument that its heavy 2026 capital spending, forecast at around $200 billion and focused largely on AI infrastructure, chips and networking, is a necessary investment to capture the next wave of cloud demand.
More strategically for Amazon, the deal accelerates adoption of its Trainium custom chips. Analysts contend that commitments from leading AI labs to run on Trainium make Amazon a more important player in bespoke AI silicon and place it in direct competitive tension with established custom-chip houses such as Broadcom and Google — and, potentially, with Nvidia’s entrenched GPU ecosystem.
There are caveats. The second $35 billion tranche is conditional on milestones that remain undisclosed; media reports have speculated, without official confirmation, that one of those benchmarks could be progress toward artificial general intelligence (AGI). Tying investment to such a goal would raise novel questions about incentives, safety oversight and regulatory scrutiny. The agreement also includes a termination clause: if the second tranche is not completed by December 31, 2028, the deal can lapse.
The combined capital injections from Amazon, Nvidia and SoftBank reshape the incentive map for OpenAI, giving it more diversified backers and multiple hardware partners. For AWS, the immediate payoff is both commercial (more OpenAI workloads on its platform) and strategic (larger Trainium deployments that validate its in-house silicon bets). For rivals and the broader market, the arrangement intensifies the race to offer cloud stacks optimised for generative AI while underscoring how major cloud providers are using chips and exclusive infrastructure relationships to try to lock in customers.
