OpenAI announced a fresh financing round totaling $110 billion, with Amazon committing $50 billion and Nvidia and SoftBank each putting in $30 billion. The deal values the company at roughly $730 billion pre-money, or about $840 billion post-money, and includes a strategic cloud partnership with Amazon and an agreement with Nvidia on next‑generation inference hardware.
The financing and commercial pacts come as OpenAI revises its capital‑expenditure plan for compute. Management told investors it now expects cumulative compute spending through 2030 of about $600 billion, a substantial reduction from the $1.4 trillion figure CEO Sam Altman had floated earlier. The company says the more modest plan will be tied more directly to its revenue growth expectations.
OpenAI provided updated operating metrics to investors: projected 2030 revenue of more than $280 billion and an expectation of positive cash flow by that year. For 2025, the company reported revenue of $13.1 billion, a cash outflow of $8 billion and a gross margin of 33%, and said it expects its consumer and enterprise businesses to contribute roughly equally to future sales.
User engagement remains strong. ChatGPT weekly active users are now reported above 900 million, up from about 800 million in October, while the programming assistant Codex has more than 1.5 million weekly users. Still, the company has been operating under heightened internal pressure since competitors such as Google and Anthropic stepped up their product rollouts, pushing OpenAI into an accelerated product push that began last December.
For Amazon and Nvidia the stakes are both commercial and strategic. Amazon secures a deeper tie between OpenAI’s software stack and AWS, while Nvidia’s investment and inference agreement anchor its lead in AI accelerators. SoftBank’s participation signals continued appetite among large institutional backers to gain exposure to generative AI winners.
The round reshapes the economics and politics of the AI landscape. Large direct investments by cloud and hardware suppliers blur lines between platform vendor and customer, raising questions about preferential access to future models and compute capacity. Regulators and enterprise customers will be watching whether these commercial alignments distort competition in a sector where access to scale and specialised inference hardware is a decisive advantage.
OpenAI’s revised capex path and the new cash infusion give it a runway to pursue product improvement and model scaling without the earlier headline‑grabbing spending projections. Yet the plan depends on aggressive revenue growth assumptions and smooth commercial execution with cloud partners, and it will intensify scrutiny over governance, data access and the balance of power between AI model developers and the infrastructure providers that now hold equity in them.
