The deal marks the first major announcement since OpenAI’s restructuring last week, which freed the ChatGPT maker from its non-profit framework.
It also forms part of a broader strategy by OpenAI and its largest investor, Microsoft, to reduce their mutual dependence, according to sources.
Under the agreement, OpenAI will gain immediate and expanding access to AWS compute resources, including Amazon EC2 UltraServers equipped with thousands of Nvidia chips, and the capability to scale to millions of CPUs, OpenAI said in a blog post.
The seven-year partnership is intended to help OpenAI rapidly expand its compute capacity while leveraging AWS's price, performance, scale, and security, it added.
AWS said its infrastructure for OpenAI will use state-of-the-art Nvidia GB200 and GB300 GPUs, clustered through EC2 UltraServers to enable high-efficiency AI processing across interconnected systems.
The deployment is expected to support both ChatGPT inference and the training of next-generation models, the company added.
According to AWS, all compute capacity under this partnership is expected to be deployed by the end of 2026, with the potential to expand further into 2027 and beyond.