OpenAI turns to AWS to expand beyond Microsoft partnership

April 17, 2026 – OpenAI is telling staff to prioritise expanding its partnership with Amazon Web Services, signalling a strategic shift away from its long-standing reliance on Microsoft. The move reflects growing demand from enterprise customers who prefer more flexible cloud environments such as Amazon’s Bedrock platform.

In a memo to staff, Chief Revenue Officer Denise Dresser described Microsoft’s role as “foundational” but also a constraint, writing that the partnership “has also limited our ability to meet enterprises where they are – for many that’s Bedrock.” The message underscores a recalibration rather than a break, as OpenAI looks to broaden how and where its models are deployed.

The shift follows months of gradual diversification. OpenAI has already begun using multiple hyperscalers for compute and is now actively positioning AWS as a key distribution channel for enterprise customers. According to Dresser, demand since the February partnership announcement has been “frankly staggering,” suggesting strong interest from organisations that are not deeply tied to Microsoft’s ecosystem.

That demand aligns with OpenAI’s current revenue mix. The company says it is generating roughly $2 billion per month, with enterprise customers accounting for about 40 per cent. Expanding beyond Microsoft’s platform gives OpenAI access to a wider pool of businesses, particularly those standardised on AWS infrastructure.

Competition is also shaping the strategy. Anthropic has been pushing aggressively into enterprise markets with its Claude models, prompting OpenAI to respond more directly on distribution and positioning. Dresser criticised Anthropic’s approach as relying on “fear, restriction, and the idea that a small group of elites should control AI,” highlighting intensifying rivalry at the top end of the market.

The shift points to a change in how AI providers scale. Rather than relying on a single cloud partner, leading model developers are increasingly aligning with multiple platforms to meet enterprise demand where it already exists. For customers, that means more choice in deployment environments; for providers, it introduces a new layer of competition tied as much to distribution as to model performance.



Mary Dada

Mary Dada is the associate editor for Tech Newsday, where she covers the latest innovations and happenings in the tech industry’s evolving landscape. Mary focuses on tech content writing from analyses of emerging digital trends to exploring the business side of innovation.

Jim Love

Jim is an author and podcast host with over 40 years in technology.
