On April 28, 2026, Amazon Web Services and OpenAI formally ended one of the most consequential exclusivity arrangements in the history of cloud computing. After nearly seven years in which Microsoft Azure was the only hyperscaler legally permitted to host OpenAI’s proprietary large language models, GPT-5.5 and GPT-5.4 are now available on Amazon Bedrock in limited preview. The launch came one day after OpenAI and Microsoft restructured their partnership agreement, replacing perpetual exclusivity with a nonexclusive license running through 2032 and formally allowing OpenAI to expand to competing cloud platforms for the first time.
Key Highlights
- OpenAI’s models, including GPT-5.5 and GPT-5.4, are now available on Amazon Bedrock in limited preview
- Microsoft’s exclusive right to host OpenAI models has been replaced with a nonexclusive license through 2032
- Microsoft will stop paying a revenue share to OpenAI, while OpenAI continues paying Microsoft a now-capped revenue share through 2030
- Amazon simultaneously launched Codex on AWS Bedrock and Amazon Bedrock Managed Agents powered by OpenAI
- The multi-cloud expansion clears a key concern for investors ahead of OpenAI’s expected IPO later in 2026
- OpenAI has surpassed $25 billion in annualized revenue as of April 2026
- Morgan Stanley said the move compresses Azure’s AI growth premium but eases OpenAI’s path to a public listing
What Changed in the OpenAI and Microsoft Deal
The original Microsoft and OpenAI partnership gave Azure exclusive rights to host OpenAI’s models in exchange for billions of dollars in cloud compute investment. That arrangement made Microsoft the default infrastructure layer for every enterprise that wanted to deploy OpenAI’s frontier models at scale.
The restructured agreement keeps Microsoft as a major partner but removes the cloud exclusivity entirely. Microsoft retains a 27 percent equity stake in OpenAI’s restructured public benefit corporation and continues to receive a revenue share through 2030, though that share is now capped. In exchange, OpenAI gains the freedom to bring its models to any cloud provider. AWS is the first beneficiary, and the structure permits additional cloud partnerships.
TechCrunch reported that the new deal also resolves the legal uncertainty Microsoft faced over OpenAI’s $50 billion Amazon infrastructure agreement. The restructuring permits multi-cloud distribution as a contractual right rather than a disputed arrangement, ending the conflict cleanly for both parties.
What AWS Is Getting
Amazon is not simply getting access to GPT-5.5 and GPT-5.4 as hosted models. The AWS launch includes three distinct products: the frontier models on Bedrock, Codex on AWS Bedrock (OpenAI’s coding agent), and Amazon Bedrock Managed Agents powered by OpenAI. All three launched simultaneously in limited preview.
The Managed Agents product is the most significant for enterprise buyers. It allows organizations to deploy AI agents that can execute multi-step tasks using OpenAI’s reasoning models within AWS’s existing security and compliance infrastructure. For large enterprises already operating on AWS, the ability to run frontier AI agents inside their existing cloud environment removes a major procurement barrier. Previously, those enterprises had to choose between their existing AWS infrastructure and access to OpenAI’s best models. That choice no longer exists.
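Part of what makes the Bedrock availability matter operationally is that Bedrock exposes every provider’s models through a single runtime API, so an application built against it can target OpenAI models the same way it targets any other provider. Below is a minimal sketch using boto3’s `bedrock-runtime` Converse API; note that the model identifier is a hypothetical placeholder, since the article does not publish the actual preview IDs:

```python
"""Sketch: calling a frontier model on Amazon Bedrock via the Converse API.

The model ID below is a hypothetical placeholder -- substitute the actual
identifier AWS publishes for the OpenAI limited preview.
"""

HYPOTHETICAL_MODEL_ID = "openai.gpt-5-5-preview-v1:0"  # placeholder, not a confirmed ID


def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Build the keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }


def invoke(model_id: str, prompt: str) -> str:
    """Send one prompt to Bedrock. Requires AWS credentials and preview access."""
    import boto3  # AWS SDK for Python

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.converse(**build_converse_request(model_id, prompt))
    return response["output"]["message"]["content"][0]["text"]
```

Because the request shape is identical across Bedrock providers, swapping the model ID is the only change needed to move a workload between model families on the same platform.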
What It Means for Microsoft Azure
Azure built a substantial portion of its AI growth narrative on the premise that access to OpenAI’s models required Azure infrastructure. That premise is now gone. Morgan Stanley analysts estimated the shift “modestly compresses Azure’s AI growth premium,” though they noted the compression is “more than offset by reduced concentration risk and a clearer path to OpenAI IPO outcomes” for Microsoft as an equity holder.
The more direct impact is on enterprise sales cycles. Azure sales teams have lost the ability to use OpenAI exclusivity as a differentiator. Enterprises shopping for AI infrastructure now face a genuine choice between Azure and AWS for OpenAI-powered workloads, and that competition will compress margins and accelerate feature development at both platforms.
As enterprise AI adoption accelerates, the number of organizations deploying frontier models at production scale is growing rapidly. A larger market with more competition is ultimately better for buyers than a market where access to the best models was gated through a single provider.
The IPO Angle
OpenAI has surpassed $25 billion in annualized revenue as of April 2026 and is taking early steps toward a public listing, potentially as soon as late 2026. The exclusivity arrangement with Microsoft was a potential concern for IPO investors because it created structural dependency on a single cloud partner and concentrated revenue risk.
The multi-cloud expansion addresses that concern directly. An OpenAI with models on AWS, Azure, and potentially additional platforms is structurally less dependent on any single partner’s cloud growth trajectory. Morgan Stanley’s note connected the dots explicitly: clearing OpenAI’s path to IPO is in Microsoft’s interest as a 27 percent shareholder, even if it dilutes Azure’s short-term AI competitive position.
The timing matters. Listing in 2026 requires demonstrating that OpenAI’s revenue model is durable and diversified, not tied to one cloud provider’s enterprise relationships. The scale of enterprise AI investment flowing into infrastructure suggests the market is large enough to support multiple distribution channels without cannibalizing any single partner’s business.
What Enterprise AI Buyers Should Know
The practical implication for organizations evaluating AI infrastructure is straightforward. GPT-5.5 and GPT-5.4, the most capable models available from OpenAI as of April 2026, are now accessible through both major US cloud providers.
For enterprises already embedded in AWS infrastructure, the Bedrock availability removes the last practical argument for migrating to Azure solely to access OpenAI models. For enterprises evaluating both clouds, AI model access is now neutral and other factors, including compute pricing, data residency, and existing vendor relationships, will drive the decision.
The broader enterprise AI deployment landscape is moving toward multi-model strategies, where organizations mix providers based on task requirements rather than committing to a single AI vendor. OpenAI’s multi-cloud move accelerates that trend by removing the infrastructure lock-in that previously forced a single-vendor commitment for enterprises that wanted GPT on production workloads.
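In practice, a multi-model strategy can start as something as simple as a routing table that maps task categories to model identifiers on the same platform. The sketch below illustrates the pattern; the task categories and model IDs are illustrative assumptions, not published Bedrock identifiers:

```python
"""Sketch: routing tasks across frontier model providers on one platform.

All model IDs below are illustrative placeholders, not confirmed Bedrock IDs.
"""
from dataclasses import dataclass

# Hypothetical routing table: task category -> (provider, placeholder model ID).
ROUTES = {
    "coding": ("openai", "openai.gpt-5-5-preview-v1:0"),
    "long_context_analysis": ("anthropic", "anthropic.claude-example-v1:0"),
    "default": ("openai", "openai.gpt-5-4-preview-v1:0"),
}


@dataclass(frozen=True)
class Route:
    provider: str
    model_id: str


def route_task(category: str) -> Route:
    """Pick a provider and model for a task, falling back to the default route."""
    provider, model_id = ROUTES.get(category, ROUTES["default"])
    return Route(provider=provider, model_id=model_id)
```

The point of the sketch is that once two frontier providers sit behind one runtime API, switching vendors per task becomes a configuration change rather than an infrastructure migration.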
The Competition Context
OpenAI’s arrival on AWS Bedrock also intensifies competition with Anthropic, whose Claude models have been available on Bedrock since 2023 and represent Amazon’s primary AI partnership. AWS now offers both Anthropic and OpenAI frontier models on the same platform, giving enterprise buyers direct access to the two leading AI providers inside a single cloud environment.
That dynamic benefits AWS more than either AI company. Amazon becomes the neutral ground where enterprises can evaluate and deploy across multiple frontier model providers without changing cloud infrastructure. As model performance converges at the top of the benchmark rankings, the infrastructure and tooling layer around the models becomes the primary competitive surface, which is exactly where AWS has its deepest enterprise relationships.
Across the enterprise software industry, the race to embed AI into every workflow is now running on open cloud infrastructure rather than closed partnerships, and that shift benefits buyers more than any single vendor in the short term.
The TCB View
OpenAI’s multi-cloud move is straightforwardly good for the enterprise AI market and straightforwardly challenging for Microsoft’s AI competitive narrative. The interesting tension is that Microsoft benefits in both directions simultaneously: as Azure loses its OpenAI exclusivity advantage, Microsoft’s equity in OpenAI increases in value because the IPO path gets clearer. That dual exposure means Microsoft’s financial interest was always aligned with eventually allowing this transition, even if the Azure business unit would have preferred perpetual exclusivity. The restructuring reflects a maturation of the Microsoft and OpenAI relationship from a high-dependency startup partnership to a more conventional equity and licensing arrangement between large commercial entities. The most consequential long-term effect is not which cloud wins more OpenAI workloads. It is that the enterprise AI infrastructure layer is becoming competitive, which accelerates deployment timelines and reduces buyer leverage risk. That is good for the industry, and the broader macro disruption happening simultaneously makes the stability of AI infrastructure investment a notable counterpoint to the geopolitical turbulence in energy markets this week.
The Daily Brief by TCB
Crypto, AI & finance intelligence in 5 minutes. Every weekday morning. Free.

