Product

Microsoft Azure

Cloud carrying OpenAI as captive distribution — H100/H200/Maia capacity, multi-hundred-billion OpenAI commitments, Azure AI revenue the fastest-growing line item at Microsoft.

1. Core Product / Service

Microsoft Azure is Microsoft's cloud business; the AI-relevant slice is the ND-series GPU VM family + the Azure OpenAI Service / Azure AI Foundry managed-model layer.

Compute SKUs that matter for AI:

  • NDv5 H100 (Standard_ND96isr_H100_v5) — 8× NVIDIA H100 80GB SXM, 3.2 Tbps NDR InfiniBand, deployed in non-blocking clusters of thousands [2].
  • ND H200 v5 — 8× NVIDIA H200 141GB; rolled out 2025.
  • ND GB200 v6 / NDv6 Blackwell — 8× B200 (or GB200 NVL72 rack-scale); GA through 2025/2026.
  • Maia 100 — Microsoft's first custom AI accelerator (~5nm, ~105B transistors), built for OpenAI inference workloads on Azure; Cobalt 100 ARM CPU pairs with it [7].
  • Azure OpenAI Service — managed endpoints for GPT-4o, GPT-5, o-series, etc. Fronted by Microsoft's billing/SLA, runs on Azure's GPU + Maia stack.
  • Azure AI Foundry — superset platform (model catalog, agent runtime, fine-tuning).
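The headline interconnect and memory figures in the SKU list can be sanity-checked with simple arithmetic. A minimal sketch, assuming (per Microsoft's ND H100 v5 documentation) that the 3.2 Tbps figure is the per-VM aggregate of one 400 Gb/s NDR InfiniBand link per GPU:

```python
# Back-of-envelope check on ND-series per-instance specs.
# Assumption: 3.2 Tbps = 8 GPUs x 400 Gb/s NDR InfiniBand per GPU.
GPUS_PER_VM = 8
NDR_LINK_GBPS = 400        # NDR InfiniBand link speed, per GPU
HBM_PER_H100_GB = 80       # H100 80GB SXM
HBM_PER_H200_GB = 141      # H200 141GB

aggregate_tbps = GPUS_PER_VM * NDR_LINK_GBPS / 1000
vm_hbm_h100_gb = GPUS_PER_VM * HBM_PER_H100_GB
vm_hbm_h200_gb = GPUS_PER_VM * HBM_PER_H200_GB

print(aggregate_tbps)    # 3.2 Tbps per VM, matching the NDv5 spec
print(vm_hbm_h100_gb)    # 640 GB of HBM per H100 VM
print(vm_hbm_h200_gb)    # 1128 GB of HBM per H200 VM
```

The ~76% jump in per-VM HBM (640 GB to 1128 GB) is the main reason H200 instances matter for inference: larger KV caches and bigger models fit without extra inter-node traffic.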

Capacity strategy: NVIDIA GPUs primary, Maia ramping for OpenAI inference, CoreWeave used as overflow (Microsoft accounted for ~67% of CoreWeave's FY2025 revenue).

2. Target Users & Pain Points

  • OpenAI — the single largest tenant; multi-hundred-billion-dollar Azure commitment over the partnership term [4][6].
  • Enterprise — Fortune 500 standardizing on Azure OpenAI Service for compliance-friendly LLM access (HIPAA, FedRAMP, EU data residency).
  • Microsoft 365 Copilot infrastructure — internal captive workload sitting on Azure GPU + Maia.
  • GitHub Copilot — also captive, served from Azure.

Pain points addressed: tightest OpenAI integration available (latency, model exclusivity windows), enterprise compliance, packaged "Copilot for X" SKUs sitting on top.

3. Competitive Landscape

Provider differentiation vs Azure:

  • AWS — largest cloud overall; Anthropic anchor; Trainium silicon.
  • Google Cloud — TPU silicon; first-party Gemini; Anthropic also a customer.
  • Oracle Cloud — $500B Stargate buildout with OpenAI; aggressive GPU pricing; OpenAI now a multi-cloud customer.
  • CoreWeave — Microsoft's largest overflow vendor; ~67% of CoreWeave's FY2025 revenue came from Microsoft.
  • Nebius — Microsoft anchor customer (~$19.4B contracted); EU footprint.

Azure's edge: OpenAI exclusivity for many model classes (now narrowing), enterprise distribution via Microsoft 365, Maia silicon optionality. Disadvantage: OpenAI is restructuring the partnership (no longer exclusive cloud beyond a "right of first refusal" on incremental capacity), and Microsoft has had to lean on CoreWeave/Nebius for capacity it couldn't build fast enough.

4. Unique Observations

  • NDv5 H100 pricing: pay-as-you-go list $98.32/instance·hour for Standard_ND96isr_H100_v5 (8× H100) — ~$12.29/H100·hour [1][2]. 1-year reserved ~$57/instance·hour ($7.13/H100·hour); 3-year reserved $33/instance·hour ($4.13/H100·hour). Enterprise Agreements with multi-year capacity commits clear ~$2.50–$4/H100·hour for true-frontier customers — Microsoft does not publish these.
  • ND H200 v5 pricing: list $108/instance·hour ($13.50/H200·hour) for 1-week Capacity Reservations; 3-year $36/instance·hour ($4.50/H200·hour). B200 NDv6 not yet on public retail pricing as of 2026-05.
  • OpenAI deal evolving terms (Oct 2025 restructure): Microsoft's exclusivity ended; replaced with a right of first refusal on new compute capacity OpenAI procures, plus a stake in OpenAI's restructured for-profit. Microsoft committed an additional ~$250B of Azure capacity over the partnership; OpenAI, in return, must consume that capacity and share revenue [4][5][6]. The deal also explicitly authorized OpenAI to use other clouds (Oracle Stargate, Google Cloud, CoreWeave directly) — fundamentally weakening Microsoft's captive-distribution moat.
  • Capacity source mix: NVIDIA dominant (85%+ of GPU spend), Maia ramping (5–10% est.), AMD MI300X selectively deployed. Microsoft does not disclose split.
  • AI revenue share: Azure AI services reached a ~$13B annualized run-rate, per CEO Nadella's commentary on the Jan 2026 (FY2026 Q2) earnings call [8]. Against Intelligent Cloud segment revenue of ~$28B in FY2026 Q3 (Mar 2026), AI is roughly 12–15% of Azure proper; Microsoft Cloud revenue was $42.4B in Q3, +21% YoY [3]. Microsoft does not break out AI revenue precisely; the $13B figure is the most concrete public number.
  • Customer concentration: OpenAI is by far the largest customer; its precise share of Azure AI revenue is undisclosed but plausibly 40–60%. The risk profile mirrors CoreWeave's Microsoft concentration: Azure AI's growth is heavily tied to one lab, even after the Oct 2025 restructure technically loosened exclusivity. Microsoft also runs internal captive load (Copilot, Bing, M365) that by some measures is comparable in size to OpenAI's external spend.
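The per-GPU rates quoted in the pricing bullets above follow directly from the instance-level list prices (8 GPUs per ND-series VM). A minimal sketch using the document's figures, which are snapshots, not live Azure pricing:

```python
# Per-GPU hourly rates implied by instance-level list prices.
# All dollar figures are this document's cited snapshots (May 2026),
# not live Azure pricing.
GPUS_PER_VM = 8

def per_gpu_rate(instance_hourly_usd: float) -> float:
    """Convert an instance $/hour price to $/GPU-hour."""
    return instance_hourly_usd / GPUS_PER_VM

h100_rates = {
    "payg":       per_gpu_rate(98.32),  # ~$12.29/H100-hour
    "reserved_1y": per_gpu_rate(57.00), # ~$7.13/H100-hour
    "reserved_3y": per_gpu_rate(33.00), # ~$4.13/H100-hour
}
h200_rates = {
    "list":        per_gpu_rate(108.00),  # $13.50/H200-hour
    "reserved_3y": per_gpu_rate(36.00),   # $4.50/H200-hour
}
```

Note the spread: a 3-year reservation cuts the effective H100 rate to roughly a third of pay-as-you-go, and the undisclosed Enterprise Agreement rates (~$2.50–$4/H100-hour per the bullet above) sit below even that.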

5. Financials / Funding

  • Parent: Microsoft (NASDAQ: MSFT); Azure reported under "Intelligent Cloud" segment with Server Products & Cloud Services.
  • Q3 FY2026 (Jan-Mar 2026, reported Apr 2026): Microsoft Cloud revenue $42.4B (+21% YoY); Azure & other cloud services +33% YoY (CC); Azure AI services run-rate ~$13B annualized [3][8].
  • FY2025 (Jul 2024–Jun 2025) Microsoft Cloud: ~$144B (FY full year); Intelligent Cloud segment ~$105B [3].
  • OpenAI commitments: Microsoft's cumulative + new commitment runs to ~$250B of Azure capacity post-Oct 2025 restructure [6]. Microsoft has invested ~$13B in OpenAI cumulatively (2019–2023) and now holds an equity stake in the restructured for-profit entity [4][5].
  • Capex: Microsoft FY2026 capex on track for $100B+, with the "majority" going to AI infrastructure (Nadella commentary).
  • Maia / Cobalt silicon: announced Nov 2023; production volumes not disclosed.
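The ~$13B AI run-rate can be put in proportion to the cloud revenue lines above. A rough annualization sketch (multiplying one quarter by 4, which ignores intra-year growth; all inputs are the document's approximate figures):

```python
# Rough share of Microsoft's cloud revenue attributable to Azure AI.
# Inputs are the document's approximate Q3 FY2026 figures; quarterly
# revenue is annualized naively as 4x, understating a growing business.
AI_RUN_RATE_B = 13.0            # Azure AI annualized run-rate, $B
MSFT_CLOUD_Q3_B = 42.4          # Microsoft Cloud, Q3 FY2026, $B
INTELLIGENT_CLOUD_Q3_B = 28.0   # Intelligent Cloud segment, Q3, $B

ai_share_msft_cloud = AI_RUN_RATE_B / (MSFT_CLOUD_Q3_B * 4)
ai_share_segment = AI_RUN_RATE_B / (INTELLIGENT_CLOUD_Q3_B * 4)

print(f"{ai_share_msft_cloud:.1%}")  # ~7.7% of Microsoft Cloud
print(f"{ai_share_segment:.1%}")     # ~11.6% of Intelligent Cloud
```

The ~11.6% segment-level figure is consistent with the "~12–15% of Azure proper" estimate above, since Azure proper is smaller than the full Intelligent Cloud segment.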

6. People & Relationships

  • Microsoft CEO: Satya Nadella.
  • EVP, Cloud + AI: Scott Guthrie.
  • Anchor AI partner: OpenAI (Sam Altman, Greg Brockman) — restructured Oct 2025 to non-exclusive.
  • Other AI partners on Azure / Foundry: Mistral, Meta (Llama), Cohere, Stability, NVIDIA (NIM hosting).
  • NVIDIA: largest GPU supplier; Jensen + Nadella have publicly co-presented NDv5/v6 launches.
  • Overflow capacity vendors: CoreWeave (~67% of its FY2025 revenue from Microsoft), Nebius (~$19.4B contracted).
  • Competitors: AWS, Google Cloud, Oracle Cloud.

Sources

[1] https://azure.microsoft.com/en-us/pricing/details/virtual-machines/linux/ (2026-05-10)
[2] https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ndh100v5-series (2026-05-10)
[3] https://www.microsoft.com/en-us/Investor/earnings/FY-2026-Q3/press-release-webcast (2026-05-10)
[4] https://blogs.microsoft.com/blog/2025/10/28/microsoft-and-openai-evolve-partnership-to-deliver-the-next-phase-of-ai/ (2026-05-10)
[5] https://openai.com/index/built-to-benefit-everyone/ (2026-05-10)
[6] https://www.reuters.com/technology/artificial-intelligence/microsoft-openai-restructure-partnership-2025-10-28/ (2026-05-10)
[7] https://news.microsoft.com/source/features/innovation/maia-100-cobalt-100-azure-custom-silicon/ (2026-05-10)
[8] https://www.cnbc.com/2026/01/27/microsoft-q2-earnings-azure-cloud-ai.html (2026-05-10)

Last compiled: 2026-05-10