Microsoft’s Maia 200 AI chip takes a swing at Nvidia’s CUDA lock-in as Azure rollout starts


San Francisco, January 26, 2026, 09:12 PST

  • Microsoft revealed Maia 200, its second-generation in-house AI inference chip, along with new software tools for programming it.
  • The company said initial deployments kick off this week in Iowa, followed by Arizona.
  • Microsoft is promoting Maia 200 as the platform powering OpenAI’s newest GPT-5.2 models and additional services within Azure.

Microsoft revealed its Maia 200 chip on Monday, marking the second generation of its custom AI hardware. The company also introduced new software tools designed to chip away at Nvidia’s lead among developers.

The shift matters because the cost of running generative AI systems is climbing rapidly, with cloud providers scrambling to manage both the availability and the pricing of the hardware behind them. Nvidia remains the leader in AI computing, largely because so many developers rely on its CUDA software platform.

Microsoft designed Maia 200 specifically for “inference” — the phase when a trained model generates answers — not for training itself. Inference often drives up daily costs for chatbots and assistants since they create output token by token, with each token representing a small piece of text.

The company announced that Maia 200 will go live this week at a data center near Des Moines, Iowa, with plans for a second location near Phoenix, Arizona soon after. This model succeeds Maia 100, which Microsoft launched in 2023.

Microsoft revealed in a blog post that Maia 200 is built on TSMC’s cutting-edge 3-nanometer process, aiming to slash the cost of “AI token generation.” The chip packs 216GB of HBM3e memory to keep data flowing smoothly, plus 272MB of on-chip SRAM, which boosts performance when multiple users access a model simultaneously. With over 140 billion transistors under the hood, Maia 200 delivers more than 10 petaFLOPS in FP4 and over 5 petaFLOPS in FP8—both lower-precision formats that speed up AI tasks.
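Back-of-envelope arithmetic shows why the memory capacity and the low-precision formats go together: at 4 bits per value, FP4 halves the footprint of FP8 and quarters that of FP16. This sketch assumes decimal gigabytes and ignores KV-cache and activation memory, which in practice consume a substantial share:

```python
# Rough capacity estimate: how many model parameters fit in 216 GB of HBM
# at different precisions. Ignores KV cache, activations, and overhead.
HBM_BYTES = 216 * 10**9  # 216 GB, decimal gigabytes assumed

BYTES_PER_PARAM = {
    "FP16": 2.0,
    "FP8": 1.0,
    "FP4": 0.5,  # 4 bits per parameter
}

for fmt, b in BYTES_PER_PARAM.items():
    params = HBM_BYTES / b
    print(f"{fmt}: ~{params / 1e9:.0f}B parameters")
# FP16 ~108B, FP8 ~216B, FP4 ~432B parameters of raw weights
```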

Microsoft directly compared its new chip against competitors this time. The Maia 200 reportedly offers roughly three times the FP4 performance of Amazon’s third-gen Trainium and surpasses Google’s seventh-gen TPU in FP8 performance. Microsoft also claimed a 30% boost in performance per dollar compared to the newest hardware in its own fleet.
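A 30% gain in performance per dollar translates into roughly a 23% drop in the cost of a fixed workload, since cost per unit of work is the reciprocal of performance per dollar. A quick check:

```python
# If a chip delivers 1.3x the work per dollar, the cost of a fixed amount
# of work (e.g., generating a million tokens) falls by 1 - 1/1.3.
perf_per_dollar_gain = 1.30
cost_reduction = 1 - 1 / perf_per_dollar_gain
print(f"cost per token falls by ~{cost_reduction:.1%}")  # ~23.1%
```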

Microsoft’s announcement focused heavily on software. The company is rolling out a Maia software development kit that works with PyTorch, the popular AI framework, bundling a Triton compiler and a kernel library. Triton, an open-source project with significant input from OpenAI, is pitched as an alternative to the low-level optimizations developers typically handle with CUDA.

Scott Guthrie, executive vice president of Microsoft’s Cloud and AI division, claimed Maia 200 can “run today’s largest models” and still has capacity to scale. Microsoft plans to deploy Maia 200 to power OpenAI’s GPT-5.2 and other models within Microsoft Foundry and Microsoft 365 Copilot.

This chip positions Microsoft alongside Amazon and Google, both of which have been developing their own AI processors for cloud service clients. Nvidia is moving forward with its next “Vera Rubin” platform, while Microsoft’s chip relies on an older generation of high-bandwidth memory compared to what Nvidia plans for its upcoming models.

Silicon isn’t usually the toughest challenge. Developers have invested years in CUDA’s code and tools, so Microsoft must prove its Triton-based stack is dependable, speedy, and can smoothly handle large-scale workload migration. Plus, the rollout begins in select regions, meaning capacity—not just speed—will be under close scrutiny.

Microsoft announced that Maia 200 will back several models, including GPT-5.2. The Superintelligence team plans to leverage these chips for synthetic data generation and reinforcement learning while developing their next-gen models.
