Arm Breaks 40-Year Licensing Model, Enters Chip Manufacturing With Meta as Anchor Client
The UK semiconductor designer's first in-house CPU marks a strategic pivot that could reshape competitive dynamics across the AI chip supply chain.

Arm Holdings has unveiled its first in-house chip after four decades of licensing its designs to others, a strategic shift that positions the UK-based semiconductor architect as both supplier to and competitor with its traditional customer base.
The company announced Tuesday that its AGI CPU—a data center processor optimized for AI inference workloads—will enter volume production in the second half of 2026, with Meta Platforms serving as lead partner and co-developer. The chip, fabricated by Taiwan Semiconductor Manufacturing Co. on its 3-nanometer process, features up to 136 cores per CPU and supports 64 CPUs per air-cooled server rack. Arm expects the product line to add billions of dollars in annual revenue.
Meta's role extends beyond first customer. The social media giant co-designed the chip and plans to collaborate on "multiple generations" of the data center CPUs, according to announcements from both companies. Other customers lined up for the AGI CPU include OpenAI, Cloudflare, SAP, SK Telecom, and Cerebras. Arm cloud AI head Mohamed Awad told CNBC the company aims to serve firms that cannot afford in-house processor development.
"It's back, and it works, and it's doing everything we thought it would," Arm CEO Rene Haas said, referring to test chips that have met performance expectations.
The move comes as cloud service providers increasingly develop custom ASICs internally. TrendForce forecasts that ASIC-based AI servers will represent 27.8 percent of all AI server shipments in 2026, rising to nearly 40 percent by 2030. Arm's entry into manufacturing appears designed to capture revenue from companies unable or unwilling to design proprietary silicon, even as hyperscalers like Google and Amazon expand internal chip programs.
Arm is majority-owned by SoftBank. Financial terms of the Meta partnership were not disclosed, nor was the volume of chips Meta plans to deploy. Awad's cloud AI unit will oversee the AGI CPU line, with successor designs planned at 12- to 18-month intervals.
The announcement notably excluded Qualcomm, which claimed "complete victory" over Arm in a court ruling last fall concerning licensing agreement terms. Major Arm customers including Amazon Web Services, Microsoft, Google, Marvell, Nvidia, and Samsung issued congratulatory statements, but Qualcomm's absence underscores lingering tensions. Arm's shift from pure licensing to direct chip sales raises questions about whether the company will eventually compete for market share against the same firms that have historically paid for its designs.
Arm claims the AGI CPU delivers twice the performance per watt of traditional x86 CPUs while reducing memory bottlenecks, leveraging the architecture's longstanding efficiency advantages. The chip is built on the Neoverse platform that also underpins AWS Graviton, Nvidia Vera, and Microsoft AI processors. Arm is also partnering with server manufacturers including Lenovo and Quanta Computer to offer complete systems. Wall Street expects Arm to generate $4.91 billion in revenue for its current fiscal year, according to LSEG estimates.
Sources
https://www.reuters.com/business/media-telecom/arm-unveils-new-ai-chip-expects-it-add-billions-annual-revenue-2026-03-24/
Focuses on revenue expectations and strategic shift from licensing-only model, with technical specifications and production timeline.
https://www.theverge.com/ai-artificial-intelligence/899823/arm-agi-cpu-meta
Emphasizes Meta's co-development role and Qualcomm's conspicuous absence from congratulatory statements amid ongoing legal tensions.
https://gizmodo.com/arm-lends-a-hand-launches-in-house-ai-chip-with-meta-as-its-first-customer-2000737555
Questions whether Arm's manufacturing pivot will eventually threaten customers' market share as company moves from ancillary to competitive role.
https://www.eetasia.com/trendforce-as-csps-increasingly-self-develop-asics-nvidia-boosts-portfolio-for-ai-training-inference/
Provides market context showing CSPs' growing ASIC development, with ASIC-based AI servers forecast to reach 40% of shipments by 2030.
