Nvidia Unveils Language Processing Unit in $20 Billion Groq Acquisition Debut
Chipmaker introduces first product from its largest-ever deal as memory supplier Micron's stock falls despite tripling revenue, signaling investor caution on AI infrastructure bets.

Nvidia has introduced its first chip architecture derived from the $20 billion acquisition of startup Groq, marking the company's largest purchase to date and a strategic expansion beyond its dominant graphics processing unit franchise.
The new chip, branded as a Language Processing Unit or LPU, represents a fundamentally different design philosophy from Nvidia's signature GPU architecture. While GPUs deploy thousands of cores operating in parallel, the Groq 3 LPU employs a single-core design optimized to accelerate inference performance, according to details disclosed during CEO Jensen Huang's keynote address on Monday.
The December acquisition of Groq brought Nvidia technology specifically engineered for inference workloads—the phase where trained AI models generate responses—rather than the training phase that has driven the company's recent growth. The LPU architecture aims to address bottlenecks in language model deployment that multi-core GPU designs were not originally built to solve.
The announcement comes as investor enthusiasm for AI infrastructure spending shows signs of cooling. Memory chipmaker Micron reported tripling its revenue and exceeding analyst forecasts in quarterly results released Wednesday, yet its stock declined in a pattern that echoed the muted response to Nvidia's own better-than-expected February earnings.
(The Groq deal represents Nvidia's most significant move to diversify its product portfolio beyond training-focused GPUs. Separately, private equity firm Bain Capital has begun soliciting buyers for its stake in Bridge Data Centres as demand for AI infrastructure assets remains elevated.)
Nvidia's expansion into specialized inference chips reflects mounting competitive pressure as hyperscale cloud providers including Amazon, Google, and Microsoft develop proprietary silicon to reduce dependence on external suppliers. The LPU introduction signals Nvidia's intent to defend market share across the full AI compute stack rather than cede the inference segment to rivals or in-house alternatives.
Meanwhile, Chinese technology conglomerate Alibaba reported a 66 percent decline in net income for its December quarter, missing revenue expectations as Beijing's regulatory environment and macroeconomic headwinds continue to weigh on the country's tech sector.
Sources
https://www.cnbc.com/2026/03/20/nvidia-gtc-2026-agentic-ai-chips-tech-download.html
Focuses on LPU architecture details and contrasts single-core design with traditional multi-core GPU approach
