Meta & Arm Forge the 3nm AGI Brain: A Chip to Power AI Racks Like a Crypto Mining Rig
Meta dropped the news Tuesday that it's teaming up with Arm to forge a new breed of CPUs, designed to handle AI workloads and general compute across its ever-growing data-center sprawl. Dubbed the Arm AGI CPU, this first piece of silicon is being pitched as a more efficient, less power-hungry alternative to the legacy server processors currently sweating in AI infrastructure—think of it as swapping out a gas-guzzler for an electric truck in the middle of a cross-country haul.
The social media giant claims this chip will crank up the performance per rack and support gigawatt-scale AI deployments, which is a cornerstone of its relentless march toward more advanced, and presumably more opinionated, AI systems. The AGI CPU will cozy up next to Meta’s own custom MTIA accelerators, adding yet another specialized tool to its diversified hardware stack for the twin rituals of AI training and inference.
This hardware hustle is part of a recent infrastructure blitz: back in February, Meta inked a long-term deal with AMD for up to 6 GW of Instinct GPUs, and just this month, Reuters reported a roadmap for four new in-house AI chips as the company goes full degen on scaling its data centers.
According to that Reuters report, the AGI CPU marks Arm’s first major foray into building its own data-center chip, a significant pivot from its usual business model of licensing blueprints and collecting royalties. Meta is the lead design partner on the project, TSMC will be the master fabricator on its cutting-edge 3-nanometer process node, and volume production is scheduled to commence in the second half of 2026. So mark your calendars: the future is coming, but not immediately.
Arm is marketing the AGI CPU for the so-called “agentic AI” era, where the CPU acts as the grand orchestrator, managing accelerators, memory, storage, networking, and distributed AI tasks. In a reference air-cooled rack, a standard config holds 30 blades and delivers 8,160 cores; a liquid-cooled Supermicro design can push that past 45,000 cores per rack. Arm boldly claims the chip delivers more than double the performance per rack of current x86 systems, which could save up to $10 billion in capital expenditure for every gigawatt of AI data-center capacity—numbers so big they'd make a memecoin founder blush.
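For the spreadsheet-inclined, here is a quick back-of-the-envelope sketch using only the rack figures quoted above; the per-blade count and density multiple are derived arithmetic, not numbers Arm published:

```python
# Back-of-the-envelope check on Arm's quoted rack figures.
# All inputs come from the announcement; the derived values are our own arithmetic.

AIR_COOLED_BLADES = 30        # blades per reference air-cooled rack
AIR_COOLED_CORES = 8_160      # total cores in that rack
LIQUID_COOLED_CORES = 45_000  # claimed minimum cores in the liquid-cooled Supermicro rack

cores_per_blade = AIR_COOLED_CORES // AIR_COOLED_BLADES   # 272 cores per blade
density_gain = LIQUID_COOLED_CORES / AIR_COOLED_CORES     # ~5.5x denser

print(f"Cores per blade: {cores_per_blade}")
print(f"Liquid-cooled density gain: {density_gain:.1f}x")
```

In other words, the liquid-cooled design packs roughly five and a half times the cores of the air-cooled reference rack, which is where the gigawatt-scale capex math starts to get interesting.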
This brainy CPU won’t be exclusive to Meta’s walled garden. Launch partners already lined up include OpenAI, Cloudflare, SAP, SK Telecom, Cerebras, and others. In a move that feels almost altruistic for a tech giant, Meta also plans to open-source its board and rack designs for the CPU through the Open Compute Project later this year, a strategy that could turbocharge adoption among data-center builders faster than a viral token launch.
At the time of the announcement, the market reaction was about as enthusiastic as a trader watching a sideways chart: Meta shares were trading around $595.20, down 1.5% on the day, while Arm shares hovered near $135.20, down 1.2%.
Disclaimer: This content is for information and entertainment purposes only. It does not constitute financial, investment, legal, or tax advice. Always do your own research and consult with qualified professionals before making any financial decisions.