Meta AI Chips Roadmap Through 2027
Meta's AI chip roadmap through 2027 expands in-house MTIA capacity and could prompt investors to reweight vendor exposure and revisit capital-spending assumptions.

KEY TAKEAWAYS
- Meta laid out four MTIA generations (300, 400, 450, and 500), with MTIA 300 deployed a few weeks before the announcement.
- New generations are planned roughly every six months, concluding by end-2027.
- The program aims to expand data-center capacity and reduce reliance on third-party silicon vendors.
On March 11, 2026, Meta Platforms (META) unveiled a roadmap for four AI chips in its Meta Training and Inference Accelerator (MTIA) family through the end of 2027. The chips are intended to support growing AI workloads, expand data-center capacity, and reduce reliance on third-party silicon.
Roadmap and Timeline
The MTIA family now includes MTIA 300, MTIA 400, MTIA 450, and MTIA 500. Meta deployed the MTIA 300 a few weeks before this announcement. The designs handle both AI training and inference workloads. Meta first revealed the MTIA program in 2023 and released a second generation in 2024. The company plans to introduce new generations roughly every six months following MTIA 300.
Data Centers and Supply Chain
Meta's in-house AI chips are intended to expand its data-center capacity while reducing dependence on external accelerators from vendors such as Nvidia and AMD. Coverage to date has not disclosed performance, capacity, or cost-savings metrics for the new chips, and no regulatory approvals, antitrust inquiries, or export-control actions related to the program were reported. Multiple secondary sources provided consistent details, but no company SEC filings, press releases, or official transcripts appeared in the 72-hour reporting window. The shift toward internalizing accelerators signals a multi-year change in Meta's capacity planning and supplier relationships, with implications for capital spending and vendor contracts.
