TechWatch

Meta’s next-generation MTIA Training and Inference AI Processor

MTIA complements and strengthens Meta’s AI infrastructure investments.

Jon Peddie
The MTIA 2 chip. (Source: Meta)

Meta builds next-generation infrastructure for generative AI, recommendations, and research. The in-house MTIA program targets efficient execution of Meta’s models and complements GPUs. MTIA v2 doubles compute and memory bandwidth over v1, increases local storage, and raises on-chip SRAM and interconnect bandwidth. An 8×8 PE grid runs at 1.35 GHz and 90 W, with PCIe Gen 5 and optional RDMA for scale. A PyTorch-aligned stack, including Triton-MTIA, compiles graphs and manages the runtime. Rack ...

Meta is building its next-generation infrastructure to support generative AI, recommendation systems, and advanced research. The company continues to expand investment as
...
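The teaser notes that MTIA’s software stack is PyTorch-aligned and uses Triton-MTIA to compile graphs and manage the runtime. For readers unfamiliar with that layer, the sketch below shows the kind of kernel the Triton programming model expresses: a block-parallel vector add launched from PyTorch. It is written against stock, public Triton on a CUDA GPU; the kernel, names, and launch parameters are illustrative, and none of it is Meta’s MTIA backend code.

```python
# Minimal, illustrative Triton kernel (vector addition) launched from PyTorch.
# This is stock, public Triton targeting a CUDA GPU -- not Meta's MTIA-specific
# code; a backend such as Triton-MTIA would compile kernels written in this
# model for the accelerator instead.
import torch
import triton
import triton.language as tl


@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide slice of the vectors.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard the ragged final block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)


def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    # Launch one kernel instance per BLOCK_SIZE elements.
    grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out


if __name__ == "__main__":
    a = torch.rand(4096, device="cuda")
    b = torch.rand(4096, device="cuda")
    assert torch.allclose(add(a, b), a + b)
```

The attraction of this approach for an in-house accelerator is that kernels stay at this hardware-agnostic level, so supporting new silicon is, in principle, a matter of supplying a new compiler backend rather than rewriting device code.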
