Amazon Reveals Latest AI Model Training and Inferencing Chips

At AWS re:Invent today, Amazon Web Services (AWS) announced two next-generation chips—AWS Graviton4 and AWS Trainium2.

AWS Graviton4 and AWS Trainium2 prototype

Amazon’s new cutting-edge chips promise enhanced price performance and energy efficiency across various customer workloads, including machine learning (ML) training and generative AI apps.

The chips complement existing chip/instance combinations from third parties like AMD, Intel, and NVIDIA, enabling customers to run diverse applications on Amazon Elastic Compute Cloud (Amazon EC2).

The Graviton4 chip boasts up to 30% better compute performance, 50% more cores, and 75% more memory bandwidth than its predecessor, the Graviton3.

This enhancement delivers optimal price performance and energy efficiency across a wide spectrum of workloads on Amazon EC2.

Meanwhile, the Trainium2 chip delivers up to 4x faster training than its predecessor.

Its deployment in EC2 UltraClusters, accommodating up to 100,000 chips, accelerates training for foundation models (FMs) and large language models (LLMs).

Additionally, Trainium2 improves energy efficiency by up to 2x.


AWS presently offers over 150 Graviton-powered Amazon EC2 instance types globally, serving more than 50,000 customers, including renowned names like Datadog, DirecTV, Discovery, Formula 1 (F1), and many more.

These customers utilize Graviton-based instances across various workloads, from databases and analytics to web servers and ad serving, benefiting from improved price performance.

Graviton4 processors debut in memory-optimized Amazon EC2 R8g instances, bringing those gains—30% better compute, 50% more cores, and 75% more memory bandwidth than Graviton3—to memory-intensive workloads.

The Graviton4-powered R8g instances are available in preview, with general availability expected in the coming months.
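For readers who want to try a Graviton-based instance once R8g reaches their account, here is a minimal sketch of the parameters you would pass to the EC2 RunInstances API (for example via boto3's `ec2.run_instances(**params)`). The size name `r8g.large` is an assumption based on AWS's usual R-family naming—the announcement only names the R8g family—and preview access is required.

```python
# Sketch: build a RunInstances parameter dict for a Graviton4 (R8g) instance.
# "r8g.large" is an ASSUMED size name following AWS's naming convention;
# only the R8g family itself was named in the announcement.

def build_run_request(ami_id, instance_type="r8g.large"):
    """Return a RunInstances parameter dict for a single Graviton instance."""
    return {
        "ImageId": ami_id,             # must be an arm64 (Graviton) AMI
        "InstanceType": instance_type,
        "MinCount": 1,                 # launch exactly one instance
        "MaxCount": 1,
    }

params = build_run_request("ami-0123456789abcdef0")
print(params["InstanceType"])  # r8g.large
```

Note that Graviton is an Arm-based processor, so the AMI must be built for the arm64 architecture; an x86_64 image will be rejected at launch.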
