Fagen Wasanni Technologies | August 14, 2023
By Candice Clark

Amazon Web Services (AWS) has developed two custom chip families, Trainium and Inferentia, to accelerate the training and inference of large language models (LLMs) on its cloud platform. LLMs are a type of artificial intelligence that can generate text, translate languages, and answer questions in detail.

Trainium, designed for LLM training, pairs its compute cores with high-bandwidth memory to handle the large datasets these models require. Inferentia, by contrast, is an inference chip with fewer cores but greater energy efficiency, making it more cost-effective to serve trained LLMs at scale.
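For readers curious what deployment on these chips looks like in practice, the sketch below shows how a PyTorch model might be compiled for Inferentia with AWS's Neuron SDK (torch-neuronx). It is an illustration only: the toy model, input shape, and output file name are placeholder assumptions, and the code assumes it runs on an Inferentia-backed instance with the Neuron SDK installed.

```python
import torch
import torch_neuronx

# A toy feed-forward model standing in for an LLM; a real deployment
# would compile a full transformer model instead.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.GELU(),
    torch.nn.Linear(256, 128),
)
model.eval()

# Example input used to trace the model's computation graph.
example_input = torch.rand(1, 128)

# Ahead-of-time compilation targeting the NeuronCores on Inferentia.
neuron_model = torch_neuronx.trace(model, example_input)

# The result is an ordinary TorchScript module; save it for serving.
neuron_model.save("model_neuron.pt")
```

Training on Trainium follows a similar path through the same Neuron SDK, with PyTorch models placed on an XLA device via the torch-xla backend rather than traced for inference.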

While other companies such as NVIDIA and Microsoft are also building custom chips for LLM workloads, Amazon's chips have the advantage of being designed specifically for AWS's cloud...
