Author Topic: "Amazon Developing Dual Custom AI Chips for AWS Language Model Training"  (Read 275 times)

Md. Abdur Rahim

By John Callaham



For the past several months, companies like Microsoft, Google, OpenAI, and NVIDIA have been making headlines with their efforts to advance generative AI hardware and software services. One major tech company, Amazon, is trying to get in on the AI conversation as well.

CNBC reports that the company's Amazon Web Services division has been working on two custom chips, Inferentia and Trainium, which it hopes will rival NVIDIA's Grace Hopper superchips for training large language models. NVIDIA just announced its next-generation Grace Hopper platform, which should be available in 2024.
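For context on how developers actually target these chips, AWS exposes Inferentia (inference-focused) and Trainium (training-focused) through its Neuron SDK. Below is a minimal sketch, assuming the torch-neuronx package is installed on a Neuron-capable instance, of compiling a small PyTorch model ahead of time; the model, shapes, and file name are illustrative placeholders, not details from the report.

```python
import torch
import torch_neuronx  # AWS Neuron SDK integration for PyTorch (assumed installed)

# Placeholder model, standing in for whatever workload would run on the chip.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 10),
).eval()

# Example input used by the tracer to fix shapes at compile time.
example_input = torch.rand(1, 128)

# Ahead-of-time compile the model for Neuron devices (e.g. Inferentia2).
neuron_model = torch_neuronx.trace(model, example_input)

# The traced artifact can be saved and later loaded on a Neuron instance.
torch.jit.save(neuron_model, "model_neuron.pt")
```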

AWS is no stranger to making custom chips. It started 10 years ago with Nitro, and Amazon says there's now at least one Nitro chip in every one of its AWS servers.

While Amazon is trying to make its own chips for training LLMs, it is also using some NVIDIA chips for the same purpose in its AWS servers. In July, it announced that Amazon EC2 P5 instances were available to AWS users. These servers are powered by NVIDIA H100 Tensor Core GPUs.
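For readers curious what requesting one of these H100-backed servers looks like in practice, here is a minimal sketch using the boto3 SDK. The AMI ID, key pair name, and region are placeholders, and the p5.48xlarge size is the instance type AWS lists for the P5 family; actual use would also depend on account quotas.

```python
import boto3

# Placeholder region; P5 capacity is only available in select regions.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder Deep Learning AMI ID
    InstanceType="p5.48xlarge",       # EC2 P5 instance type backed by H100 GPUs
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",            # placeholder key pair name
)

# Print the ID of the newly launched instance.
print(response["Instances"][0]["InstanceId"])
```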

Amazon told CNBC that "over 100,000" of its customers use AWS for machine learning. While that is just a fraction of the company's overall AWS customer base, more and more of them could adopt Amazon's solution as generative AI expands into more industries.

Source:  Neowin LLC.
Original Content: https://shorturl.at/bmsy4