Amazon (AMZN) released a new AI chip on Tuesday, marking the latest move by a tech giant to challenge leading chipmaker Nvidia (NVDA) by introducing custom chips that can handle some AI tasks at lower prices.
Amazon said its servers equipped with Trainium3 chips are four times faster and more energy efficient than those with its previous-generation chips.
“Trainium already represents a multibillion-dollar business today and continues to grow really rapidly,” Amazon Web Services CEO Matt Garman said Tuesday during the company’s annual re:Invent event.
The tech giant is one of several Nvidia customers developing its own AI semiconductors, likely the largest competitive threat to the leading chipmaker.
Google (GOOG) unveiled its latest AI chip, the seventh-generation Ironwood TPU (tensor processing unit), in early November, and the company is reportedly in talks to supply Meta (META) with billions of dollars’ worth of its TPUs in addition to a recent multibillion-dollar deal with Anthropic (ANTH.PVT). Meanwhile, Microsoft (MSFT) is aiming to eventually rely on its own custom chips rather than Nvidia’s, though it has faced delays in developing in-house silicon.
“Diversity of chips in the AI market is a good thing,” AWS vice president of compute and machine learning Dave Brown told Yahoo Finance in an interview.
Amazon has been aggressively scaling up its custom AI hardware over the past year. The company recently completed a massive AI data center project, dubbed Project Rainier. AI developer and OpenAI rival Anthropic is set to use 1 million of Amazon’s custom chips from that project and its other data centers by the end of 2025. Brown said Anthropic has worked closely with Amazon to develop its AI chips.
Amazon’s chips are much cheaper than Nvidia’s: AI developers who use them can see cost savings of 30% to 40%, according to Brown.
“That’s what our customers are looking for, is to constantly get more compute and more performance, and then super importantly, at a lower price,” he said.
Nvidia CEO Jensen Huang has suggested that customers would prefer the dominant chipmaker’s GPUs (graphics processing units) even if alternatives were free because they are so much more powerful than rival chips. AI developers also typically choose Nvidia’s chips because of the software stack that comes with them.
Amazon itself is a large customer of Nvidia. More than 10% of Amazon’s capital expenditures go toward Nvidia’s products, and the company accounts for 7.5% of the chipmaker’s revenue, according to Bloomberg data.
OpenAI recently inked a $38 billion deal with Amazon to use Nvidia’s chips through Amazon’s cloud platform.
When asked about OpenAI transitioning to use Amazon’s chips, Brown left open the possibility, speaking broadly of AWS customers: “They may say, ‘Hey, we’re going to stay on Nvidia.’ But if they can see a meaningful price performance benefit … then normally we see that they’ll definitely work on moving [to Amazon’s chips].”
Amazon also said Tuesday it’s working on developing its next-generation Trainium4 AI chips that, notably, will be designed for compatibility with Nvidia’s networking technology, NVLink Fusion. The coveted tech connects chips within AI server racks.
Amazon Trainium 3 chip. (Photo: Amazon)
Laura Bratton is a reporter for Yahoo Finance. Follow her on Bluesky @laurabratton.bsky.social. Email her at laura.bratton@yahooinc.com.