“What Nvidia’s really good at is creating new markets,” Nvidia CEO Jensen Huang told Goldman Sachs CEO David Solomon at a technology conference hosted by the investment bank Wednesday. Nvidia investors and AI stakeholders continue to hang on Huang’s every word since the company’s earnings report in August, which looked less than blockbuster only by comparison with its previous blowouts.
Though Nvidia’s chips may seem ubiquitous today, Huang said the company had to spread the gospel of graphics processing unit-driven computing “one industry at a time.” That part is clear to investors, as Nvidia’s eye-watering revenues and total AI infrastructure spending estimates surpassing $1 trillion offer ample evidence.
Solomon asked the most important question in tech today: Where is the return on all this investment in AI infrastructure?
Huang has addressed similar questions before, but on Wednesday he offered more math in a two-fold response.
First, in the age of generative AI, Huang said, cloud providers, who buy Nvidia (and other) GPUs and rent them to tech companies, make $5 for every $1 they spend.
“Everything is all sold out. And so the demand for this is just incredible,” he continued.
Second, for the customers of those cloud providers, who essentially rent computing time on GPUs, Huang said if companies convert traditional data processing work to accelerated computing methods, the incremental cost may indeed double, but the job would be done 20 times faster. “So you get a 10x savings,” Huang said, adding, “It’s not unusual to see this ROI.”
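Huang’s arithmetic can be sketched in a few lines. The 2x cost and 20x speedup figures are his; the function and variable names here are illustrative, not anything Nvidia publishes:

```python
def cost_ratio(cost_multiplier: float, speedup: float) -> float:
    """Total cost of an accelerated job relative to the traditional one.

    If a job costs `cost_multiplier` times as much per unit of compute time
    but finishes `speedup` times faster, total spend scales by their ratio.
    """
    return cost_multiplier / speedup

# Huang's example: cost doubles (2x), job finishes 20x faster.
ratio = cost_ratio(cost_multiplier=2.0, speedup=20.0)   # 0.1
savings = 1 / ratio                                     # 10.0
print(f"{savings:.0f}x savings")                        # prints "10x savings"
```

The point of the ratio is that a higher hourly price can still be a net saving when the wall-clock time shrinks by a larger factor.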
Data center of the future
Huang’s overarching answer to the question of when AI’s ROI will be evident urges companies to “accelerate everything.”
“Any large-scale processing of any large amounts of data — you’ve got to accelerate that,” he said.
Huang argued that upgrading existing data centers to “accelerated computing,” or the parallel computing that Nvidia GPUs and other AI chips enable, is inevitable.
With price tags for some models reaching into the millions, Nvidia server racks sound expensive, Huang said, but they replace “thousands of nodes” of traditional computing. Smaller, more densely packed, liquid-cooled data centers are the future, he said.
“Densified” data centers will be more energy- and cost-efficient, Huang said, adding, “That’s the next ten years.”
Nvidia’s stock price rallied less than an hour after Huang’s conversation with Solomon began.
Even if Nvidia’s claims that AI computing is more energy efficient are true, the AI boom is expected to put massive pressure on electricity grids.
Huang acknowledged that the stakes are high for all involved in the AI boom.
“Demand is so great that delivery of our components and our technology and our infrastructure and software is really emotional for people because it directly affects their revenues,” Huang said.