Future AI hardware advances could drive materials science R&D

For all its promise, artificial intelligence is still in something of an adolescent phase: the technology has shown brilliance at well-defined tasks but lacks maturity.

While generative AI software such as GPT-4 has captured the public’s imagination, the hardware needed to unlock AI’s true potential is lagging. “I think the hardware is not there yet,” said Pradyumna (Prady) Gupta, Ph.D., founder and chief scientist at Infinita Lab, which provides testing services for R&D initiatives. While the latest crop of graphics processing units (GPUs) enables the creation of trillion-parameter-scale deep learning models, Gupta believes that the memory component of hardware continues to be “the real bottleneck” in AI.

Breaking through the wall that limits AI’s potential

In computer architecture, this phenomenon is known as the “memory wall”: processor performance has grown faster than the speed of the memory that feeds it. “The trend of placing more and more cores on chip exacerbates the situation, since each core enjoys a relatively narrower channel to shared memory resources,” explain Sally A. McKee and Robert W. Wisniewski in the Encyclopedia of Parallel Computing.
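To make the idea concrete, here is a minimal roofline-style sketch in Python. The peak-compute and bandwidth figures are illustrative assumptions, not measurements of any particular chip; an operation “hits the wall” when it performs too few arithmetic operations per byte of memory traffic to keep the cores busy.

```python
# Minimal sketch of the "memory wall": an operation is memory-bound when its
# arithmetic intensity (FLOPs per byte of traffic) falls below the hardware's
# compute-to-bandwidth ratio. All figures are illustrative assumptions.

PEAK_FLOPS = 100e12  # assumed peak compute: 100 TFLOP/s
PEAK_BW = 2e12       # assumed memory bandwidth: 2 TB/s
BALANCE = PEAK_FLOPS / PEAK_BW  # FLOPs/byte needed to stay compute-bound

def intensity_vector_add(n, bytes_per_elem=4):
    # c = a + b: n FLOPs against 3n elements of traffic (read a and b, write c)
    return n / (3 * n * bytes_per_elem)

def intensity_matmul(n, bytes_per_elem=4):
    # n x n matmul: ~2n^3 FLOPs against ~3n^2 elements of traffic (ideal reuse)
    return (2 * n**3) / (3 * n**2 * bytes_per_elem)

for name, ai in [("vector add", intensity_vector_add(10**8)),
                 ("1024x1024 matmul", intensity_matmul(1024))]:
    verdict = "compute-bound" if ai > BALANCE else "memory-bound: hits the wall"
    print(f"{name}: {ai:.2f} FLOPs/byte -> {verdict}")
```

Elementwise operations, which are everywhere in deep learning, sit far below the balance point, which is why faster logic alone does not remove the bottleneck.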

As AI models grow more complex, they must process massive amounts of data, which in turn demands lower-latency, higher-bandwidth memory. “Obviously, there are better logic chips coming up, but the memory is still the same old memory,” Gupta said. “There are new [memory] technologies coming up, and when they arrive, I think it will be a real revolution.”
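A back-of-envelope calculation shows why. To generate a single token, a large language model must stream roughly all of its weights from memory, so bandwidth, not raw compute, often caps throughput. The sketch below assumes a trillion-parameter model in 16-bit precision and an accelerator with roughly 3.4 TB/s of memory bandwidth; both figures are illustrative.

```python
# Back-of-envelope sketch: token generation streams roughly every weight once
# per token, so memory bandwidth bounds the token rate. Numbers are assumptions.

params = 1e12            # trillion-parameter model, as discussed in the article
bytes_per_param = 2      # fp16/bf16 weights
bandwidth = 3.4e12       # assumed accelerator memory bandwidth, bytes/s

bytes_per_token = params * bytes_per_param
tokens_per_sec = bandwidth / bytes_per_token
print(f"Weights streamed per token: {bytes_per_token / 1e12:.1f} TB")
print(f"Bandwidth-limited rate: ~{tokens_per_sec:.1f} tokens/s per device")
```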

Growing AI interest could spur demand for next-gen memory. Gartner predicts that demand for high-bandwidth memory will surge eightfold from 2022 to 2027.

Pushing the limits of hardware

“The way you think is how a lot of people, even in the industry making chips, are thinking. They believe that this might be the end, which is surprising,” Gupta said, referring to Moore’s Law, the observation that the number of transistors on a microchip tends to double roughly every eighteen months to two years. “Companies like TSMC, which is the most advanced in terms of technology, are making chips that go down to three nanometers. The A14 chip in the iPhone 12 and the A15 chip in the iPhone 13, for instance, are manufactured using TSMC’s 5nm process.

“TSMC has a roadmap to surpass one nanometer and even go below that. Intel also has plans for sub-nanometer technology,” Gupta said. “When they reach that level, they’ll be playing with individual atoms.”
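The compounding at work is simple to state: if transistor counts double every T years, the count after t years is count0 × 2^(t/T). A short sketch of the two commonly quoted doubling periods (the starting count is an illustrative assumption):

```python
# Compounding in Moore's Law: doubling every `period` years multiplies the
# transistor count by 2**(t / period) after t years. Figures are illustrative.

count0 = 1e10  # assumed starting point: ~10 billion transistors
for period in (1.5, 2.0):  # the 18-month and 2-year variants
    growth = 2 ** (10 / period)
    print(f"Doubling every {period} years: ~{growth:.0f}x in a decade")
```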

Gupta underscored the significance of this progress: “Ten years ago, no engineer in the semiconductor industry, not even the people making the chips, would have thought that we’d come this close to the atomic level, like the three-nanometer technology we have now.”

A unique perspective

As a former senior scientist at Corning who helped develop the Gorilla Glass used in the first smartphones, Gupta has a unique perspective on the intersection of materials science and technology. “If you think about glass, it breaks when you drop it, right?” he said. But the goal of Gorilla Glass, a product that continues to evolve across generations, is to engineer a glass that can survive a drop. “You’re making something out of glass which is very different from its fundamental nature,” Gupta said.

In the world of materials science and deep tech, the R&D process can pose a unique set of challenges compared to software development. “In the case of material science, pharma, or any field involving physical sciences where you have to create something, it’s much different,” Gupta said. “You follow the same loop: you think about something, design an experiment and make a physical sample. Many things can go wrong. You take something, and the real world gives you something else. It’s not a computer program; it’s a real thing.”

Democratizing access to testing resources

But even as computing power continues to advance, Gupta argues that, in addition to memory limitations, access to cutting-edge testing equipment remains a major bottleneck for innovation. That’s where his company, Infinita Lab, comes in.

“We have a network of about 2,000 labs across the U.S., and that number is increasing,” Gupta said. “What we’re doing is democratizing access to an almost unlimited number of material testing services at a very low cost. The issue we solve is that a startup or individual engineer often needs to do certain tests, but either doesn’t have the equipment in-house, doesn’t know where to go to get the test done, or the place they need is inaccessible, like a university lab that’s hard to access from outside.”

Infinita Lab provides a variety of lab testing services to evaluate materials, products and processes. For instance, it offers metrology testing using microscopy and spectroscopy, materials testing to assess the properties of substances such as metals and polymers, and product testing under normal and stressed conditions.

To offer such variety, it partners with labs across the U.S. “You could say [the model is] like Amazon or Uber,” he said. That is, the company makes a product or service available, but doesn’t necessarily own the inventory.

Shaping the future world

Returning to the theme of artificial intelligence, Gupta sees significant potential for the technology in materials science. “There are a lot of unknown things in this process that you can potentially run through AI,” he said. “AI could have a significant impact on every part of the process, from designing experiments to analyzing data and finding out whether a particular sample preparation or composition should be tried out. All these questions would be answered in a much more efficient and optimal way by AI, which would propel R&D at a much higher velocity.”
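What might that look like in practice? One common pattern is closed-loop, surrogate-guided experiment selection: fit a model to the results gathered so far, then test next wherever predicted payoff plus uncertainty is highest. The sketch below is a toy illustration using scikit-learn’s Gaussian process regressor with an upper-confidence-bound rule; the “experiment” is a made-up function standing in for real lab work, and nothing here describes Infinita Lab’s actual methods.

```python
# Toy sketch of AI-in-the-loop experiment design: a surrogate model chooses
# the next sample composition to test. The objective is a hypothetical
# stand-in for a physical experiment; all specifics are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def run_experiment(x):
    # Pretend lab measurement, e.g. strength vs. one composition variable,
    # with a little noise. Assumed optimum at x = 0.62.
    return float(-(x - 0.62) ** 2 + 0.05 * rng.standard_normal())

candidates = np.linspace(0, 1, 200).reshape(-1, 1)  # compositions to consider
X = [[0.1], [0.9]]                                  # two initial samples
y = [run_experiment(x[0]) for x in X]

for _ in range(8):  # each iteration stands in for one lab experiment
    surrogate = GaussianProcessRegressor().fit(X, y)
    mean, std = surrogate.predict(candidates, return_std=True)
    nxt = float(candidates[int(np.argmax(mean + 1.5 * std))][0])  # UCB pick
    X.append([nxt])
    y.append(run_experiment(nxt))

best = X[int(np.argmax(y))]
print(f"Best composition tried: {best[0]:.2f} (assumed optimum: 0.62)")
```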

Author: Rayne Chancer