How AI Expansion Helps Companies Like RunPod Find New Niches to Make Money

Many of the chatbots and other AI systems now in the spotlight require arrays of powerful chips to run, which is why they tend not to live on your smartphone or laptop but in the cloud: huge, network-connected warehouses of server computers located far away. But existing cloud infrastructure isn't necessarily optimized for this new kind of workload, a problem that small New Jersey startup RunPod is tackling.

RunPod is building deliberately AI-centric cloud technology: the right kind of processor chips paired with tools that support developers in building AI systems. That combination could appeal to developers and third-party companies interested in building and testing their own custom AI services in the cloud.

The company just announced that it has raised $20 million in seed funding in a round led by Intel Capital and Dell Technologies Capital. Both are well-known names in the tech industry, which lends weight to the notion that RunPod really is on to something here. RunPod's press release describes the company as the "first cloud provider for AI applications" and a "launchpad that empowers developers to deploy custom full-stack AI applications." Full-stack usually means an entire application: the front end is the client-facing piece, and the back end is the server-side business logic that makes it work.
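To make that front-end/back-end split concrete, here is a minimal, purely hypothetical sketch of the back half of a full-stack AI app, written in Python with Flask. The route name and the fake_generate placeholder are illustrative assumptions, not a description of RunPod's actual stack.

# Hypothetical back end of a minimal "full-stack" AI app (illustrative only).
# A client-facing front end (web page or mobile app) would POST a prompt here.
from flask import Flask, jsonify, request

app = Flask(__name__)

def fake_generate(prompt: str) -> str:
    # Placeholder for a real model call; a production back end would invoke
    # a GPU-backed inference server or hosted model here.
    return f"Echo: {prompt}"

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.get_json(force=True).get("prompt", "")
    return jsonify({"reply": fake_generate(prompt)})

if __name__ == "__main__":
    app.run(port=8000)

A front end would then send JSON such as {"prompt": "Hello"} to /generate and display the reply, which is all the "full stack" amounts to at its simplest.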

The catch is that off-the-shelf cloud systems, such as those Amazon offers through Amazon Web Services, are generally set up as general-purpose machines. This makes sense, since the developers and third-party companies that pay for time on these cloud facilities use them for a huge array of purposes, from running databases to crunching through complex computations. But an AI system running on that kind of general-purpose infrastructure can suffer from problems such as latency, the delay between submitting a query and receiving a response, because hardware and software that weren't designed for AI workloads handle them inefficiently. Developers running AI apps in the cloud would also find their work easier if the service included premade AI tools that simplify parts of the process of building an AI app.
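To see what latency means in practice, the short Python sketch below times a single round trip to a hosted model endpoint; the URL and payload are placeholders assumed for illustration, not a real service.

# Hypothetical latency check against a cloud-hosted AI endpoint (placeholder URL).
import time
import requests

ENDPOINT = "https://example.com/v1/generate"  # placeholder, not a real service

def measure_latency(prompt: str) -> float:
    start = time.perf_counter()
    requests.post(ENDPOINT, json={"prompt": prompt}, timeout=30)
    return time.perf_counter() - start  # seconds between sending the query and getting a response

if __name__ == "__main__":
    print(f"Round-trip latency: {measure_latency('Hello'):.2f} seconds")

On infrastructure not tuned for AI workloads, that number tends to grow, which is the kind of inefficiency AI-specific providers aim to remove.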

So what RunPod offers is a cloud system designed specifically for third-party companies to rent so they can build, test, and run their own custom AI models. The company's press release says that, with just a few clicks, customers can set up a model on its system, and the computing power allocated to it can be quickly scaled up if more is needed to run the AI. The company says it will use its new funding to expand its services, improving the "day-to-day life of developers" who use it.

Reporting on the RunPod news, VentureBeat notes that AI-dedicated cloud companies like this are popping up and attracting substantial funding in parallel with the growth of AI's capabilities and wider awareness of the technology. For example, San Francisco-based Together Computer is seeking to raise more than $100 million to expand its own AI cloud infrastructure. That company uses the high-end Nvidia graphics processing units (GPUs) driving much of the AI revolution, chips that have recently made Nvidia a lot of money; its newly revealed AI GPU is expected to cost $30,000 to $40,000 per unit.

That a small company like RunPod can make headway in the AI world is another example of how entrepreneurs need only find a niche in a new market and solve a real problem within it to make money. It is also a small-business counterpoint to the big AI players, which are planning investments to the tune of hundreds of billions of dollars to develop next-generation AI.
