Five API gateway vendors have refreshed product features over the last three months to reposition their products as AI gateways, as enterprises redirect IT budgets toward AI initiatives.
Market research indicates that enterprise IT spending has shifted significantly toward AI initiatives in less than a year. In late 2023, just 4% of 670 IT professionals surveyed by TechTarget’s Enterprise Strategy Group had deployed generative AI tools into production. By January, a Gartner poll found that one in five organizations had generative AI in production, and two-thirds of respondents said they’re using generative AI in multiple business units, a 19% increase since September. A February Enterprise Strategy Group survey of 374 IT professionals found that 61% were making moderate to significant investments in AI.
This increased spending raises the stakes for AI governance, security and cost management, according to that February survey. In a question about terms survey takers associated with responsible AI, 48% of 327 respondents chose security, the leading answer. Respondents were allowed up to five answers to that question and cited reliability next (40%), then accuracy (37%). More than half (51%) said they plan to invest in responsible AI through “modern technology platforms, solutions and services,” followed by 45% who planned to invest in employee training.
AI governance concerns indicate a new phase of maturity in enterprise AI adoption, said Andrew Humphreys, an analyst at Gartner.
“The initial stage was, ‘Oh, we must do something with AI,’ and that led to a high failure rate because a lot of the projects were of low business value … a little development team trying to do something with AI,” he said. “It’s now become a challenge around, ‘How do I manage the cost of AI adoption across my organization?’ It’s a bigger problem.”
This year’s surge in enterprise AI spending is just the beginning of its growth, according to Humphreys, citing Gartner research. The analyst firm predicts that by 2027, spending on AI software will grow to $297.9 billion with a compound annual growth rate of 19.1%. Over the next five years, market growth will accelerate from 17.8% to reach 20.4% in 2027. Generative AI software spend will rise from 8% of AI software in 2023 to 35% by 2027.
GenAI puts new twist on API management
Cloud generative AI services such as Amazon Bedrock, Google Vertex AI and Microsoft Azure AI are often accessed via APIs, but those APIs differ in some ways from other apps, according to Humphreys.
“If all you’re really trying to do [are] things like rate limiting in terms of the API that you’re talking to, or tracking the number of calls that you’re making, [those] things … are actually API gateway functionality already,” Humphreys said. “But you can optimize [generative AI] better if you’re understanding AI engines. If you’re making a prompt, you can kind of optimize the questions … [to] consolidate calls … and detect sensitive data in the calls that you’re making.”
Thus, products such as IBM’s AI Gateway for API Connect and Solo.io’s Gloo AI Gateway support data masking, encryption and data exfiltration detection to prevent sensitive data from being sent to large language models (LLMs). These and other AI gateways that have shipped since May — including Cloudflare’s AI Gateway; a combined AI Gateway based on a partnership between F5 Nginx and Portkey; and Kong AI Gateway — have also added rate-limiting controls based on the number of AI tokens requested, rather than the number of API requests, as is common with other apps.
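The token-based rate limiting described above can be sketched as a budget of LLM tokens per time window rather than a count of requests. The following is a minimal illustration of that idea, not the implementation used by any of the gateways named in this article; the class and parameter names are hypothetical.

```python
import time

class TokenRateLimiter:
    """Illustrative limiter that budgets LLM tokens per minute,
    not API requests. Hypothetical sketch, not a vendor API."""

    def __init__(self, tokens_per_minute: int):
        self.budget = tokens_per_minute
        self.window_start = time.monotonic()
        self.used = 0

    def allow(self, requested_tokens: int) -> bool:
        now = time.monotonic()
        if now - self.window_start >= 60:
            # New one-minute window: reset the token budget.
            self.window_start = now
            self.used = 0
        if self.used + requested_tokens > self.budget:
            # Deny: this call would exceed the per-minute token budget,
            # even if it is only the second request in the window.
            return False
        self.used += requested_tokens
        return True

limiter = TokenRateLimiter(tokens_per_minute=1000)
print(limiter.allow(800))  # True: 800 of 1,000 tokens consumed
print(limiter.allow(300))  # False: 800 + 300 exceeds the budget
```

The point of the contrast: a request-based limiter would have admitted both calls, but a single long prompt can cost more than dozens of short ones, so AI gateways meter the unit that actually drives LLM spend.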
The AI gateway Solo.io shipped this month will also perform AI API key management, so that developers don’t create their own keys for third-party services and then take them along when they leave, potentially cutting off or compromising the organization’s access, according to Keith Babo, vice president of product management at Solo.io. Gloo AI Gateway will take on retrieval-augmented generation by brokering connections between apps, LLM services and vector databases.
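The retrieval-augmented generation brokering mentioned here follows a common pattern: the gateway sits between the app, a vector database and an LLM, fetches relevant documents for a query, and prepends them to the prompt. The sketch below illustrates that flow only; the word-overlap "search" is a stand-in for a real vector similarity query, and none of these functions are Gloo AI Gateway APIs.

```python
def retrieve(query: str, store: dict, top_k: int = 2) -> list:
    # Toy stand-in for vector search: rank documents by how many
    # words they share with the query. A real gateway would query
    # a vector database with an embedding of the query instead.
    words = set(query.lower().split())
    scored = sorted(
        store.items(),
        key=lambda kv: len(words & set(kv[1].lower().split())),
        reverse=True,
    )
    return [doc for _, doc in scored[:top_k]]

def rag_prompt(query: str, store: dict) -> str:
    # Broker step: augment the user's question with retrieved context
    # before forwarding it to the LLM service.
    context = "\n".join(retrieve(query, store))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = {
    "d1": "AI gateways manage token based rate limits",
    "d2": "Vector databases store embeddings for retrieval",
    "d3": "Kubernetes schedules containers",
}
print(rag_prompt("How do AI gateways manage rate limits?", docs))
```

Placing this brokering in the gateway rather than in each app means the platform team, not individual developers, controls which data stores an LLM can see.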
These features anticipate a growing role for platform engineers as AI apps move toward production, Babo said.
“Even if app teams are going full DevOps style, bringing [AI] all the way to production, eventually it falls on the platform team to own the availability and security and visibility into it,” Babo said.
In addition to these security and cost management controls, all these vendors claim their AI gateways can orchestrate the use of multiple LLM services in some way, whether it’s by caching LLM responses, consolidating similar LLM prompts before they’re sent, routing requests to the optimal LLM service or providing organizations comprehensive visibility into the usage and performance of AI services to fine-tune results.
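Two of the orchestration patterns listed above, caching LLM responses and routing requests to a chosen backend, can be combined in a few lines. The sketch below is hypothetical and vendor-neutral: prompts are normalized so near-identical questions share a cache entry, and a cache hit avoids the LLM call (and its token cost) entirely.

```python
import hashlib

class LLMOrchestrator:
    """Hypothetical sketch of gateway-side caching and routing,
    not any vendor's implementation."""

    def __init__(self, backends):
        # backends: dict mapping a name to a callable(prompt) -> response
        self.backends = backends
        self.cache = {}

    def _key(self, prompt: str) -> str:
        # Normalize case and whitespace so near-identical prompts
        # hash to the same cache key.
        normalized = " ".join(prompt.lower().split())
        return hashlib.sha256(normalized.encode()).hexdigest()

    def ask(self, prompt: str, backend: str = "default") -> str:
        key = self._key(prompt)
        if key in self.cache:
            return self.cache[key]  # cache hit: no LLM call, no token spend
        response = self.backends[backend](prompt)  # route to chosen service
        self.cache[key] = response
        return response

calls = []
def fake_llm(prompt):
    calls.append(prompt)  # record each real backend call
    return f"answer to: {prompt}"

gw = LLMOrchestrator({"default": fake_llm})
gw.ask("What is an AI gateway?")
gw.ask("what is  an AI Gateway?")  # normalizes to the same cache key
print(len(calls))  # 1: the second prompt was served from cache
```

A production gateway would go further, for example using semantic similarity rather than exact normalized matching, and routing on cost or latency rather than a fixed backend name, but the control point is the same.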
Long-term, AI gateways will be “a part of the jigsaw puzzle” of addressing AI governance as growth continues, Humphreys said.
“It’s not like, ‘I bought an AI gateway and now all my problems are solved,'” he said. “You will see [continued] maturity around tooling.”
Beth Pariseau, senior news writer for TechTarget Editorial, is an award-winning veteran of IT journalism covering DevOps. Have a tip? Email her or reach out @PariseauTT.