Generative artificial intelligence (AI), a term that would have been considered abstract 18 months ago, is ever-present today. From driving the world’s fastest adoption of a consumer app to creating a tech stock market boom not seen since the days of the dot-com bubble, generative AI has emerged as the foremost technology trend of our era. But where are we on this journey from hype to value realization, and who are the players likely to be the most meaningful contributors to this space over the next decade?
Before we address those questions, let’s take a step back and discuss the origins of generative AI and some of the prerequisites to consider when developing a generative AI strategy. Generative AI, like all forms of machine learning, requires access to and the ability to process vast amounts of data. Put simply, having your data in the cloud, particularly the public cloud, is a prerequisite for meaningful generative AI workloads. One trend we have seen at Amazon Web Services (AWS) in recent months is that the value creation potential of generative AI is creating an accelerating pull factor for the late majority of organizations that have yet to move their data and workloads to the cloud.
Another key theme of the generative AI explosion is that it will be a durable presence in our lives for decades to come, and while recent developments may seem like an overnight phenomenon, they are part of a longer-term journey. To quote our CEO, Matt Garman, “I don’t think anyone can tell you what things are going to look like in the generative AI space 12 months from now. But what I can say is, together with our partners, we’re making sure customers can take advantage of all the new technologies that will be coming down the line.” As we always say at AWS, it’s easier to invent the future than predict it.
Amazon’s Core Principles
Machine learning and AI have been a core part of Amazon and AWS since our early days. When we consider how we help customers harness the power of the cloud, we do not stray from our core principles: providing customers with lower prices, greater choice, and greater convenience—all in a secure, controlled environment.
In the context of generative AI, to help customers drive costs down, AWS has invested in our own custom silicon, giving data scientists and machine learning practitioners access to the most price-performant infrastructure for running generative AI workloads at scale in the cloud. To give customers choice, we have developed our own models, including Amazon Titan, and worked with leading model providers, including AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and, most recently, Mistral AI. Finally, to offer customers greater convenience, AWS launched Amazon Bedrock, a fully managed service that provides a choice of high-performing foundation models, giving customers access to both open-source and proprietary models in a controlled and safe environment.
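For readers who want a sense of what this looks like in practice, here is a minimal sketch (not an official AWS sample) of calling a foundation model through Amazon Bedrock’s Converse API with boto3. The region, model ID, prompt, and inference settings are illustrative placeholders; you would substitute a model your own account has access to.

```python
# A minimal sketch (not an official AWS sample) of invoking a foundation model
# through Amazon Bedrock's Converse API. The region, model ID, and prompt below
# are illustrative placeholders.
import boto3

# Bedrock exposes models for inference through the "bedrock-runtime" client.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize the key benefits of moving data to the cloud in three bullet points."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The assistant's reply comes back as structured content blocks.
print(response["output"]["message"]["content"][0]["text"])
```

Because the same Converse call works across the models available in Bedrock, trying a different provider is largely a matter of changing the model ID.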
Since the start of 2024, we have introduced more features on Amazon Bedrock than all the other major cloud providers combined. This includes the addition of 11 industry-leading models from Anthropic, Meta, and Mistral AI, including Anthropic’s Claude 3.5 Sonnet model, which is already being used by thousands of customers.
Underpinning all of this, we are working to democratize access to these technologies for everyone. Last year, we launched Amazon Q, a new type of generative AI-powered assistant that can be tailored to your business by harnessing the data and expertise in your company’s enterprise systems. Amazon Q assists developers across their tasks—from coding, testing, documenting, and upgrading applications to troubleshooting, security scanning, and fixes—allowing them to focus on building software.
Customers in Asia are using Amazon Q to improve operational efficiency and achieve positive business outcomes. Online trading platform Deriv implemented Amazon Q for internal document exploration, significantly reducing new employee onboarding time and improving recruiting outcomes. Meanwhile, Bolttech, a Singapore-based insurtech, has revolutionized its development workflows with Amazon Q Developer, cutting the time spent updating code documentation by 75%.
AWS has also announced a range of tools to put the power of generative AI in the hands of every employee. For example, Amazon Q Apps can help an HR professional with no coding experience create personalized onboarding plans for new employees in seconds, while AWS App Studio, a generative AI-powered, low-code application-building service, allows technical employees, such as IT project managers and data engineers, to create secure applications in minutes rather than days or weeks.
Our Commitment To A People-Centric Approach
At AWS, we are focused on ensuring that the value proposition of generative AI is truly democratized and localized across the globe. A key dependency is language—specifically, the need to invest in multilingual large language models (LLMs) that cater to the more than 95% of the world’s population who do not speak English as their primary language. In Southeast Asia, where less than 1% of the population are native English speakers, AI Singapore (AISG) is making its LLMs more culturally accurate, localized, and tailored to the region. Building on AWS’s scalable compute infrastructure, AISG developed SEA-LION, a family of LLMs specifically pre-trained and instruction-tuned for the region’s more commonly used languages, including Bahasa Indonesia, Bahasa Melayu, Thai, and Vietnamese; it will eventually be extended to other Southeast Asian languages such as Burmese and Lao.
This move toward more culturally aware AI is also reflected by Malaysia’s CelcomDigi, which is working with us to co-create generative AI solutions and applications that enhance the user experience. Recognizing that today’s LLMs are trained predominantly on English, CelcomDigi will leverage LLMs available through Amazon Bedrock to build natural language applications, such as a Bahasa Melayu chatbot, to communicate with its linguistically and culturally diverse customer base.
Many CEOs I speak to wonder how to translate the hype surrounding generative AI into tangible business value. Over the past year, we have seen an inflection in the volume of experimentation as organizations grapple with how to apply this technology within their own domains and environments. At AWS, we are now seeing this experimentation pivot materially to production use cases across a diverse range of areas, from developer productivity to customer experience to enterprise search. The value being created spans industries, from media to healthcare. For example, FOX Corporation delivers contextually relevant, AI-driven products to consumers, advertisers, and broadcasters in near real time. Meanwhile, the National University Health System (NUHS) in Singapore is experimenting with Amazon Bedrock to develop a solution that automates patient discharge summaries, enabling NUHS clinicians to focus more on their consultations with patients.
Unlocking The Value Of Generative AI
Generative AI is one of the most transformational technologies of our generation, tackling some of humanity’s most challenging problems, augmenting human performance, and maximizing productivity. Because responsible AI and security are our top priorities, we will continue to launch new tools and partnerships that improve the safety, security, and transparency of our AI services and models, and that make it easier for Amazon businesses and AWS customers to build generative AI responsibly.
As part of this effort, AWS Singapore has launched the AWS Responsible Generative AI Community to help customers and partners harness the power of AI securely and responsibly by offering practical solutions through workshops and collaborations with Singapore’s Infocomm Media Development Authority and the AI Verify Foundation.
Furthermore, we enable our customers to adopt generative AI quickly and securely. For instance, Temus, a digital transformation services firm in Singapore, was able to adopt Amazon Q safely and quickly thanks to the tool’s built-in privacy and security features. By leveraging Amazon Q’s code generation, code scanning, and bug-fixing capabilities, Temus has boosted its development efficiency by an average of 35%. With Guardrails for Amazon Bedrock, customers can also block harmful content and manage sensitive information within an application.
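As an illustration only, a guardrail created in a customer’s own account can be attached to a Bedrock Converse call roughly as in the sketch below; the guardrail identifier, version, model ID, and prompt are hypothetical placeholders rather than a recommended configuration.

```python
# A rough sketch (assumptions noted) of applying a pre-created guardrail to a
# Bedrock Converse call so that harmful content and sensitive information are
# filtered. The guardrail ID/version, model ID, and prompt are placeholders.
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
    messages=[
        {
            "role": "user",
            "content": [{"text": "Draft a polite reply to this customer complaint about a delayed refund."}],
        }
    ],
    guardrailConfig={
        "guardrailIdentifier": "example-guardrail-id",  # placeholder: your guardrail's ID
        "guardrailVersion": "1",                        # placeholder: the version you published
        "trace": "enabled",                             # include details if the guardrail intervenes
    },
)

print(response["output"]["message"]["content"][0]["text"])
```

If the guardrail intervenes, the response carries the guardrail’s configured blocked message in place of the model output and, with tracing enabled, details of which policy was triggered.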
We are still in the early days of what can be accomplished with generative AI, and there’s so much more that customers will achieve. This immense potential is driving our commitment to making it easy, practical, secure, and cost-effective for customers of all sizes to adopt generative AI.