After years of discussion and seemingly little progress, the sudden boom in AI caught most of the population by surprise.
Swami Sivasubramanian, vice president of database, analytics, and machine learning (ML) at Amazon Web Services (AWS), believes this explosion of interest is due to the technology finally reaching its “tipping point.” AI and ML models have now reached a level of sophistication, functionality, and scale at which their use can generate real value.
“Generative AI has captured our imagination,” Sivasubramanian says. “I believe it will transform every application, industry and business.”
Key to this advancement is a breakthrough in the variety of data that models can consume and the range of outputs they can produce. Previously, ML algorithms required highly specific, ‘labelled’ data to produce a defined output.
Now, foundation models (FMs) are trained on a far wider range of information and can produce a variety of complex outputs, such as video or imagery. Crucially, FMs can be tuned to fit the requirements of an application or use case, reducing time to market for new products and functionality.
“We can leverage massive amounts of complex data, to capture and present knowledge in more advanced ways, mapping complicated inputs to complicated outputs,” says Sivasubramanian. “What used to take months of scientists developing a machine learning model for one task can now be done relatively easily with one model and fine tuning.”
Every business sector can reap the benefits of these advancements. For example, an ability to generate video, audio and written content at speed can help enhance worker efficiency across many functions of an organisation. And customer experience could be improved through higher-quality chatbot interactions.
Generative AI could prove truly transformative in accelerating time to market in key research and development sectors, including health and life sciences and transportation.
“The potential is huge,” says Phil Le Brun, enterprise strategist and evangelist at AWS. “Imagine pharmaceutical companies accelerating the design of gene therapies, borrowers having rich conversational experiences with mortgage providers that quickly approve their loans, or everyone everywhere gaining opportunities through broadening access to ongoing knowledge and educational pathways.”
AWS is helping its clients access these new capabilities through Amazon Bedrock, a fully managed, cloud-based service that offers a choice of high-performing FMs for building generative AI applications through a single API.
Amazon Bedrock supports broadly useful applications, such as fielding customer complaints through ‘AI agents’ that leverage existing company data without manual input from staff. Through Amazon QuickSight, data can be collated from across an organisation and turned into business intelligence reports from natural-language prompts, rapidly accelerating decision-making. For medical clients, AWS HealthScribe can automate paperwork from speech input, allowing doctors and nurses to spend more time treating their patients.
Accessing these services through a single API offers huge potential for companies to work more efficiently and to build innovative applications of their own at speed.
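As a rough illustration of what that single API can look like in practice, the sketch below calls a foundation model on Amazon Bedrock from Python using the AWS SDK (boto3). It is a minimal example, not a production pattern: it assumes boto3 is installed, AWS credentials are configured, and the account has been granted access to the example model ID shown, which may vary by account and region; the prompt and inference settings are placeholders.

```python
# Minimal sketch: invoking a foundation model on Amazon Bedrock via boto3.
# Assumes AWS credentials are configured and the account has access to the
# example model ID below (model availability varies by account and region).
import boto3

# The Bedrock runtime client exposes model invocation through one interface,
# regardless of which foundation model is chosen.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model ID
    messages=[
        {
            "role": "user",
            "content": [
                {"text": "Summarise this customer complaint in two sentences: ..."}
            ],
        }
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The generated text comes back in a model-agnostic response structure.
print(response["output"]["message"]["content"][0]["text"])
```

Because the request and response shapes stay the same across models, swapping in a different FM is, in principle, a change to the model identifier rather than a rewrite of the application.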
To make the most of these innovations, organisations must take some important steps. Sivasubramanian says companies must first ensure they have access to best-in-class FMs to power their applications and build their own capabilities. To do so, they will need a secure, private, cloud-based environment where they can customise models using their own data.
This should then allow companies to take the heavy lifting away from staff, creating easy-to-use generative-AI-based tools that increase productivity. Finally, a machine-learning-optimised infrastructure that offers low cost and low latency is vital. Le Brun stresses the role of the cloud in delivering these pillars of success.
“Do you remember those TV programmes you used to watch as children, the ones that warned: ‘Don’t try this at home’? I’d give a variant of this warning with generative AI: ‘Don’t try this without the cloud,’” he says.
“The cloud is the enabler for generative AI, making available cost-effective data lakes, sustainably provisioned GPUs and compute, high-speed networking, and consumption-based costing.”
Le Brun also says that companies should address their data management before undertaking a generative AI strategy, as siloed or fragmented information will stymie adoption. Leaders will also have to address the company’s values to ensure the technology is used in a way that benefits stakeholders at every level of the organisation.
“The world of generative AI is incredibly exciting, but technology rarely operates in a vacuum,” Le Brun says. “Face the law of unintended consequences. Start by considering your stance on ethics, transparency, data attribution, security, and privacy with AI. How can you ensure the technology is used accurately, fairly, and appropriately?”
Sivasubramanian says the widespread adoption of FMs should democratise the use of generative AI throughout an organisation. If these steps are taken, leaders should be able to realise that goal while also rapidly scaling innovations that drive value for their customers and shareholders.