With trust as your north star, AI can deliver impact
Trust is a challenge in financial services and new technology can raise concerns – but the right approach to AI could transform customer relationships

In a world where on-demand digital services have become part of everyday life, customers increasingly expect the same level of immediacy from the companies managing their finances. But, for many people, financial services organisations are falling short. According to Forrester research, banking customer experience declined in 2024. A 2023 study by SAP, meanwhile, found that only 22% of UK consumers were satisfied with their bank’s support, and that more than half of UK small- and medium-sized businesses were reassessing whether their bank was still the right fit for them.
In this climate, the potential for AI to change how the financial services industry engages with customers is huge—but organisations must exercise caution. When it comes to customer money, trust is non-negotiable. Therefore, to harness the power of AI, banks and other financial institutions must bake trust into their AI strategies from the outset.
“Financial institutions need to think about infusing ethics, inclusivity and responsible AI into every part of the pipeline, including the way they ingest data, what data they ingest and how they train the model,” says Yiannis Antoniou, head of AI, data and analytics at tech consultancy Lab49.
By taking this approach, firms can demonstrate to customers, regulators and internal stakeholders that all AI-based decisions are free from bias and discrimination, Antoniou says.
“This is fundamental to creating trust in how banks use AI systems,” he adds.
Financial services organisations must also put appropriate oversight measures in place, with dedicated teams monitoring AI systems to ensure they operate fairly and ethically.
“Without human oversight, you’re asking for trouble, especially in something as sensitive as financial services,” says Camden Woollven, an AI specialist at risk management and compliance company GRC International Group. “These risks can be managed with the right governance frameworks, but it takes a lot of effort and monitoring.”
When looking to integrate AI tech, it is important for financial firms to create a safe environment for testing and experimentation so they can be confident AI tools will work when they are eventually rolled out to employees or customers.
“We have platforms where people can try out use cases in a safe, hermetically sealed environment and see how far they can take AI tools,” says Rajesh Iyer, global head of AI and machine learning for financial services at Capgemini, a consultancy. “There’s a lot of robust testing that needs to happen before banks can roll AI products out.”
Having a safe environment in which to test AI could lead to innovative new products or services beyond the current understanding of what is possible. Rob Wild, managing director of L.E.K. Consulting’s digital and AI practice, likens this to the early days of the internet.
“We’re still at the ‘excitement about email’ stage of AI,” says Wild. “Think about what the internet is today and everything that’s come out of it, whether it’s services like Netflix or Deliveroo or having a video chat with your family member on the other side of the world. None of those things featured in discussions during the early days of the internet.”
AI also has the potential to help banks reach customer segments that were previously hard to serve at scale. For instance, AI can help with lending to small businesses, where borrowers tend to be very diverse and therefore challenging to cater for in a uniform way.
“AI means everyone can build better propositions for less standardised segments. They then have the potential to become big segments,” says Wild.
One of the biggest opportunities with AI is elevating the customer experience by providing personalised banking at scale, offering a level of service that was previously only available to high-net-worth customers.
“AI systems will be able to analyse thousands of data points about financial behaviour to offer customers personalised advice or relevant product recommendations,” says Woollven.
For example, AI could see that a customer is spending a significant amount of money on Uber rides and therefore recommend a car loan that would fit in their budget and would save them money in the long run, she says.
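As a rough illustration of the kind of logic Woollven describes, the short Python sketch below compares a customer’s monthly ride-hailing spend with an estimated car-loan repayment. Everything in it – the loan terms, the threshold and the recommendation wording – is an assumption made for this example rather than any bank’s actual decision model.

    def estimate_monthly_payment(principal: float, annual_rate: float, months: int) -> float:
        # Standard amortised-loan repayment formula
        r = annual_rate / 12
        return principal * r / (1 - (1 + r) ** -months)

    def maybe_recommend_car_loan(monthly_rideshare_spend: float) -> str:
        # Hypothetical loan terms chosen purely for illustration
        payment = estimate_monthly_payment(principal=15_000, annual_rate=0.07, months=48)
        if monthly_rideshare_spend > payment:
            return (f"Ride-hailing spend of £{monthly_rideshare_spend:.0f}/month exceeds an "
                    f"estimated loan repayment of £{payment:.0f}/month - worth a conversation")
        return "No recommendation"

    print(maybe_recommend_car_loan(monthly_rideshare_spend=450))

In practice, of course, a bank would weigh many more data points and run any such output through affordability and suitability checks before putting a recommendation in front of a customer.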
“We’re moving towards a world where sophisticated financial advice isn’t just going to be for the wealthy,” Woollven adds.
The rise of agentic AI—autonomous AI systems that operate without human prompting—also has the potential to change how financial services are delivered. Take a standard mortgage application. Agentic systems could understand the context of what is being asked and then break down that process into individual steps, with each step automatically assigned to an AI agent to complete, says Iyer. Agentic systems could even pause the process if a human needs to perform a task (to conduct a home inspection, say), before picking up the process again from where it left off.
“Sometimes it’s not all going to run in a two-minute window, it might be that the first three steps are going to take 15 minutes, and then you have to wait for seven days for the next step,” says Iyer. “This ability to pause automation is transformational because not all processes can happen synchronously.”
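As a rough sketch of the pause-and-resume pattern Iyer describes, the Python example below models a mortgage application broken into steps, with automated steps handled in sequence and the workflow halting at a human-only task (a home inspection, say) before picking up where it left off. The step names, the in-memory state and the workflow class are hypothetical, shown only to illustrate the shape of the idea rather than any vendor’s agentic product.

    from dataclasses import dataclass, field
    from typing import Callable, Optional

    @dataclass
    class Step:
        name: str
        run: Optional[Callable[[dict], dict]] = None  # None means a human must complete this step
        done: bool = False

    @dataclass
    class MortgageWorkflow:
        steps: list
        state: dict = field(default_factory=dict)

        def advance(self) -> None:
            # Run automated steps in order; stop as soon as a human task is reached
            for step in self.steps:
                if step.done:
                    continue
                if step.run is None:
                    print(f"Paused: waiting on human task '{step.name}'")
                    return
                self.state = step.run(self.state)
                step.done = True
                print(f"Agent completed '{step.name}'")
            print("Application complete")

        def complete_human_step(self, name: str, result: dict) -> None:
            # Record the outcome of a human task, then resume the automated flow
            for step in self.steps:
                if step.name == name and step.run is None:
                    step.done = True
                    self.state.update(result)
            self.advance()

    workflow = MortgageWorkflow(steps=[
        Step("verify income", run=lambda s: {**s, "income_verified": True}),
        Step("check credit", run=lambda s: {**s, "credit_ok": True}),
        Step("home inspection"),  # human-only step: pauses the flow, perhaps for days
        Step("issue offer", run=lambda s: {**s, "offer_issued": True}),
    ])

    workflow.advance()  # runs the first two steps, then pauses
    workflow.complete_human_step("home inspection", {"inspection_passed": True})

A production system would persist that state somewhere durable so the process can survive a seven-day wait between steps; this sketch keeps everything in memory purely for brevity.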
For now, much of the agentic AI experimentation is focused on internal processes and workflow enhancements rather than external-facing services, given that the technology is still in its infancy.
“It will eventually reach customers, especially as the technology becomes more reliable and cheaper to use, and when clear use cases emerge to serve customers better,” says Antoniou.
While there is a need to take a cautious approach to AI adoption, organisations that drag their feet risk being left at a competitive disadvantage to those that are further along on their AI journeys.
“We’re going to start seeing a divide between financial institutions who have embraced AI and those who haven’t,” says Woollven. “What’s probably going to happen is those that lag will find themselves in a position similar to retail stores that missed the ecommerce revolution.”
Ultimately, AI will blend into the background, with customers trusting it to be part of the experience by default, transforming how financial services organisations operate and improving customer engagement.
As a leading technology firm, SAP aligns with these perspectives and recognises the transformative potential of AI in financial services. The following articles build on these external insights, reinforcing that trust, governance and a strategic approach are essential for AI to deliver real value.
Avoid ending up in an AI cul-de-sac
Financial institutions must navigate AI carefully, focusing on incremental change, solid data foundations and a clear strategy to avoid taking a wrong turn on their journey

AI is alluring for financial-services organisations eager to improve efficiencies and boost profitability. But, thanks to the technology’s rapid development, firms seeking to deploy AI tools can easily become disoriented and hit a dead end. In such cases, they may have to start their implementation strategies from scratch – a time-consuming and costly exercise. To avoid making any wrong turns, firms need a clear strategy that favours incremental change rather than a complete overhaul of their systems.
There are several challenges financial institutions need to address to ensure their AI projects remain on the right path. First, they must learn from the past, when individual business functions would invest in point solutions that lacked integration, creating a fragmented tech ecosystem. So says Stuart Grant, head of capital markets at SAP.
“The difference with AI is that it’s moving so quickly. If you don’t have oversight and an understanding of how it’s being rolled out across the organisation, you risk stagnating development by following a route where the technology becomes outdated,” he says.
Thanks to legacy issues around fragmentation, data is often stranded in silos. As AI systems are only as good as the data they are being fed, financial institutions must clean and centralise data so that it is usable. This will ultimately dictate the success of an AI strategy.
“If financial institutions don’t sort their data out first, before they implement their AI strategy, they will end up having to go back to the beginning to rectify the data problems,” Grant says. “This is a classic case of ‘more haste, less speed’, but you also must understand that this is a constantly moving target.”
Firms must also address perceptions around AI. Staff, for instance, may fear losing their job to AI, while customers may worry that AI exposes them to risks such as bias. That’s according to Anuj Kumar, industry strategic advisory, UKI banking at SAP. Internal demographics may lead to divergent views on AI. For example, older workers may be more reluctant to embrace AI tools than their younger colleagues.
“The challenge is to get everyone on the same boat,” says Kumar.
Firms must consider these challenges when developing their approach to AI. For financial firms, a robust AI strategy will also focus on the two essential technology systems: the core banking system, which manages customer assets, and the ERP system, which manages the organisation’s assets.
Financial institutions should ask themselves how they could bring AI agents – autonomous AI task-performers – into their core banking or ERP setups to simplify their processes, Kumar explains. This means identifying priority areas and then focusing first on what delivers the most value, he adds.
“Firms must also decide how to position their AI strategy. Is it one big transformation programme, or is it an enabler of multiple change initiatives across the business?” Kumar says.
Finding the right positioning also means leaders must think about the outcomes they want to achieve from their AI strategy, Kumar adds. For example, can AI help banks innovate and drive differentiation, or even reshape banking away from what has traditionally been seen as a utility product? Whatever the objective, financial institutions must keep their employees informed of the strategic direction and maintain a feedback loop for ongoing dialogue.
“There has to be a way for the organisation to leverage its resources to get as much information on this fast-moving topic as possible,” says Grant. “Success ultimately hinges on a good understanding of, and confidence in, your data, ensuring that the organisation is educated, not just on the AI topic, but on how and why AI is being leveraged.”
Once the strategy is in place, there are practical steps financial firms can take to get started on their AI journey. One quick win is to use what is already available through existing tech systems. SAP, for example, has been providing embedded AI capabilities in its products for several years, though not all users are aware of what is available and what the technology can do, says Grant.
“Just start with understanding what is available today without any investment because it’s already there and embedded in the software and services that firms are already using,” he says.
Firms should also focus on areas where they already have access to the right data to get going, Kumar adds.
Key to all of this is ensuring the strategy is ultimately about people and driving change across the business, with AI as the enabler, rather than treating it as a technology project.
“If it is seen as a technology programme, because of all the noise and the perception of AI, it has a higher risk of failure,” says Kumar. “The biggest risk of taking a wrong turn is that you treat your AI strategy as a tech programme and then put change management around it – it should always be the other way around.”
The risk of stumbling down the wrong path is also why AI projects should be approached as an evolution rather than a revolution. Although the technology has the potential to fundamentally transform the way financial-services organisations operate, the quickest route to success is through well-managed incremental change.
“This is about seeing what is already available to you and how you can start using that set of solutions and get going right now,” says Kumar. “It is about experience, experiment and then execution. We’ve always talked about failing fast. This is about learning fast, setting the right tone and executing the strategy efficiently.”
While the initial AI boom saw many projects launched as proofs-of-concept, a significant number have since stalled or been shelved. To move forward, banks must take a more strategic approach – building on and optimising existing AI investments rather than chasing only the latest innovations.
This measured approach keeps firms on course, enabling them to execute their AI strategies without costly detours or landing in unwelcome cul-de-sacs.
Plotting AI’s evolution in financial services
Whether they’re talking about it, experimenting with it or starting to bake it into business processes, most financial organisations are on an AI journey
The human-AI crossroads: people shouldn’t be an afterthought
Leaders must approach change management carefully, ensuring clear communication to support employees and enhance AI’s role in the workforce

The potential for advanced AI to take on ever more complex tasks is going to reshape what the workforce of the future looks like for financial institutions. Given the integral role financial services play across industries, getting the approach right is crucial. With a low appetite for failure – particularly when it impacts employees or end customers – a holistic approach to AI adoption is essential.
And as employees become increasingly worried about what this means for their futures, organisations need to invest heavily in change management and ensure buy-in across the business. This means providing clear leadership on how the business is approaching AI and how the technology is augmenting the human workforce rather than replacing it.
For Stuart Grant, head of capital markets at SAP, the responsibility for communicating this cuts across the C-suite. Part of the messaging falls to the chief technology officer, who is responsible for making sure there is a clear view of the approach, Grant says. The chief risk officer is also critical for accountability around decision-making, and the CEO is responsible for ultimately driving the AI culture, he says.
Some organisations may also seek to install a dedicated chief AI officer to oversee the policy definition and direction, and set the boundaries within which AI systems will be allowed to operate, says Anuj Kumar, industry strategic advisory, UKI banking at SAP.
“Leaders have to ensure that strategically they have done enough thinking around AI policies, rules and regulations,” Kumar says.
When determining which tasks to delegate to AI and which to keep with humans, many financial services organisations will prioritise offloading lower-value, repetitive work. This allows employees to focus on higher-value tasks that drive greater impact.
“It is not a complete elimination of tasks and/or individuals, it’s about executing tasks differently,” Kumar says. “It will be a complementary relationship with AI and humans working together. It’s important to set the right boundaries around where roles begin and end.”
One area where AI is likely to ease the strain on human workloads is customer service. For example, it could enable chatbots to become smarter and more efficient at handling customer queries. It could also improve risk management capabilities, such as identifying fraud more effectively by scrutinising far more transactions than human teams alone could review.
“There aren’t enough people with the knowledge to do that themselves, so this is not taking anything away, it’s just expanding capabilities,” says Grant. “AI will become prevalent to the point it’s just something everyone uses like Excel in their day-to-day jobs. It’s not so much a case of replacing the human as augmenting what the human does.”
Much of this is going to be made possible with the advance of agentic AI systems that can autonomously complete tasks without explicit programming. In addition to customer service and fraud detection – areas where you need scalability – AI agents could also be deployed to find new investment and trading opportunities.
“For a number of decades now we’ve had hedge funds that leverage mathematical techniques to find very niche opportunities in the trading environment; agentic AI enables you to expand on that,” says Grant. “What’s going to drive adoption is profitability – so where can you save costs, but also where can you improve the service that you’re providing as a financial institution.”
Firms also need to provide change management as workers transition away from manual processes and human interaction to a world where they engage with AI outputs.
“It’s not just about educating individuals on how to take a new input and act on it, it’s a behavioural change because now you’re taking information from something that you can’t question or debate with,” says Kumar.
That change management process is slightly easier to navigate now than it would have been a decade ago, given that many new workers are digital natives – a trend that will only strengthen as workforce demographics continue to shift.
“We’re now at a point where a significant percentage of the workforce has grown up with a smartphone in their hands, and so people expect to have something like AI to help them,” says Grant.
That demographic shift will also likely speed up the rate of change, impacting the way organisations operate, Grant says. This means financial services firms need to create a much more dynamic operating model that allows them to move quickly as AI advances.
“It could be that in five years, you’ve got AI capabilities identifying more niche, granular business model ideas and you need to be able to roll those out faster and adapt,” says Grant.
While younger generations might be more willing to embrace AI, one way to ensure wider buy-in is to give everyone an opportunity to play around with AI tools to see what is possible.
“I don’t think people fully realise what AI is capable of until they get their hands dirty and are utilising it and leveraging it to drive efficiencies around the organisation,” says Grant.
To help with this, Kumar suggests financial services firms could set up ‘experience AI’ rooms where employees could drop in during their free time to test AI tools in the context of their own roles.
“People need to first identify inefficiencies in their processes and recognise the need for change, rather than being told that AI is the answer,” says Kumar.
By nurturing a culture where change is embraced and providing simple, clear communication on their AI strategies, financial institutions can ensure employees are invested in the journey, giving those strategies a greater chance of success.