For decades, mainframe computers have formed the backbone of vital business infrastructure. They remain the source of truth for thousands of companies, powering processes that have underpinned these firms’ commercial success.
Several years on from the start of the cloud revolution, some of the world’s largest firms still depend on mainframes to process core back-end workloads ranging from card transactions to insurance claims. But the reliability of this ageing application portfolio is diminishing steadily in many cases, while maintaining these legacy systems is becoming increasingly difficult.
Recent research by LzLabs sheds some light on why enterprises are seeking to move away from legacy mainframe environments and towards nimbler cloud-based alternatives.
Dragging systems into the modern day
Mainframe modernisation projects are being driven by several factors. These include the desire among firms to cope with a decline in legacy tech skills; reduce their IT costs; access new revenue streams; power innovation; improve their stability and security; and increase their agility in a competitive market.
In many cases, a mainframe will act as a firm’s so-called system of record, providing the vital back-end applications that underpin how the organisation works. While its employees will interact with mobile apps and web interfaces, the core data processing behind these systems still happens on platforms that were developed decades ago using code such as assembly language.
“You get to the point that they were written in ways that worked 40 years ago using languages that were around 40 years ago,” says Thilo Rockmann, CEO of LzLabs.
That’s fine when it works. But the disconnect between the modern tech that employees interact with and the legacy IT actually powering their enterprise becomes all too clear when things go wrong. A mishmash of coding practices over several years and a lack of documentation have left many firms with incredibly complex environments that can be a huge challenge to untangle.
Uncovering old code
As proof of how tricky it can be for organisations to fix such problems when they arise, Rockmann cites an example from a migration project that LzLabs worked on recently.
“We found a piece of code that had previously been compiled when one of our experts was aged three, which was 49 years ago,” he says. “It hadn’t been recompiled since. That code had just been sitting there and running.”
That’s all very well if you can find someone with the institutional knowledge of how the code works when a problem with it arises. But such skills are hard to come by in an increasingly competitive market. Modern programming languages such as JavaScript, Python and SQL dominate among developers, while those that many mainframes still rely on – Cobol, Assembler and PL/I, for instance – have all but disappeared from the talent pool.
Rockmann warns that the dearth of expertise in legacy coding is only part of the problem. “You have to understand the mainframe and you have to understand the specific subsystems, but can you then understand how the company put it all together? The institutional knowledge is getting lost too,” he says.
The case for change
Mainframe maintenance is becoming ever more onerous and expensive, but modernisation is no simple task either. Teams must unravel complex interdependencies among programs and databases that have evolved over decades. Nonetheless, it’s important to bite the bullet. Inaction will only exacerbate these problems as more institutional knowledge is lost to retirement.
If this reason alone weren’t enough to stimulate action, several tech trends are serving as push factors too. The rise of cloud computing has been a big driver of modernisation, for instance, offering an affordable alternative platform for mainframe workloads. Spending on cloud services has been growing rapidly as a share of total IT expenditure, from 21% in 2018 to a projected 35% this year.
The great migration is visible not only in budget allocations, but also in where enterprise data actually resides. Gartner has predicted that public cloud infrastructure will host 79% of enterprise data by 2025, compared with 56% in 2020.
The reason? It’s simply the modern way of working, says Rockmann, who adds: “The cloud provides the ability to operate in the way that people have got working with distributed systems now, and that most companies are moving towards for anything they’re doing.”
It’s only natural, then, that data would be moved to places where users expect it to be. But there are plenty of other benefits to migrating from mainframe-based systems to the cloud.
Unlocking new opportunities
Other factors pulling enterprises towards modernisation include the ever-increasing demand for more processing heft; the advance of powerful tech such as AI; and competition from agile digital players. Most would accept that mainframes are ill-equipped to support the current pace of innovation, let alone future developments.
As Rockmann puts it: “You need business applications that support, not hinder advances. And legacy mainframe systems are not written in a way that modern systems would be.”
The business case for mainframe modernisation may seem clear, then, yet a significant number of organisations still haven’t accepted it and joined the cloud revolution.
Several factors may be at play here. LzLabs’ research suggests that the most common inhibitor is a failure on the part of IT chiefs to convince their firms to fund such projects.
Rockmann suggests that the cost-benefit analyses typically used in making the case for modernisation budgets often conveniently overlook key considerations. Such bias is problematic whichever way it cuts.
“There are lots of ‘apples versus oranges’ comparisons that people use on both sides of the debate to either make it look super-opportune to go to the cloud or make the move look terrible,” he says.
Concerns about achieving a return on investment and a lack of support for modernisation in the C-suite also rank highly in LzLabs’ study as barriers to change. Such factors are prompting many leadership teams to maintain an ‘if it ain’t broke, don’t fix it’ approach.
Yet, with legacy skills disappearing steadily from the workplace, this is becoming an increasingly risky gambit, warns Rockmann, who stresses that programmers who were just entering the workforce when mainframe-based systems were first being installed are likely to be at least considering retirement. Unless they’ve shared their expertise with others in the organisation, their valuable knowledge will soon be leaving with them.
Convincing the C-suite
Another argument against making the leap that’s often accepted at C-level is that swapping legacy code for cloud-native applications is a risky business if not executed carefully.
To overcome that barrier, Rockmann recommends not muddling through an attempted modernisation alone. By enlisting an external provider with appropriate expertise, a firm can re-platform its mainframe applications to capitalise on cloud agility and scalability while protecting its system of record.
An effective partner will offer ways to do so gradually via an incremental migration, rather than a comparatively risky big-bang approach. With this kind of specialist support, firms can drag their foundational mainframe infrastructure into the modern era and realise a host of benefits in the process.
The case for mainframe modernisation can be a tricky one to make, especially amid the ongoing cost-of-doing-business crisis. Overcoming complacent C-suite inertia is sure to be a challenge when things seem to be working well enough. But, once they realise that the tech supporting essential processes represents a ticking time bomb, business leaders may be more inclined to invest in the future of their enterprise.
For more information, visit lzlabs.com