
As talent shortages and growing cybersecurity risks pile pressure on IT teams, innovative solutions that improve resilience and make businesses more sustainable are making a significant impact, transforming the way pain points are addressed.
What are the top IT concerns you’re hearing in your conversations with CTOs?
The first one is talent and getting access to the right people who have the right skills to understand current technology, but also how fast the technology is evolving. The second one that is on everybody’s mind right now is artificial intelligence and machine learning: what impact they are going to have on their business and their workforce, as well as how to use AI as a competitive advantage.
The third key issue is cybersecurity and security compliance, especially in Europe and the UK with the increased regulatory focus around operational resilience. And another topic that frequently comes up at the moment in conversations with CTOs is sustainability, but for different reasons. For some people, sustainability is about cost and trying to reduce energy consumption because of higher energy prices. For others, it is about reputation—customers increasingly expect companies to be more sustainable. And lastly, it is also about regulation and the need to meet CO2 emissions reduction targets.
What are the biggest skills gaps that businesses face?
Everybody’s transitioning to a more digital world and so there is an explosion in the need for people with specific skills. Take cybersecurity – until now, security has often been an afterthought, but it’s becoming more and more prominent. We have seen attackers compromising open source packages, and vulnerabilities such as the Log4j flaw, which affected the entire industry.
We are also seeing problems at the hardware level. All of those require specific skills around security. And AI is exactly the same: the skills you need are not something traditional computer science graduates would necessarily have – a lot of it relates more to mathematics.
What can companies do to improve operational resilience?
The way we see operational resilience is that there are five foundations. The first is defining infrastructure as code and automating everything. The second is understanding your software supply chain. Third is making sure that security and compliance are built into your development processes. Fourth is evolving your working practices so they are always fit for purpose. And fifth is having a culture of collaboration and openness.
One way we are supporting the industry on operational resilience is through the Linux Foundation’s FINOS (Fintech Open Source Foundation) organisation. FINOS has just started a new group around operational resilience called the Common Cloud Controls project, which is aimed at driving security standards and governance for public cloud deployments in the financial services sector.
What can businesses do to succeed in areas such as the Internet of Things (IoT) and AI?
A lot of the de facto standards in IoT architecture have been driven by innovations and projects that were delivered inside the open source community. So again, it’s about tapping into this innovation globally. When I talk to a lot of CTOs or executives, sometimes they are trying to replicate products that are already available in open source. So, do you really want to apply your best talent to solve things that have already been solved?
Companies should be focusing on their core competency and what gives them a competitive advantage. If you think about AI – a few years ago, if you wanted to do AI, it was limited to big departments of universities and research labs. But today, thanks to open source, it is accessible to anyone. Within three or four weeks of ChatGPT launching, there were around 20 open source large language models that allowed anybody to start experimenting and using them commercially – not at the scale of ChatGPT, but good enough for the needs of many companies.
What is Red Hat doing to help businesses become more sustainable?
We recently released a piece of open source software called Kepler (see boxout below for more information) that allows companies to measure the electricity consumption of each application they are using in their IT environment. Previously, you could understand the energy consumption of your data centre or rack or machine, but you didn’t have the level of granularity to be able to understand the implication of individual applications.
Kepler gives our customers the ability to measure something that they couldn’t until now. Many were doing it before by way of approximation, and many are finding that what they thought was accurate is not. This enables companies to optimise their energy consumption – for instance, only running a particular application at a time of day when green energy is available, or understanding how changes to applications would affect energy consumption. With new carbon emissions regulation coming, this is critical.
Why is Red Hat focused on open source software?
Open source is core to Red Hat – it is our core belief and mission. All of our employees believe in open source as a way of driving innovation, as a way of driving collaboration, and as a way of creating software. What we do is try to bring simplicity and stability to open source for our customers, because open source evolves and changes so quickly. We take open source and make it enterprise-ready so that our customers don’t have to deal with that pace of change themselves. And then we reinvest and contribute back into the open source community and help other people innovate as well.
How does open source help drive innovation?
Open source enables you to tap into the diverse and collective talent worldwide. For me, diversity is critical – everything from gender diversity to where people are from – and open source lowers the entry point for people to innovate. Talent is not exclusive to the small number of computer scientists who had the luxury of going to university and getting a PhD. Today, there is so much talent out there, and open source allows you to access it. Those communities are driving innovation and breaking frontiers at a much faster pace than you could in a normal company or a small lab. So that’s how open source can help drive innovation – we can tap into the talent in diverse global communities to create better software.
To find out more about how your organisation can use technology to accelerate its innovation and digital transformation journey, visit RedHat.com
How Red Hat’s Kepler project is working to advance environmental efforts in IT.
For many, the word sustainability evokes images of reusable water bottles, paper straws and household compost bins. For others, it conjures up images of ‘reduce, reuse, recycle’ posters and canvas tote bags at a local farmers’ market.
What won’t immediately spring to mind for the majority is data centres. But as sustainability becomes a cornerstone of government policies, enterprise initiatives and consumer trends, tech leaders have been hard at work building technologies dedicated to helping users monitor how their software usage might drive energy consumption.
In recent years, the rapid growth in workloads handled by data centres has resulted in greater energy usage. Data centre energy use has grown by between 10% and 30% per year and now accounts for between 1% and 1.5% of global electricity consumption, according to figures from the International Energy Agency.
That means that for businesses to meaningfully reduce their environmental impact, IT leaders must take this into account and undertake deeper analysis of the efficiency of their equipment and of the tools they use to evaluate the sustainability of their data centres.
Enter Kepler.
Better understanding IT energy consumption
Kepler, or Kubernetes-based Efficient Power Level Exporter, is a project founded by Red Hat’s emerging technologies group, with early contributions from IBM Research and Intel. It is a community-driven, open-source project that captures power-use metrics across a wide range of platforms, focusing on reporting, reduction and regression so enterprises can better understand energy consumption.
Kepler uses proven cloud-native methodologies and technologies – such as extended Berkeley Packet Filter (eBPF), CPU performance counters and machine-learning models – to estimate the power consumed by workloads and export those estimates as metrics. These metrics are then used for scheduling, scaling, reporting and visualisation, arming system administrators with information on the carbon footprint of their cloud-native workloads.
The Kepler Model Server continually adjusts and fine tunes its pre-trained models using node data from Kepler’s power-estimating agents. This is how Kepler adapts its calculations to best serve the user’s unique systems and needs. With the knowledge gained from Kepler, enterprise decision-makers can better assess how to optimise energy consumption, address evolving sustainability needs and reach their organisation’s goals.
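To make the idea of power-estimation metrics concrete, the sketch below shows the kind of post-processing a consumer of Kepler-style metrics might perform: converting cumulative energy counters (reported in joules) into average power draw in watts for a container. The sample layout and figures are illustrative assumptions, not Kepler’s actual API.

```python
# Hypothetical sketch: turning cumulative energy counters (joules) into
# average power (watts), the kind of calculation a dashboard built on
# Kepler-style metrics would perform. Sample data is illustrative.

def average_watts(samples):
    """samples: list of (timestamp_seconds, cumulative_joules) tuples,
    oldest first. Returns average power in watts over the window."""
    (t0, j0), (t1, j1) = samples[0], samples[-1]
    if t1 <= t0:
        raise ValueError("need samples spanning a positive time window")
    # Joules are watt-seconds, so energy delta / time delta gives watts.
    return (j1 - j0) / (t1 - t0)

# Two readings 60 seconds apart: the counter grew by 900 J,
# so the container averaged 15 W over that window.
readings = [(0, 1_000.0), (60, 1_900.0)]
print(average_watts(readings))  # 15.0
```

In practice these counters would be scraped from a metrics endpoint and aggregated per container, pod or namespace; the arithmetic stays the same.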
The future with Kepler
Future innovations in sustainability develop faster with open source community collaboration and an upstream-first mindset. With this in mind, Red Hat is in the process of contributing Kepler to the Cloud Native Computing Foundation sandbox, where contributors explore how to integrate Kepler into their own use cases.
Kepler can enable a host of new innovations in the open-source community that allow service providers to better observe, analyse, optimise and document power consumption of cloud native applications, including:
- Power consumption reporting
Kepler metrics are time series, which means they can be used to build dashboards that present power consumption at a variety of levels, including containers, pods, namespaces or different compute nodes in the cluster.
- Carbon footprint
Kepler’s energy consumption metrics can be coupled by the user with the data centre’s power usage effectiveness (PUE) and the grid’s electricity carbon intensity to calculate the estimated carbon footprint of a workload.
- Power-aware workload scheduling and auto-scaling
Kepler metrics can be used by a Kubernetes scheduler to place an upcoming workload on the compute node projected to deliver the best performance per watt, ultimately reducing cluster-level power consumption. Similarly, Kubernetes auto-scalers can use Kepler’s power consumption metrics in their auto-scaling algorithms to determine the resources needed for better energy efficiency.
- CI/CD pipelines
Kepler can also be used in the software development lifecycle to help produce more sustainable software products. For example, Kepler can be deployed in continuous integration and continuous delivery (CI/CD) pipelines for software testing and release, where its power consumption metrics help developers measure, analyse and optimise their software stacks.
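The carbon-footprint and scheduling ideas above can be sketched in a few lines. This is an illustrative example under stated assumptions – the function names, node figures and energy values are hypothetical, not part of the Kepler project – showing the arithmetic: measured energy scaled by PUE and grid carbon intensity, and node selection by projected performance per watt.

```python
# Illustrative sketches of two uses of Kepler-style power metrics.
# All names and numbers are assumptions for demonstration only.

def carbon_footprint_grams(energy_kwh, pue, grid_intensity_g_per_kwh):
    """Estimated workload carbon footprint in grams of CO2:
    measured IT energy, scaled up to total facility energy by the
    data centre's power usage effectiveness (PUE), multiplied by
    the grid's carbon intensity (gCO2 per kWh)."""
    return energy_kwh * pue * grid_intensity_g_per_kwh

def best_node_per_watt(nodes):
    """Pick the node with the highest projected performance per watt,
    as a power-aware scheduler plugin might. `nodes` maps node name
    to (projected_throughput, projected_watts)."""
    return max(nodes, key=lambda name: nodes[name][0] / nodes[name][1])

# A workload that drew 12 kWh in a facility with PUE 1.5, on a grid
# emitting 400 gCO2/kWh -> 7200 g (7.2 kg) of CO2.
print(carbon_footprint_grams(12, 1.5, 400))  # 7200.0

# node-b delivers more work per watt (5.0 vs 4.0), so a power-aware
# scheduler would prefer it for the upcoming workload.
candidates = {"node-a": (1000, 250), "node-b": (900, 180)}
print(best_node_per_watt(candidates))  # node-b
```

A real scheduler or reporting pipeline would pull these inputs from live metrics and regional carbon-intensity feeds, but the underlying calculations are this simple.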
Get involved with the Kepler project via GitHub and learn more on Red Hat’s Emerging Technologies blog.
