Data can be key to solving business problems, but only if decision-makers have the ability to access it while it’s still relevant. Finding out what action should be taken now is more important than determining what action should have been taken yesterday. CDOs are under pressure to manage vast amounts of data at an unprecedented pace: delivering data and insights in real-time matters across every type of business and in every aspect of those businesses.
Product teams that get usage insights in seconds instead of days can increase user adoption and reduce churn. Stock and cryptocurrency analysts who can stream and analyse trading data instantaneously have an advantage over their market peers in identifying the most profitable trades. Ecommerce stores that run analytics with millisecond latency can instantly personalise the shopping experience delivered through a seamless UX, boosting conversion rates and revenues. Regardless of industry or product, real-time analytics improves the KPIs that data leaders - and their stakeholders - care about.
But getting to the stage where product development teams can make use of data at scale is easier said than done. Businesses process huge volumes of data each day from numerous sources, stored in multiple ways by multiple people across the organisation. Data warehouses are well equipped to store unified data that supports specific analytics needs, but handoffs and delays are often hard to avoid while data is being batched and processed. Even small delays can have ripple effects that degrade the user experience; acting on analytical queries in real-time becomes untenable, and the value of the insights they provide drops as the minutes tick by.
Data leaders must rise to the challenge of building or rebuilding data infrastructure that ensures everyone can get the data they need when they need it. In theory, closing the gap between analytics and action can be achieved internally. The instinct may be to throw data engineers, time, and money at the problem in the hopes of improving latency or concurrency metrics. But this also means diverting substantial resources away from the actual business of the business - and away from improving the product.
Jorge Gomez Sancha, co-founder and CEO of Tinybird, knows that engineers are often used to managing their own data infrastructure, which brings with it a host of things to think about, from basic server configuration to the intricacies of developing secure, low-latency, high-concurrency endpoints.
“‘What’s the right CPU level, the right number of CPU cores, or amount of memory? Should I use replication or sharding? How do I keep track of whether everything is up and running?’ They’re thinking ‘we need a database that we then build things on top of’ versus ‘we need tools to solve our problems with data’,” says Sancha. “Your developers shouldn’t even need to think about any of that; they should be thinking ‘What’s the next business problem I can solve to provide value?’” Sancha believes that data leaders should focus on enabling developers to apply analytics within the products they build in as close to real-time as possible.
In many cases, the problems data leaders need to solve in order to give developers access to real-time analytics are not unique. Almost every CDO or head of data is confronting or has confronted the challenge of building real-time data architecture. But why is it so challenging to solve? Data warehouses weren’t built for real-time, but data teams have spent the last decade investing in infrastructure and tooling that surrounds and supports the data warehouse. To solve for real-time use cases, data leaders will need to branch out.
But why start from scratch when tools like Tinybird are already solving the problem? Increased performance and precision are core goals for any CDO, and streamlining these processes requires the right set-up. Data teams that use a serverless approach with best-of-breed software can integrate with existing streams, databases, and warehouses, process that data through an optimised and simplified data stack, and then provide near-instantaneous access to the results on the other side - often in a way that complements the traditional data warehouse. These serverless tools can plug into dashboards, trigger alerts, power automation, or feed into whatever other data products the business uses.
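To make the pattern concrete, the sketch below shows roughly what this looks like from a developer’s point of view, assuming a Tinybird-style setup: events are pushed to a managed ingestion endpoint as they happen, and a published, parameterised API serves aggregated results with low latency. The host, data source name, pipe name, token, and response shape are illustrative placeholders, not a definitive integration.

```python
# Illustrative sketch of the serverless real-time pattern described above.
# All names (host, data source, pipe, token, response keys) are placeholders
# for a Tinybird-style setup; consult the product docs for the exact API.
import json
import requests

API_HOST = "https://api.example-realtime.co"   # placeholder host
TOKEN = "YOUR_INGEST_OR_PIPE_TOKEN"            # placeholder credential

def ingest_event(event: dict) -> None:
    """Push a single event to a streaming ingestion endpoint as it happens."""
    requests.post(
        f"{API_HOST}/v0/events",
        params={"name": "page_views"},          # target data source (placeholder)
        data=json.dumps(event),
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=5,
    ).raise_for_status()

def top_products_last_minute() -> list[dict]:
    """Query a published analytics endpoint that aggregates the same stream."""
    resp = requests.get(
        f"{API_HOST}/v0/pipes/top_products.json",  # published pipe (placeholder)
        params={"token": TOKEN, "minutes": 1},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["data"]

if __name__ == "__main__":
    ingest_event({"timestamp": "2024-01-01T12:00:00Z",
                  "product_id": "sku-42", "action": "view"})
    print(top_products_last_minute())
```

The point of the pattern is that the developer writes the query and calls the endpoint; provisioning, scaling, and concurrency are handled by the managed service rather than by the product team.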
Sancha points to Keyrock, a Tinybird customer operating in the cryptocurrency market, as a prime example: “They ingest data from markets all across the world, in crypto but also in other assets, and then they are constantly making bets as to where things are going and creating transactions.” Inevitably this involves a massive amount of data, and Keyrock was battling a host of latency, freshness and concurrency issues. Data took many seconds or even minutes to process and hand off, and some of it went missing entirely, making it challenging to analyse and act on that data in real-time. Attempting to fix these issues internally was costly and time-consuming; using Tinybird proved a far more efficient solution.
Data teams that add real-time architecture to their stack often discover opportunities to apply the tech to new, tangential use cases that have sat unresolved. Sancha remarks on a client who had initially transitioned to a real-time architecture to reduce the lag time in their analytics platform. This shed light on a new solution to an old problem - could real-time analytics be the key to identifying and preventing denial-of-service attacks on their services? Implementing instant logging and analytics enabled them to turn hours of work into an automated process that could effectively intercept and respond to attacks within minutes. Once the value of real-time data becomes evident in one area, different teams across the business will find that they rapidly develop new ways to apply real-time analytics across everything they do.
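As a rough illustration of that denial-of-service use case (the client’s actual implementation is not described here), the sketch below flags IP addresses whose request rate over a one-minute sliding window exceeds a threshold; the window length and threshold are arbitrary assumptions, and a production system would run this kind of aggregation inside the real-time analytics layer rather than in application code.

```python
# Toy sketch of the DoS-detection idea described above: count requests per IP
# over a sliding window of recent log events and flag sudden spikes.
# The 60-second window and 1,000-request threshold are arbitrary assumptions.
from collections import defaultdict, deque
from datetime import datetime, timedelta

WINDOW = timedelta(seconds=60)
THRESHOLD = 1000  # requests per IP per window before we alert

recent: dict[str, deque] = defaultdict(deque)  # ip -> timestamps inside the window

def record_request(ip: str, ts: datetime) -> bool:
    """Record one request; return True if this IP now looks like an attacker."""
    timestamps = recent[ip]
    timestamps.append(ts)
    # Drop events that have fallen out of the sliding window.
    while timestamps and ts - timestamps[0] > WINDOW:
        timestamps.popleft()
    return len(timestamps) > THRESHOLD

if __name__ == "__main__":
    now = datetime.utcnow()
    flagged = False
    for i in range(1500):  # simulate a burst from a single address
        flagged = record_request("203.0.113.7", now + timedelta(milliseconds=i))
    print("block 203.0.113.7" if flagged else "traffic looks normal")
```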
Data leaders need to give developers the tools to make actionable use of the data pouring into their databases at a moment’s notice; waiting minutes to understand what’s happening while competitors wait seconds or milliseconds is not a sustainable option. They need real-time data architectures with an emphasis on simplicity and performance. Having the data to make critical business decisions isn’t enough; developers and data teams need to be able to build data products that have a real-time impact at any scale. CDOs and heads of data need to act now to realise this, as the growth of their data is only accelerating.
For more information please visit tinybird.co
Promoted by Tinybird