Companies are drowning in data, and most have only a partial view of their own operations. But integrating information into clearly defined file formats and layouts gives businesses the ability to understand their processes and make informed decisions.
Organisations are realising the urgency of using integration and access software – defined by analyst house IDC as enabling the access, blending and movement of data among multiple sources – and by next year global spending is set to hit $6 billion (£3.9 billion).
“Companies have many sources and different funnels of information listed in multiple formats,” says Alys Woodward, IDC research director, adding that this has to be addressed.
Philip Howard, research director at Bloor, adds: “Data integration is like a road that connects two cities. If you need two applications to work together, they must be able to communicate.”
In one classic form of integration, a company may pull together information on customers, a pipeline of sales, product types and stock availability. If information is kept in standard formats, collation is easier, but the growing complexity of data is quickly putting companies on the back foot.
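To make that concrete, the sketch below shows the kind of collation described above, assuming three hypothetical CSV extracts for customers, the sales pipeline and stock that happen to share simple key columns; real source systems rarely line up so neatly.

```python
# A minimal sketch of pulling three sources into one integrated view.
# The file names and columns here are hypothetical, for illustration only.
import pandas as pd

customers = pd.read_csv("customers.csv")   # customer_id, name, region
pipeline = pd.read_csv("pipeline.csv")     # opportunity_id, customer_id, product_id, stage, value
stock = pd.read_csv("stock.csv")           # product_id, units_available

# Join the sources into a single view keyed on customer and product.
integrated = (
    pipeline
    .merge(customers, on="customer_id", how="left")
    .merge(stock, on="product_id", how="left")
)

print(integrated.head())
```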
“Businesses have to create an integrated view, otherwise it’s not clear how well they are performing, how to mitigate risks and how to enable people to collaborate,” explains Ted Friedman, a vice president at Gartner.
Many firms begin on the integration path by writing their own code. As the scale of the task becomes apparent, they turn to more advanced tools that graphically map and automate the flow of data.
Regulated industries, such as financial services and health, tend to be ahead of the curve in using data integration tools, Ms Woodward argues, “because of the requirement to demonstrate what they are doing”. Telecoms and retail, facing vast volumes of data, have also taken large strides.
One company to have pulled together its many sources of information is large US hospital alliance Premier. Keith Figlioli, senior vice president, says that given the prevailing shift to payment based on quality and efficiency, its 3,400 hospital members have to analyse complex financial, operational and clinical data quickly.
The alliance developed a single platform for near-real-time information, processing 3,000 data transactions a second. “We now have the ability to take data from any of our providers’ transactional source systems and use it to provide truly integrated analytics,” Mr Figlioli says.
In the UK, car manufacturer Honda had a serious challenge as its 200 dealers used myriad management systems and file formats. It drew together and formatted the information so that it could better understand customers and sales, and predict part requirements.
Many businesses favour an approach called ETL (extract, transform and load): selectively pulling data from various databases, transforming it into a common format and merging it. The operation is usually run in batches.
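A minimal Python sketch of such a batch ETL run, assuming two hypothetical SQLite source databases and an illustrative target table, might look like this:

```python
# A toy batch ETL run. The database files, table and column names are
# assumptions made for illustration, not a real warehouse design.
import sqlite3
import pandas as pd

def extract(db_path, query):
    """Pull rows from one source database into a DataFrame."""
    with sqlite3.connect(db_path) as conn:
        return pd.read_sql_query(query, conn)

def transform(frame):
    """Normalise a source into the common format: one schema, consistent types."""
    frame = frame.rename(columns=str.lower)
    frame["amount"] = frame["amount"].astype(float)
    frame["sale_date"] = pd.to_datetime(frame["sale_date"]).dt.date.astype(str)
    return frame[["sale_id", "sale_date", "amount"]]

def load(frame, db_path, table):
    """Append the combined batch to the target store."""
    with sqlite3.connect(db_path) as conn:
        frame.to_sql(table, conn, if_exists="append", index=False)

# Batch run: extract from each source, transform to the shared schema, load once.
sources = ["sales_emea.db", "sales_apac.db"]
batch = pd.concat(
    [transform(extract(path, "SELECT * FROM sales")) for path in sources],
    ignore_index=True,
)
load(batch, "warehouse.db", "sales_combined")
```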
More recent systems also help companies manage the different interfaces between applications and visually map integration from a source database to the target. Equally, new technology can take unconnected data and learn relationships.
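To give a flavour of what sits behind those visual interfaces, the sketch below shows a declarative mapping from source fields to a target schema; the field names are invented for illustration and any real tool would generate something far richer.

```python
# A hypothetical source-to-target field mapping of the kind such tools
# maintain behind their drag-and-drop interfaces.
FIELD_MAP = {
    "cust_no": "customer_id",
    "cust_nm": "customer_name",
    "ord_val": "order_value",
}

def map_record(source_row: dict) -> dict:
    """Rename source fields to the target schema, dropping anything unmapped."""
    return {target: source_row[src] for src, target in FIELD_MAP.items() if src in source_row}

print(map_record({"cust_no": 1042, "cust_nm": "Acme Ltd", "ord_val": "1999.50", "legacy_flag": "Y"}))
```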
As the technology develops, so does the severity of the problem it has to tackle. The continued growth in unstructured information, as companies build huge “lakes” of data, leaves them with repositories of files that are rarely formatted consistently or easy to link.
This leads to serious accuracy concerns, according to Gartner’s Mr Friedman. “Companies need to look beyond the plumbing and think about the quality of what travels through those pipes,” he says.
Focused investment in information governance and quality is the answer. “Having more technology to help the process around data governance and master data management is the key,” says Mr Figlioli. The “messy” data in the healthcare industry, he explains, is prompting large governance investment.
Another challenge is the struggle to analyse the data quickly. There is a growing appetite for real-time information, particularly when a customer is on the phone or a website needs to create a recommendation. Correctly assembled data, combined with predictive analytics, are crucial.
The situation is further complicated by the growth in cloud computing, which has led to data being held outside company walls. IDC’s Ms Woodward warns: “Companies need to understand it may cost them to access the data layer from the cloud provider; or worse, they may not be able to access it at all.”
The expansion in connected home, business and city devices as part of the internet of things is scattering large amounts of extra data across many more locations. “It will hugely ramp up the sort of technology required,” says Bloor’s Mr Howard, “and many of the connected devices have their own format for information.”
As the challenges increase, companies must invest in both experienced management and advanced technology if they are to pull all their data together and act on it quickly.