Is Big News Too Big For Some?
Published on 12/06/2012 by The Brisbane Times
It is going to take some nous to make sense of the business intelligence in big data.
Companies will struggle to replicate the success of tech giants Google, Amazon, and Apple when it comes to garnering commercial insights from vast amounts of expanding data, say those at the crunching face of business intelligence.
The data analytics trend that began with the humble spreadsheet has more recently manifested itself as ‘big data’, in which companies harvest hundreds of petabytes of information from a wide range of systems and sources, and generate new commercial insights by applying different techniques, models, and algorithms.
It has fuelled the success of tech giants such as Google and Amazon, and software vendors have cashed in by packaging the idea in new products and marketing hype.
However, it is still early days for the industry, according to Western Health business intelligence (BI) manager Mathew Long who oversaw the installation of BI applications to facilitate the company’s corporate and regulatory reporting requirements, primarily linked to government funding and legislation.
“In a lot of corporate environments data would be naturally segregated into their own silos. One process might be captured in one system, another might be captured in a separate system, and whilst these can be displayed side-by-side, they’re not necessarily cross-linked.”
Traditional operating environments are adequately served by current business intelligence tools, he said.
“What we’ve tried to do with our BI is break down these barriers and provide more integrated silos of information.”
“[Big data] is certainly one of the buzzwords at the moment. We’re coming at it more from the business intelligence space, not so much around the big data, and whilst that is somewhere we could go in the future we’re definitely not there yet.”
At the forefront of the big data movement is Kaggle, founded in Melbourne by former Treasury and Reserve Bank of Australia statistician Anthony Goldbloom. It uses the lure of big prizes to cultivate a community of data analysts and scientists who compete to create efficient data models and algorithms that solve a problem for an organisation or company.
Goldbloom said “big data” is primarily a “plaything” of large organisations, but he believes that one day this technology will be accessed by the masses.
“We know we’re moving into an age of big data [because] the applications have moved from helping banks predict who’s going to default on a loan and helping insurance companies predict who’s going to crash their car.”
“I like the William Gibson quote ‘the future is already here; it’s just not evenly distributed’. At the moment, we’re seeing the most innovative companies, organisations and governments relying on data-driven insights. As we realise more positive case studies, the wholesale use of big data will become more adventurous.”
Microsoft technical fellow David Campbell is overseeing the software vendor’s research into big data, and strategies to bring it to market via its database product SQL Server 2012. He admits there are barriers to uptake and to making it relevant for companies of all sizes. He said the market is only a quarter of the way into the transition from existing technologies, such as BI, to true big data.
One of the killer applications of big data, or ‘ambient data’ as he calls it, is to use different models and algorithms to derive new insights from historical data, for example decades-old clinical trial information for big pharmaceutical and clinical companies.
He said ‘ambient data’ could make sense of old, disparate sources of data stored in a single repository or container, a process akin to putting all your old cards and receipts into a shoebox.
“A lot of people start by saying I don’t have [data captured] in the right way to derive these insights, it’s such a new area. You can take hundreds of petabytes of data, do some analysis on the patient data, which can alter the course of treatment.”