Big data analytics: Tackling the historical data challenge
Big data is noisy, messy and frequently inconsistent. To make sense of it, companies are investing time and money in an array of technology solutions. However, some much-touted approaches are costly and have not yet addressed the expanding set of historical information that companies want to mine.
A solution to these shortcomings is to take a hybrid approach – combining the performance and speed of in-memory processing with the vast data storage capabilities that a traditional on-disk approach offers.
The challenge: speed versus space
To get meaningful results from big data, companies need to slice and dice their data in many different ways to work out which parts are worth using. Considering the enormity of the task, for many, this requires – above all – speed.
One approach to dealing with this data is in-memory computing, which delivers both performance and speed, allowing companies to analyze data dynamically and quickly. For many companies, however, it is not always feasible, because of the high cost of keeping and running huge data sets in memory.
“Combine the performance and speed of in-memory processing with the vast data storage capabilities that a traditional on-disk approach offers”
Hardware makers have been building machines with more and more memory, but even when the memory and storage problems are solved, those using more traditional database approaches to work with historical data will often struggle with speed at scale.
A hybrid solution
For years, the financial sector has successfully combined in-memory processing with on-disk storage to manage trade data, which adds up to billions of records per day, in a fast yet efficient manner. Other industries facing similar data volumes can learn from this example.
With a more hybrid approach, companies can combine the high performance and speed capabilities of in-memory while solving the storage issues by putting the vast historical data sets on disk. By bridging available technologies, companies can deliver on all counts – including cost.
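To make the hybrid approach concrete, here is a minimal sketch in Python of the tiering idea: recent "hot" records are held in memory for fast access, while older records are flushed to an on-disk store, and queries transparently combine both tiers. The class name, schema and SQLite back end are illustrative assumptions, not a reference to any particular product.

```python
import sqlite3

class HybridStore:
    """Illustrative hybrid store: a hot in-memory tier plus a cold on-disk tier."""

    def __init__(self, db_path=":memory:"):
        # ":memory:" keeps this demo self-contained; in production the
        # cold tier would be a real file (or any on-disk database).
        self.hot = []  # in-memory tier: list of (day, symbol, price) records
        self.disk = sqlite3.connect(db_path)
        self.disk.execute(
            "CREATE TABLE IF NOT EXISTS trades (day INTEGER, symbol TEXT, price REAL)"
        )

    def insert(self, day, symbol, price):
        # New records always land in the fast in-memory tier first.
        self.hot.append((day, symbol, price))

    def flush(self, keep_days=1):
        # Move everything older than the newest `keep_days` days to disk.
        if not self.hot:
            return
        cutoff = max(r[0] for r in self.hot) - keep_days + 1
        cold = [r for r in self.hot if r[0] < cutoff]
        self.hot = [r for r in self.hot if r[0] >= cutoff]
        self.disk.executemany("INSERT INTO trades VALUES (?, ?, ?)", cold)
        self.disk.commit()

    def avg_price(self, symbol):
        # A query spans both tiers, so callers see one logical data set.
        rows = self.disk.execute(
            "SELECT price FROM trades WHERE symbol = ?", (symbol,)
        ).fetchall()
        prices = [p for (p,) in rows] + [p for (d, s, p) in self.hot if s == symbol]
        return sum(prices) / len(prices) if prices else None
```

A short usage run: insert two days of trades, flush day one to disk, and query across both tiers:

```python
store = HybridStore()
store.insert(1, "ABC", 10.0)
store.insert(2, "ABC", 12.0)
store.flush(keep_days=1)       # day-1 record moves to the on-disk tier
print(store.avg_price("ABC"))  # averages over both tiers
```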
Crucially, by folding a high-performance programming language in with the data, users can interact directly with their data in one place. This puts a super-charged in-memory and on-disk historical database at their fingertips, with the ability to deliver results at speeds, and at levels of complexity, previously unavailable. In a world where the most interesting information is also the trickiest to manage, this is without doubt the Holy Grail of data management.