Data Variability Challenges Need to Be Solved
According to IBM, every day we create 2.5 quintillion bytes of data – so much that 90 percent of the data in the world today has been created in the last two years. While the data conversation has centered on volume and velocity, the industry has failed to address the variety and variability problems that come with this growth. Technology companies need to focus on developing solutions designed to solve the data variety and variability challenges that enterprises will face in the next two to four years.
The sheer influx of data is creating increased demand for data-driven solutions to the challenges associated with the data explosion. For example, the volume and variety of data coming from IT systems themselves is drowning IT departments, and the inability to apply market intelligence to this data prevents them from extracting actionable insights from their investments. To help enterprises address these challenges, we foresee a continuation of the trend from software, to software-as-a-service (SaaS), to platform-as-a-service (PaaS), to a growing need for data-as-a-service (DaaS). DaaS, for example, can provide an industrialized approach to cleaning and enriching IT data so it can be leveraged directly across all IT processes.
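The clean-and-enrich step that a DaaS offering industrializes can be sketched in a few lines. This is a minimal, hypothetical illustration, not any vendor's actual API: raw IT inventory records with messy free-text product names are normalized, deduplicated, and enriched against an assumed curated reference catalog.

```python
# Hypothetical sketch of a DaaS-style clean-and-enrich pipeline.
# The catalog, field names, and functions are illustrative assumptions.

REFERENCE_CATALOG = {
    "ms sql server": {"vendor": "Microsoft", "category": "Database"},
    "oracle db": {"vendor": "Oracle", "category": "Database"},
}

def clean(record):
    """Normalize free-text product names from raw IT inventory data."""
    return {**record, "product": record["product"].strip().lower()}

def enrich(record):
    """Attach vendor/category metadata from a curated reference catalog."""
    meta = REFERENCE_CATALOG.get(record["product"], {})
    return {**record, **meta}

def daas_pipeline(raw_records):
    """Clean, enrich, and deduplicate records so downstream IT
    processes consume ready-to-use data instead of raw feeds."""
    seen, out = set(), []
    for rec in map(enrich, map(clean, raw_records)):
        if rec["product"] not in seen:
            seen.add(rec["product"])
            out.append(rec)
    return out

raw = [
    {"product": "  MS SQL Server "},
    {"product": "ms sql server"},  # duplicate once cleaned
    {"product": "Oracle DB"},
]
print(daas_pipeline(raw))
```

The point of the sketch is the division of labor: the consuming enterprise sees only the output of `daas_pipeline`, while the provider maintains the reference catalog and cleaning rules as a service.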
The biggest challenge entrepreneurs face is that when they create innovative enterprise solutions to address a big market need, they usually fail to recognize the legacy anchor. Failing to account for it prevents their companies from becoming as pervasive as they need to be in order to succeed. When building companies and products today, entrepreneurs must account for the legacy anchor and be able to work with it; only after successfully working with it can the enterprise replace it.