Don’t Just Collect Data, Analyse It And Use It

Using Big Data

The rate and efficiency with which companies collect Big Data are constantly increasing. As a result, the challenge of processing that data and distributing analysis of it to users is becoming unmanageable. A new approach to analysis is needed.

The Big Data problem is accelerating as companies get better at collecting and storing information that might bring business value through insight or improved customer experiences. Extracting that insight used to be the responsibility of a small, specialist group of analysts, but this is no longer the case.

We are standing at a nexus between Big Data and the demands of thousands of users – something we call “global scale analytics”. The old architectural approaches are no longer up to the task, and this new problem needs radical new technology. If companies continue with the old approach, Big Data will fail to reach its true potential and simply become a big problem.

Analytics applications now regularly serve the needs of thousands of employees; a single employee may need access to hundreds of visualisations, reports and dashboards to do their job. The application must be ready for a query at any time, from any location, and results must be served with ‘Google-like’ response times; users’ experience of the web is the benchmark by which they judge application responsiveness in the work environment.

There are some great point solutions, but the problem is the integration between every part of the stack – a stack only performs as well as its weakest link. The industry has been working to solve only half the problem – data collection and storage – rather than looking at the full picture, which also includes analytics and visualisation. Loosely coupled stacks scale poorly and carry a huge management and resource overhead for IT departments, making them uneconomical and inflexible.

Solving the end-to-end Big Data analytics problem requires an architecture that tightly integrates every level of the analytics stack, taking advantage of the commoditisation of computing hardware to scale with near-perfect linearity and economies of scale, and to deliver sub-second response times on multi-terabyte datasets.

Whatever your application, if you have thousands of users, exploding data collection, highly dimensional data, complex visualisation or a globally distributed user base, the Big Data problem will keep getting bigger – and with every day it grows, you are playing a game of diminishing returns. Businesses need to make their analysis as efficient as their data gathering. We are in a new era of data exploration that demands a jump in the scale and performance of analytics applications to achieve global scale analytics. After all, what’s the point in collecting all that data if you can’t use it?

Kevin Spurway

Kevin Spurway is responsible for marketing MicroStrategy’s Analytics solutions. Before joining MicroStrategy in 2010, Kevin was Chief Marketing Officer at Altosoft, a business intelligence software company (acquired by Kofax). Prior to that, he was Vice President of Marketing for Appian, a recognised leader in the business process management (BPM) market. Kevin is a graduate of Dartmouth College and Harvard Law School.