Go back 60 years, and few could have predicted the dramatic impact IT has had on our lives. IBM president Thomas Watson reportedly predicted the world would need only five computers. While this statement has often been mocked, I doubt anyone would have guessed that the consumerisation of technology would lead to there currently being an estimated 2.35 billion connected devices.
In addition, while the developed world may have had its fill of PCs, tablets and smartphones, the revolution is only just beginning in the developing world, which will further expand upon the ‘internet of things’ – with even more devices generating data.
Global data levels originally took about a century to double, but these are now doubling every 14 months. At this rate, global data levels are set to exceed 40 zettabytes by 2020 – the equivalent of 5,200GB of data for every man, woman and child on Earth.
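As a rough sanity check on those figures (assuming 1 ZB = 10^12 GB and a world population of roughly 7.7 billion, both assumptions rather than figures from the original source):

```python
# Sanity-check the "5,200GB per person" claim.
# Assumptions: 40 ZB of global data, ~7.7 billion people.
ZETTABYTE_IN_GB = 10**12  # 1 ZB = 10^21 bytes = 10^12 GB

total_data_gb = 40 * ZETTABYTE_IN_GB
population = 7.7e9  # approximate world population (assumption)

gb_per_person = total_data_gb / population
print(f"{gb_per_person:,.0f} GB per person")  # roughly 5,200 GB
```

The result lands close to the article's 5,200GB figure, which suggests it was derived from a population estimate of around 7.7 billion.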
However, unexpected growth often leads to unforeseen problems. While data analytics techniques have been used for decades, what we are dealing with now is on a much vaster scale. New techniques are necessary if we are going to be able to cope with the data this network of devices is generating.
It’s an issue that concerns businesses and consumers alike. Both stand to benefit greatly if they can find effective ways of processing this data. Although the quantities of data are daunting, ‘the humanisation of IT’ could provide both the next stage of technological development and the answer to our analytical challenges, resulting in an influx of technology that requires little or no training to use and suits our natural analytical abilities.
We often forget that all humans possess a natural ability to analyse data when presented with information in the right way. Outside of technology, humans are automatically processing thousands of pieces of information every second. Think of a hunter foraging for food in a forest, continuously making a multitude of observations and decisions to find what they need.
They will instantly distinguish between edible and inedible fruit, locate the conditions in which mushrooms grow best, know which are poisonous, and judge where the much-sought-after deer is likely to be found. Although a primitive analogy, this acknowledges the natural processes our brains use to digest information: the processes of association, comparison and anticipation. It is these effective processes which, to a certain extent, are not accounted for in modern data analysis and analytics.
The human brain strives to make associations; we are constantly categorising and connecting, searching out both the important features and the warning outliers. The hunter knows to look under oak trees for mushrooms as previous experience has taught him that they often grow there. These associations are only the first step.
The hunter does not settle for the first batch of mushrooms, but will find others and compare them against what they have already gathered, deciding, for example, whether they are bigger or smaller. They may also draw on past experience to know where the ripest fruit grows or where the deer herd will be grazing at a particular time of day.
Just as our past experiences enable us to make sense of the present, they also help us to anticipate the future. The hunter knows to eat certain types of food because experience has taught them that eating those foods aids survival.
Every one of us uses these natural analytical tools in our everyday lives without even noticing them. Furthermore, these capabilities also help us in the world of business; they are what we use to make sense of complex problems. However, most of the technology we have access to doesn’t complement or extend these natural abilities.
Despite the consumerisation process, tech in the business world is still needlessly complicated – designed by experts for experts. If you think about tech initiatives at work, how many require training before full use can be made of the product or service? Consider Excel, for example: a universally used programme with functions that the vast majority of users never make use of. The problem with complicated technology is that it ultimately hinders productivity at user level, to the detriment of the business.
By humanising IT processes, business leaders could empower the whole workforce with the ability to analyse large data sets, not just a select technical few. Google’s search engine is a great example of what is possible – a kind of technology that appeals to our natural senses. Whilst there are hidden complex algorithms that allow it to process huge quantities of global data, the intuitive interface can be used by practically anyone.
Smartphones and tablets have also helped to make IT more accessible, providing another step towards more natural data analysis. Touch and swipe gestures on a screen, for example, are far more instinctive and usable than the old-fashioned mouse and monitor that used to dominate consumer technology.
By using the tools that are already out there and understanding natural behaviour, businesses could provide humanised systems that enhance our innate analytical skills. The rewards of making greater sense of company, customer or user data are great and can be extended far beyond the corporate world – in the future, we may be able to use these tools to help solve bigger problems or even social issues such as healthcare or poverty. If we do not adapt however, we will remain submerged in the ever-growing data deluge.