Harnessing The Power Of The Data Deluge

Big Data

All too often, storage is simply taken for granted. After all, the price of 1GB of storage has dropped from approximately $100 in 1997 to around $0.10 last year, and with more options for purchasing storage than ever before – cloud or in-house, tape, SSD or HDD – one might be forgiven for thinking that storage ‘just works’.

But does it? According to Cisco, total IP traffic in Central and Eastern Europe will reach 3.7 Exabytes (3,700 Petabytes) per month by 2015. It may seem obvious, but much of this data will have to be stored somewhere. Moreover, this figure does not include the vast amount of information already in storage today.

To give a practical example, Lancashire Constabulary currently uses a CCTV system which records car number plate details. Each snapshot is only 25KB. However, across the entire estate, the system can record in excess of 200GB per day on traffic monitoring alone.

If all 48 county police forces used such a system, it could generate around 9TB of data every day: 3PB a year. With data retention laws in force – and new laws coming into play – there is only going to be more strain on storage systems.
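A quick back-of-envelope check of these figures, sketched in Python. The per-force daily volume is the Lancashire figure quoted above; the national totals are simple extrapolations, and they come out marginally above the article's rounded numbers:

```python
# Back-of-envelope sizing for the number-plate example, using the
# article's figures: 25KB snapshots, ~200GB/day per force, 48 forces.

SNAPSHOT_KB = 25            # one number-plate snapshot
GB_PER_FORCE_PER_DAY = 200  # daily volume for one force (article figure)
FORCES = 48                 # county police forces (article figure)

daily_tb = GB_PER_FORCE_PER_DAY * FORCES / 1000   # GB -> TB
yearly_pb = daily_tb * 365 / 1000                 # TB -> PB
snaps_per_day = GB_PER_FORCE_PER_DAY * 1_000_000 // SNAPSHOT_KB

print(f"~{daily_tb:.1f}TB per day, ~{yearly_pb:.1f}PB per year")
print(f"~{snaps_per_day:,} snapshots per force per day")
```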

A Compound Problem

Technologically, we also face additional problems. Storage is becoming increasingly fragmented, as companies simply acquire more low-cost storage in a piecemeal way rather than unifying their storage technologies. The problem with this piecemeal approach is that management costs rapidly begin to outstrip the costs of purchase and ownership. Indeed, traditional approaches to optimising IT are beginning to reach the limits of what is possible.

So the problem is one not only of size, but also of strategy. With the cost of managing data rising fast, simply adding more disc drives to the estate will soon be ineffective. As an analogy, this could be compared to buying more warehouse space to cope with a growing collection of library books – without adding some kind of filing system, you won’t be able to find anything.

Businesses could find themselves struggling with a mountain of duplicated data – staff holiday snaps stored on company storage, then backed up and retained for seven years. With this in mind, it is easy to see how the costs of storing and managing this data can soon spiral out of control.

Bridging the Gap

Fortunately, there is a range of solutions to the problem. Organisations need to take a holistic approach to storage to maximise the benefits; otherwise, teams will simply chip away at the problem with little hope of solving it.

There are a number of tactics which organisations can use to manage the data deluge, and they are most effective when deployed together. By taking a unified approach, IT teams can reduce unnecessarily stored (i.e. duplicated) data, ensure that only critical information sits on high-tier storage, reduce the overall storage footprint and increase agility.
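Deduplication, the first of those tactics, can be reduced to a simple idea: store identical content once and let every duplicate cost only a pointer. A minimal sketch follows – illustrative only, since production deduplication typically works at block or sub-file level, and the file names and users here are invented:

```python
import hashlib

store = {}       # content hash -> the single stored copy
references = {}  # file path -> content hash (a cheap pointer)

def save(path: str, data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:
        store[digest] = data    # first copy: store the actual bytes
    references[path] = digest   # every duplicate costs only a pointer

# The same holiday snap saved by three different users:
photo = b"...jpeg bytes..."
for user in ("alice", "bob", "carol"):
    save(f"/home/{user}/beach.jpg", photo)

print(f"{len(references)} files, {len(store)} stored copy")
```

Three logical files, one physical copy – exactly the saving that stops the ‘holiday snaps backed up for seven years’ problem from multiplying.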

Naturally, IT decision makers will need to weigh up other considerations, such as the cost and performance of storage hardware. Once teams have embarked on virtualisation, compression and automated tiering projects, the growth of both management costs and infrastructure costs will begin to slow. Only then can organisations begin to harness their data, rather than simply coping with it.
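Automated tiering, one of the projects mentioned above, boils down to a policy: recently accessed data stays on fast, expensive storage; cold data moves to a cheap tier. A toy sketch of such a policy, with invented file names and a hypothetical 30-day threshold:

```python
import time

HOT_WINDOW_DAYS = 30  # hypothetical policy: untouched 30+ days -> archive
DAY = 86_400          # seconds in a day

now = time.time()
last_access = {
    "crm.db": now,                         # touched today: keep it fast
    "holiday_snaps.zip": now - 200 * DAY,  # cold for 200 days: archive it
}

def tier(ts: float) -> str:
    """Place a file on a tier based on its last-access timestamp."""
    return "fast" if now - ts < HOT_WINDOW_DAYS * DAY else "archive"

placement = {name: tier(ts) for name, ts in last_access.items()}
print(placement)  # {'crm.db': 'fast', 'holiday_snaps.zip': 'archive'}
```

Real tiering engines add hysteresis, access-frequency weighting and per-tier cost models, but the principle is the same: the policy, not a human, decides where each byte lives.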

Surfing on the Data Flood

Data is undoubtedly one of an organisation’s most valuable assets – after its employees, of course. After all, most organisational data is useful information – it is simply over-replicated and inaccessible. Through good management practices, this information can be turned into intelligence, supporting and driving the organisation.

This intelligence can be the differentiator which pushes an organisation ahead of its competition. By studying customer behaviour through CRM and sales data, by looking at employee activity patterns or marketing successes, companies can gain useful, actionable feedback on all of their past activities.

This allows them to refine their business strategies and change direction, shaping their customer engagement and new products and services, competing effectively and thriving in the market landscape. With this in mind, teams which do harness the power of the data deluge could find their entire organisation supercharged by information, rather than hampered by the flood.


Bill McGloin, who started work at Computacenter in December 2001 as a technology leader for the services unit, is currently practice leader for storage and data optimisation.