Why the commoditisation of storage is good for the world

How big is the demand for data storage? That is a little like asking how smart you want to be. Consider just a couple of cases where today's aging, lock-in-based models are standing in the way of human progress.

Start with telematics and related use cases. Read any tech journal today and you learn that smart buildings, smart machines, and smart cars are smart because they generate massive amounts of data about their own performance, which is then analyzed to optimize that performance. See, for example, this article from The Economist, in which Rolls-Royce is reportedly shifting to a model in which it sells airplane engines at a loss and then makes up the money on services and support.

There’s just one problem with that model. If you stored all the data created by airplane engines across the roughly 47,000 flights flown each day (per this real-time tracker), you would have approximately 3,141 PB per day. At legacy storage prices of approximately $3,000 per TB (assuming the vendors are trying to win the deal at a discount to their list prices), that works out to roughly $9.4 billion a day. Extending the back of the envelope further suggests you would exhaust the worldwide annual supply of enterprise-class disk drives in less than a week.
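For anyone who wants to check that arithmetic, here is a minimal back-of-the-envelope sketch in Python. It uses only the figures above (47,000 flights a day, roughly 3,141 PB of engine data, ~$3,000 per TB at a legacy discount); the per-flight number it prints is simply what those inputs imply, not a measured figure.

```python
# Back-of-the-envelope check using the figures quoted above.
# All inputs come from this post; nothing here is measured data.

FLIGHTS_PER_DAY = 47_000          # per the real-time flight tracker cited above
ENGINE_DATA_PB_PER_DAY = 3_141    # rough total engine telemetry generated per day
LEGACY_PRICE_PER_TB = 3_000       # USD per TB, assumed discounted legacy price

TB_PER_PB = 1_000

total_tb_per_day = ENGINE_DATA_PB_PER_DAY * TB_PER_PB
daily_storage_cost = total_tb_per_day * LEGACY_PRICE_PER_TB
implied_tb_per_flight = total_tb_per_day / FLIGHTS_PER_DAY

print(f"Storage needed per day:  {total_tb_per_day:,} TB")
print(f"Cost at legacy prices:   ${daily_storage_cost / 1e9:,.1f} billion per day")
print(f"Implied data per flight: ~{implied_tb_per_flight:,.0f} TB")
```

Run it and you get roughly $9.4 billion per day, which is exactly why this data is not kept.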

And yet, wouldn’t you like someone to be mining the performance data of aircraft engines? As an all-too-frequent flier, I know I would, if only to assuage my guilt at the brown smoke they belch on takeoff. Instead, this data is simply deleted.

Take a look at the data requirements of:

  • Storing up-to-date MRIs for a reasonable percentage of the population, perhaps adding the several TB of data it takes to store a single human genome. You quickly get to 1,000 PB, which, again, at market prices is over $3 billion (see the sketch below this list for the arithmetic). So we don’t do it. The data is deleted and we’re that much less intelligent than we could be. Unless, of course, you are a NexentaStor customer: several, from Oxford to Columbia to Max Planck and the University of Kiel, use us for MRI or genomic research today.
  • Reading and mining smart meter data. If you want to store it over time, you’re again into billions and billions of dollars. So the answer is not to store it at all, and again we’re that much less able to be smart about power usage and our energy policy.
  • Video of all types, from Dutch National Broadcasting (another NexentaStor user) to surveillance video (yes, we have that type of customer too).
  • Mobile applications and the interactions of users with these applications. See for example the projected accelerating boom in data from the launch of 4G networks.
  • Modeling behaviors of complex systems including oil spills (thanks NOAA :)).

And on and on and on.
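To make the medical imaging and genomics bullet concrete, here is the same style of sketch. It reuses only numbers that appear in this post: the ~1,000 PB corpus, the ~$3,000 per TB legacy price, and the ~$400 per TB raw hardware cost discussed further down.

```python
# Cost of holding the ~1,000 PB MRI / genomics corpus described above,
# priced at legacy rates and at raw commodity hardware cost.
# Figures are taken from this post, not from any vendor quote.

CORPUS_PB = 1_000                  # MRIs plus genomes, per the bullet above
LEGACY_PRICE_PER_TB = 3_000        # USD per TB, legacy storage at a discount
COMMODITY_HW_COST_PER_TB = 400     # USD per TB, compute + SAS disk at scale (see below)

corpus_tb = CORPUS_PB * 1_000

legacy_cost = corpus_tb * LEGACY_PRICE_PER_TB
commodity_cost = corpus_tb * COMMODITY_HW_COST_PER_TB

print(f"Legacy pricing:     ${legacy_cost / 1e9:.1f} billion")
print(f"Commodity hardware: ${commodity_cost / 1e9:.1f} billion")
```

The same corpus that costs roughly $3 billion at legacy prices is a few hundred million dollars of commodity hardware, which is the gap the rest of this post is about.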

The good news is that the underlying components of storage are becoming commoditized. As mentioned in my recent post on storage industry drivers, this commoditization includes disk drives becoming less expensive per unit of capacity, SSDs and DRAM improving at Moore’s-law-like exponential rates, and Intel’s processors dominating the market for storage processors while themselves improving at an exponential rate.

The hardware itself is fairly standard across all storage vendors, or will be shortly; the real constraint on the supply of storage is the massive mark-ups of the legacy providers. As we all know, the compute and SAS disk behind a TB of storage hardware now cost ~$400 or less at scale, and yet the market price for enterprise-class storage is rarely under $4,000 per TB.

By breaking the tight control that a handful of legacy storage vendors hold over the market, a control perpetuated by lock-in-based technologies and business models, we are improving the upfront price performance of storage by as much as 10x. That advantage only grows over time as users’ data demands grow, since legacy providers’ upgrade costs are often greater than the upfront prices they charge.
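The 10x figure is simple division on the two prices in the preceding paragraph. The short loop below is a purely illustrative sketch of how the gap compounds as capacity grows: the starting footprint and the capacity-doubling-per-year assumption are hypothetical, and only the two per-TB prices come from this post.

```python
# The upfront markup described above, plus an illustration of how it compounds.
# Only the two per-TB prices come from the post; the 100 TB starting footprint
# and the doubling growth rate are hypothetical, for illustration only.

COMMODITY_PRICE_PER_TB = 400       # USD/TB, roughly the hardware cost cited above
LEGACY_PRICE_PER_TB = 4_000        # USD/TB, typical enterprise-class market price

print(f"Upfront markup: {LEGACY_PRICE_PER_TB / COMMODITY_PRICE_PER_TB:.0f}x")

capacity_tb = 100                  # hypothetical starting footprint
legacy_spend = commodity_spend = 0.0
for year in range(1, 4):
    added_tb = capacity_tb         # hypothetical: capacity doubles each year
    legacy_spend += added_tb * LEGACY_PRICE_PER_TB
    commodity_spend += added_tb * COMMODITY_PRICE_PER_TB
    capacity_tb += added_tb
    print(f"Year {year}: legacy ${legacy_spend:,.0f} vs commodity ${commodity_spend:,.0f}")
```

Note that this conservatively ignores the upgrade premium mentioned above; legacy expansion capacity is often priced above the original per-TB purchase price, which only widens the gap.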

In short, as we step away from the “crack” pricing models of legacy storage vendors, we will become better able to make sense of the world around us, i.e., more intelligent.

What other amazing opportunities for getting more intelligent are we missing because of the hold of legacy storage? Suggestions and comments are welcome. We’ll give the best suggestion some sort of award – maybe 4 TB of free storage, for example!


Evan Powell is CEO at Nexenta. Evan is an entrepreneur with broad experience in building software and service companies. Most recently he was founding CEO and then VP of Marketing and Business Development at Clarus Systems, the leading provider of IP Communications management software to Global 2000 enterprises. Prior to founding Clarus Systems, Evan was an early employee at ThinkLink, where he was Director of Business Development. ThinkLink was one of the earliest providers of consumer VoIP and messaging services. Prior to ThinkLink, Evan helped build Working Assets, one of the pioneers in socially responsible business, which grew quickly during Evan's tenure into one of the largest telecom service resellers in the United States. Evan attended the European business school IESE and Williams College in Williamstown, Massachusetts.