Big Data Replaces The Cutting Room Floor

Sometime in 1888, 125 years ago, an Englishman called John Carbutt invented photographic celluloid. Celluloid itself had been invented some years earlier, in 1862, but its use in photography only came after Carbutt had standardised a process for coating celluloid sheets with a photosensitive emulsion. For almost 100 years, celluloid in one form or another was used for filming. Celluloid has its problems: it deteriorates, it is highly flammable and, as the BBC found out a few years ago, it is easily lost!

Today, celluloid is almost never used for filming. Filming is almost entirely digital, with film (should we still call it film?) recorded directly onto solid-state storage and then uploaded to servers where it can be manipulated, edited and crafted into the final product that we see. There is no longer a ‘cutting room floor’; at most, a few bits of film might end up in the ‘recycle or trash bin’, but even this is unlikely, as most producers want to retain everything that has been shot, just in case.

The sheer volume of data now being created by the TV and film industry is breathtaking. Take, for example, TV. Assuming that footage is filmed at 50 megabits per second (broadcast quality, according to our friends at Aframe, the leading cloud-based film collaboration provider), filming one minute of video generates 396 megabytes of data, and one hour of footage generates 23.2 gigabytes.
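
Where does the per-hour figure come from? A minimal back-of-the-envelope sketch in Python, using the 396-megabytes-per-minute working figure above (real-world rates vary with codec, audio and wrapper overhead):

    # Rough storage arithmetic for broadcast-quality footage.
    MB_PER_MINUTE = 396                       # working figure quoted above
    gb_per_hour = MB_PER_MINUTE * 60 / 1024   # 23,760 MB, or roughly 23.2 GB
    print(f"One hour of footage is roughly {gb_per_hour:.1f} GB")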

There are 17 news channels in the UK alone that broadcast 24 hours a day. All of this TV must be recorded and stored, which means TV news alone generates 9,465 gigabytes of data per day. As news is a 365-day-a-year activity, this equates to 3,454,944 gigabytes of data per year.
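
A quick sketch of that sum, using the same 23.2-gigabytes-per-hour figure:

    GB_PER_HOUR = 23.2     # per hour of broadcast-quality footage (see above)
    CHANNELS = 17          # UK news channels broadcasting around the clock
    HOURS_PER_DAY = 24
    DAYS_PER_YEAR = 365

    gb_per_day = CHANNELS * HOURS_PER_DAY * GB_PER_HOUR   # 408 hours -> 9,465.6 GB
    gb_per_year = gb_per_day * DAYS_PER_YEAR              # 3,454,944 GB
    print(f"TV news: {gb_per_day:,.1f} GB per day, {gb_per_year:,.0f} GB per year")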

A reasonable estimate of the footage shot for every hour of finished TV is a 70:1 ratio, according to David Peto, CEO of Aframe. A staggering 10,000 hours of TV transits through BT Tower each week. Using these numbers, we can calculate that around 16,240,000 gigabytes of data is generated and stored each week. That’s roughly 844,480,000 gigabytes per year (or about 824,688 terabytes).

Assuming that this TV is to be stored for the next 50 years (as much of it will be), there is likely to be around 42,224,000,000 gigabytes (41,234,375 terabytes) of data that must be stored somewhere! And all this assumes that just one camera is used; in most cases more than one camera will be used, so the number is actually much, much bigger.
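
For anyone who wants to check the sums, here is the same chain of assumptions (10,000 broadcast hours a week, a 70:1 shooting ratio, 23.2 gigabytes per hour and 50 years of retention) in one short sketch:

    GB_PER_HOUR = 23.2
    BROADCAST_HOURS_PER_WEEK = 10_000   # hours of TV through BT Tower each week
    SHOOTING_RATIO = 70                 # hours of footage shot per finished hour
    WEEKS_PER_YEAR = 52
    YEARS_RETAINED = 50

    gb_per_week = BROADCAST_HOURS_PER_WEEK * SHOOTING_RATIO * GB_PER_HOUR  # 16,240,000 GB
    gb_per_year = gb_per_week * WEEKS_PER_YEAR                             # 844,480,000 GB
    gb_retained = gb_per_year * YEARS_RETAINED                             # 42,224,000,000 GB
    print(f"{gb_per_week:,.0f} GB per week, {gb_per_year:,.0f} GB per year, "
          f"{gb_retained:,.0f} GB ({gb_retained / 1024:,.0f} TB) over {YEARS_RETAINED} years")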

Movies, of course, are likely to create even more data. An average film is around two and a half hours long, and most scenes are shot with multiple cameras. Taking what appears to be the minimum of three cameras per scene and applying the same 70:1 ratio, a single film would generate around 12,180 gigabytes of data. According to the MPAA, a total of 610 movies were released in 2011 (the latest figures available), which equates to the movie industry generating around 7,429,800 gigabytes of data each year.
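
The movie arithmetic follows the same pattern; a sketch under the assumptions above (two and a half hours of finished film, three cameras, a 70:1 shooting ratio):

    GB_PER_HOUR = 23.2
    FILM_LENGTH_HOURS = 2.5
    CAMERAS = 3
    SHOOTING_RATIO = 70
    FILMS_RELEASED = 610    # MPAA figure for 2011

    gb_per_film = FILM_LENGTH_HOURS * CAMERAS * SHOOTING_RATIO * GB_PER_HOUR  # 12,180 GB
    gb_per_year = gb_per_film * FILMS_RELEASED                                # 7,429,800 GB
    print(f"{gb_per_film:,.0f} GB per film, {gb_per_year:,.0f} GB per year")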

The conclusion, therefore, is that as celluloid is replaced by digital recording, TV and film will generate huge quantities of data that will need to be stored. The days of storing (or losing) hundreds of rolls of film are gone, but they have been replaced by data. This data must be stored somewhere, and it must be available whenever the producers, editors and directors want it. The answer, of course, is that it must be stored in data centres.

But not every data centre will be geared up for this kind of storage, so the media companies must be selective. The data is growing so quickly that room for growth is an absolute imperative. Also critical is the ability to deploy more storage quickly; no media company will want, or be able to afford, to deploy storage that isn’t used immediately. The data centre must be ready to use straight away, with no waiting for fit-out to be completed.

So how do we put these numbers in perspective? If an average PC has a 500-gigabyte hard disk, then around 1,703,820 PCs would be needed every year just to store the data that is generated. Now that is BIG data!
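
As a final sketch, dividing the annual TV production and movie figures above by a 500-gigabyte disk:

    PC_DISK_GB = 500
    tv_production_gb = 844_480_000   # annual TV production footage (see above)
    movie_gb = 7_429_800             # annual movie footage (see above)

    pcs_needed = (tv_production_gb + movie_gb) / PC_DISK_GB
    print(f"About {pcs_needed:,.0f} PCs of {PC_DISK_GB} GB each, every year")  # ~1,703,820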

Alex Rabbetts

Alex Rabbetts is founder and CEO of MigSolv. The company is a long-established data centre consultancy and, more recently, colocation operator. Alex has over 25 years of experience in the data centre industry, having designed data centres for many organisations across Europe and beyond. He has also been responsible for managing the build of over 1,500,000 square feet of data centre space in a number of European countries. Alex has a wealth of experience in the relocation of data centres, hence the original name of the company, Migration Solutions.