What’s Next For Virtualisation?


Based on the media headlines, the activity from vendors and the advice from analysts, you’d be forgiven for thinking that virtualisation is a done deal. However, research shows that this is not the case.

In the first round of research, 25% of respondents had less than 10% of their server hardware virtualised, and only 30% had more than half of it virtualised. Ten months on, more virtualisation is clearly taking place: the proportion of respondents with less than 10% of servers virtualised had dropped to 13%, yet those with more than 50% virtualised still stood at only 33%. It would seem that virtualisation is not moving at the speed many would like to believe.

Why is this – and what does it mean to the future for virtualisation?

For many, virtualisation is something that sounds too good to be true, promising consolidation down to around 20% of existing hardware and the consequent lowering of energy bills, a reduced need for system administrators, savings on licensing and less demand for new data centre space.
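As a rough illustration of that promise, the sketch below works through the arithmetic using entirely hypothetical figures (a 5:1 consolidation ratio, i.e. roughly 20% of the original hardware, and a nominal per-server power draw); the ratio any given organisation achieves will depend on its own workloads.

```python
# Illustrative only: hypothetical server counts, consolidation ratio and power figures.
servers_before = 100          # physical servers prior to virtualisation
consolidation_ratio = 5       # ~5 VMs per host, i.e. hardware shrinks to ~20%
watts_per_server = 400        # nominal average draw per physical server
hours_per_year = 24 * 365

servers_after = -(-servers_before // consolidation_ratio)  # ceiling division
kwh_saved = (servers_before - servers_after) * watts_per_server * hours_per_year / 1000

print(f"Physical servers: {servers_before} -> {servers_after}")
print(f"Energy saved per year: {kwh_saved:,.0f} kWh")
```

Even with these made-up numbers, the shape of the argument is clear: the running-cost savings are real, but they only arrive after the upfront audit and migration work discussed below.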

However, whereas some low-end, non-mission critical services can be easily moved over to a virtualised environment, many organisations have baulked at the thought of moving enterprise applications such as SAP or Oracle over onto virtual hardware, either due to the perceived complexity and possible impact on the organisation – or purely down to a feeling of needing to wait until it has been proven elsewhere.

Others are waiting for the financial situation to sort itself out a little. Although virtualisation promises ongoing savings, there are fairly large upfront costs involved in auditing what is already there, planning for broad virtualisation adoption and then carrying out the project. Many simply prefer to wait, as existing systems are working well enough as far as the organisation is concerned.

Others are waiting for the technical side of things to stabilise. After several successive years each being declared "the year of virtualisation", attention now seems to be shifting towards the "year of the cloud". As virtualisation is a key underpinning for cloud, many have chosen to wait until they have seen successful cloud implementations elsewhere and the concept becomes more real to them before embarking on full virtualisation. There is also the need to fully understand how the different flavours of cloud can be used, and so just where virtualisation fits in the private/public cloud mix.

However, there are drivers beyond data centre efficiency that mean virtualisation has to be looked at as part and parcel of the future for any organisation. One is the explosion in different types and sizes of user end points, combined with the consumerisation through which these devices are finding their way into the organisation. This has created a need for the organisation to exert some form of control over device usage.

It has been shown many times over that dictating which devices users can and cannot use doesn't work. Many organisations are therefore looking to run virtual desktops hosted in a controlled data centre: the end-user's device becomes a simple access point, so uncontrolled and unmanaged devices cannot wreak havoc, and the loss or theft of a device is no longer a major issue.

Another driver is that more software is being provided as "virtual appliances" that are easier to set up and run than traditional install kits. Such appliances have become commonplace for IT security, and are increasingly seen in other areas such as email, databases and so on. Virtual appliances have the benefit of being easily deployable while also being able to cope with peaks and troughs in workload: in a well-implemented virtualised environment, resources can be "borrowed" as required and new appliances invoked as necessary.
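A minimal sketch of that "borrow resources, invoke another appliance" behaviour is shown below. The appliance image name, the load thresholds and the provision()/retire() calls are all hypothetical stand-ins for whatever the underlying virtualisation platform actually exposes; the point is simply the control loop.

```python
# Hypothetical autoscaling loop for a virtual appliance; provision()/retire()
# stand in for the real platform's API (e.g. a hypervisor or cloud SDK call).
import random
import time

APPLIANCE_IMAGE = "mail-filter-appliance.ova"   # hypothetical appliance image
SCALE_UP_LOAD = 0.75                            # invoke another instance above this
SCALE_DOWN_LOAD = 0.25                          # retire one below this
MIN_INSTANCES, MAX_INSTANCES = 1, 8

def provision(image: str) -> str:
    """Placeholder: ask the virtualisation platform for a new appliance instance."""
    return f"instance-of-{image}"

def retire(instance: str) -> None:
    """Placeholder: hand the borrowed resources back by shutting the instance down."""
    pass

def average_load(instances: list[str]) -> float:
    """Placeholder: average utilisation reported by the running appliances."""
    return random.random()

def autoscale(instances: list[str]) -> list[str]:
    load = average_load(instances)
    if load > SCALE_UP_LOAD and len(instances) < MAX_INSTANCES:
        instances.append(provision(APPLIANCE_IMAGE))   # peak: borrow more resources
    elif load < SCALE_DOWN_LOAD and len(instances) > MIN_INSTANCES:
        retire(instances.pop())                        # trough: give resources back
    return instances

if __name__ == "__main__":
    fleet = [provision(APPLIANCE_IMAGE)]
    for _ in range(5):            # a few illustrative control-loop iterations
        fleet = autoscale(fleet)
        print(f"Running appliances: {len(fleet)}")
        time.sleep(1)
```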

However, the biggest driver for server virtualisation will be cloud computing, despite all the current misunderstandings and misperceptions. Respondents to the same Oracle/Quocirca research were still unsure about cloud, but there has been a big change in how it is viewed.

In the first cycle of the research, 13% stated that cloud had no part in the future of their organisation, while this had fallen to 6% ten months on. Those seeing cloud as a complete game changer had risen from 12% to 21%. This growing interest in cloud will undoubtedly drive the implementation of virtualisation to support cloud, even if 2012 is not quite the “year of the cloud” many hope for.

Cloud will also drive other forms of virtualisation, as cloud computing is not just dependent on servers. Storage virtualisation is coming of age, with different approaches becoming apparent from the likes of EMC, HDS, NetApp and Dell (now that the latter has completed its acquisitions of various storage companies). Relatively new vendors such as Coraid, Fusion-io and Egenera are making a play to become cloud-specialist storage vendors, or at least to provide differentiated cloud storage plays that plug holes in the existing storage vendors' portfolios.

The concept of the storage area network is beginning to fade. Different types of data need to be stored on different types of device in different ways, yet with full access across all of it (rather than each storage device being a data silo) so that "big data" can be embraced. This is driving an increasing need for virtualisation at the storage layer.

Similarly, network virtualisation, using highly virtualised top-of-rack (ToR) switches such as Cisco's Nexus, IBM's Blade Systems and Dell's Force10, enables networks to be optimised for different traffic types while flattening network fabrics, reducing the latency introduced by multi-level hierarchical network designs. Alongside storage, the network is struggling with high growth driven not only by new data types, such as voice and video, but also by the need to prioritise streams in real time to meet the needs of the business.

What this really means is that the future for virtualisation is good – but it remains complex at a component level and will require better messaging and education from vendors and influencers to help organisations move towards the promised land in a fully managed manner where the business and cost benefits are recognisable.

This will require all the areas of a virtual environment to be considered in the round – just talking about one type of virtualisation in the absence of others will not help end user organisations get the message. To this end, we’re beginning to see a growth in pre-configured systems from vendors. IBM, HP, Oracle, SGI, Dell and others are already providing fully-engineered “modules” for building highly virtualised environments.

These modules can be anything from a single equipment rack pre-configured with servers, storage and network equipment in an optimised manner through full data centre rows to complete systems held within standard shipping containers. By taking such an approach, the complexities and possible pitfalls of a build-it-yourself system are avoided.

For the majority, the future of virtualisation will be a hybrid approach – a mix of some older applications remaining on dedicated physical servers due to issues with moving them to virtualised environments; some applications on dedicated but virtualised platforms where higher levels of availability can be guaranteed along with better overall system utilisation; certain applications, services and functions being provided from highly virtualised private clouds and other services and functions being served from external public cloud systems.

However, it does mean that organisations should ensure they understand at least the basics of the different approaches to virtualisation, the different types of workload (application, service and function) they could be putting into virtual environments, and what cloud means to their organisation.


Clive Longbottom is founder of Quocirca and a highly respected, globally recognised industry analyst covering a range of business and technology areas, with business process facilitation as his primary coverage area. He has been an ICT industry analyst for over 15 years and has worked with a range of large and small analyst companies, including META Group (now Gartner) as VP Europe. He holds a B.Sc. (Hons) in Chemical Engineering from the University of Aston in the UK.