
The V-word

Virtualisation is it. Hip, hot and happening. And causing changes wherever it goes.

By Samantha Perry, co-founder of WomeninTechZA
Johannesburg, 06 Jul 2009

“Virtualisation is the highest-impact issue changing infrastructure and operations through 2012,” a Gartner report* issued early last year blithely states. “It will change how you manage, how and what you buy, how you deploy, how you plan and how you charge. It will also shake up licensing, pricing and component management.”

If it could wake you up gently and make you a cuppa, we may finally be onto something here... Seriously though, according to Gartner: “Infrastructure is on an inevitable shift from components that are physically integrated by vendors (for example, monolithic servers) or manually integrated by users to logically composed ‘fabrics’ of computing, I/O and storage components. As virtualisation matures, the ‘next big thing’ will be the composition and management of the virtualised resources.

“Storage has already been virtualised, but primarily within the scope of individual vendor architectures. Networking is also virtualised. The leading edge of this change is server virtualisation.

“Roughly 90% of the server market is composed of x86 architecture servers. Based on a traditional model of one application per server, roughly 80% to 90% of the x86 computing capacity is unused at any one time. This unused capacity needs to be managed. It takes up data centre space and requires power and cooling. Virtualisation promises to unlock much of this under-utilised capacity.
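To put rough numbers on that claim (the figures below are illustrative, not Gartner's), a quick back-of-the-envelope sketch in Python shows what the consolidation arithmetic looks like:

    # Back-of-the-envelope consolidation arithmetic (figures are illustrative).
    servers = 100       # physical x86 boxes, one application each
    utilisation = 0.15  # i.e. about 85% of capacity idle, within the 80%-90% range
    target = 0.60       # a conservative post-consolidation utilisation target

    capacity_in_use = servers * utilisation  # 15 server-equivalents of real work
    hosts_needed = capacity_in_use / target  # 25 virtualised hosts
    print(f"Hosts needed after consolidation: {hosts_needed:.0f}")
    print(f"Servers freed of power, cooling and floor space: {servers - hosts_needed:.0f}")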

“IT organisations are approaching server virtualisation as a cost-saving measure, and it is saving money. However, organisations that have a mature server virtualisation deployment in place are leveraging virtualisation for much more: faster deployments, reduced downtime, disaster recovery, variable usage accounting and usage chargeback, holistic capacity planning and more.”
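Chargeback, for one, becomes a straightforward metering exercise once workloads share a pool. A minimal sketch, using assumed costs and VM-hour figures rather than anything from the report:

    # Hypothetical chargeback sketch: bill business units for the share of the
    # pool their VMs actually consumed, rather than for whole servers.
    monthly_pool_cost = 120_000.0  # rands: hardware, power, cooling, admin (assumed)
    vm_hours = {"finance": 2_000, "retail": 5_500, "web": 2_500}  # metered (assumed)

    total = sum(vm_hours.values())
    for unit, hours in vm_hours.items():
        charge = monthly_pool_cost * hours / total
        print(f"{unit:>8}: {hours:>5} VM-hours -> R{charge:,.2f}")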

Gartner notes that virtualisation decouples hardware and software, and this is what makes it significant. “A standard PC installation consists of a stack of multiple layers, the most important being hardware, the operating system and applications. Because of how these layers interact, the configuration of each is tightly coupled with the configuration of the layer below. This is the cause of much of the management complexity of today's PCs. Because hardware changes regularly, these changes have a geometric impact on everything above. Virtualisation breaks these dependencies, so the installation of each layer is independent of the configuration of the layer below. On the PC, it occurs at two levels: between hardware and the OS (machine virtualisation), and between the OS and applications (application virtualisation).

“The impact of virtualisation on the PC is the decoupling of the main functional layers. Application virtualisation is gaining considerable interest, because key market changes are taking place. This type of virtualisation is highly valuable for dealing with current PC management challenges, but it cannot help in the personal vs corporate computing argument. Although more immediately accessible to you, its long-term impact will be far less significant than that of machine virtualisation. This is the technology that will really make personal computing more manageable, flexible and secure by enabling users to define multiple isolated footprints on the same device.”
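That decoupling is easy to see on any virtualised host, where each guest is defined against virtual hardware rather than the physical box. A minimal read-only sketch, assuming a Linux machine running KVM with the libvirt Python bindings installed (a setup the report doesn't prescribe):

    # List the guests on a local hypervisor: their vCPUs and memory are
    # virtual hardware, independent of the physical machine underneath.
    import libvirt  # requires the libvirt Python bindings and a local hypervisor

    conn = libvirt.openReadOnly("qemu:///system")
    for dom in conn.listAllDomains(0):
        state, max_mem_kib, _, vcpus, _ = dom.info()
        print(f"{dom.name()}: {vcpus} vCPU(s), {max_mem_kib // 1024} MiB of virtual hardware")
    conn.close()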


Virtualisation enables more than just decoupling, however. “Several changes will make virtualisation critical to most enterprises during the next few years,” the report notes. “Processor capability has outpaced the performance requirements of many applications. Performance is relatively inexpensive and, therefore, the overhead of a virtualisation layer is not an issue. Although processing power is inexpensive (and getting less expensive), space, power, installation, integration and administration are not, and they cost the same whether a resource is 10% or 90% used. Additionally, Web access has changed workload levels from relatively predictable to spiky, forcing enterprises to overprovision. Virtualisation is not just about consolidation. By enabling alternative delivery models, new modes of providing functionality at each layer will evolve. By creating layers of abstraction, each layer can be managed relatively independently, and even owned by someone else (from streamed applications to software appliances to employee-owned PCs).”
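The overprovisioning point is easiest to see with numbers. A small sketch, using made-up hourly demand figures: siloed servers must each be sized for their own peak, while a shared pool need only cover the combined peak:

    # Why spiky workloads force overprovisioning (numbers are illustrative):
    # peaks for different apps rarely coincide, so a pool needs far less
    # capacity than one dedicated server per application.
    demand = {                       # hourly demand in server-equivalents (assumed)
        "web":     [1, 1, 2, 8, 3, 1],
        "billing": [2, 2, 2, 1, 6, 2],
        "reports": [1, 10, 1, 1, 1, 1],
    }
    siloed = sum(max(series) for series in demand.values())    # 24: each app sized for its peak
    pooled = max(sum(hour) for hour in zip(*demand.values()))  # 13: combined hourly peak
    print(f"Dedicated capacity (each app sized for its peak): {siloed}")
    print(f"Pooled, virtualised capacity (combined peak):     {pooled}")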

It's also making waves in the software sphere. As the report bluntly states: “Virtualisation technology breaks most established software pricing and licensing models. The concept of fractional use of large resources, the ability to quickly change the amount of capacity available to software, the ability to move software from one resource to another easily, and the concept of an offline snapshot become more common. Finally, there is the idea that software could be packaged and delivered in a virtual machine format, ready to run, perhaps for a short period of time. None of these new concepts fits the common paradigm of pricing based on full use of a fixed asset.

“Virtualised licensing presents a major stumbling block to widespread adoption of virtualisation,” says Gartner. “The industry has been slow to address the problem. As vendors change their software pricing and associated licence provisions to accommodate virtual use, negotiators must plan to spend an increased amount of time per contract to understand the effect of such changes on their planned software use. Clients that do not diligently monitor the way each of their vendors is responding to virtual-use issues are likely to experience significantly increased costs and the unintended impairment of their current licence rights.”
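To see why the old model creaks, consider a product licensed per physical server but deployed as a short-lived VM on a fraction of each box. A sketch with assumed prices:

    # Why fractional use breaks per-asset pricing (all prices assumed):
    # a per-server licence charges for the whole box all year; the virtual
    # reality may be a quarter of the box for a quarter of the year.
    per_server_licence = 40_000.0  # rands per physical server per year (assumed)
    servers = 4
    fraction_of_capacity = 0.25    # the VM runs on a quarter of each host
    months_running = 3             # kept as an offline snapshot for the other nine

    full_asset = per_server_licence * servers
    metered = full_asset * fraction_of_capacity * months_running / 12
    print(f"Priced on full use of fixed assets: R{full_asset:,.0f}")
    print(f"Priced on metered fractional use:   R{metered:,.0f}")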

* Report courtesy of Gartner; information sourced from “Virtualisation Changes Virtually Everything”, by Philip Dawson and Thomas J Bittman, 28 March 2008.

* Article first published on brainstorm.itweb.co.za
