
The desktop virtualisation promise

The next generation of virtual desktop infrastructure is here, and aims to address the challenges that slowed adoption of the first.

By Marcello Pompa, Argility new business development and key account manager.
Johannesburg, 15 Jul 2016

The purpose of this Industry Insight is not to define the virtual desktop infrastructure (VDI), but rather, to remind users of the promise of VDI, the challenges, and how these challenges have been overcome with VDI 2.0 - the next generation.

The promise of virtualising the desktop has never changed: essentially, to reduce the cost of running solutions and to ensure the company's data - obviously a major asset - is protected.

In South Africa, the implementation of the POPI (Protection of Personal Information) Act adds further pressure, as it requires that customer data be protected and easily deleted if a consumer requests it. VDI addresses this challenge by ensuring data resides in one location and not on remote devices such as laptops, tablets and smartphones.

Sweating the assets

VDI extends the life of end-user devices such as the desktop, laptop, tablet, smartphone and BYOD devices, as applications that would normally execute on these devices now execute within the VDI central server or cloud environment - yes, over there somewhere!

Put simply, all users see on their device is the displayed output of the application and, where necessary, they may provide some input via the device's keyboard. Devices therefore no longer have to be 'heavy-duty', with huge amounts of processing speed and local disk storage; they can be 'thin client' devices such as a Chromebook - a lightweight and cheap device.

VDI allows for a reduction in licensing costs, as it is no longer necessary to have a licence on each device to enable a user to use an application. These are traditionally known as 'named user' licences. A company implementing VDI can assign a reduced number of 'floating-user' licences on its VDI infrastructure, and as users request to run an application, a floating licence is allocated for that session. Once the user finishes with the application, the floating licence is released back into the 'pool'. This makes sense, as not all users are running the same program at the same time.
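The pool mechanics described above can be sketched in a few lines of Python. This is a hypothetical illustration of the floating-licence idea only - the class, pool size and method names are assumptions, not any real licensing API:

```python
import threading

class FloatingLicencePool:
    """Hypothetical sketch of a floating-licence pool shared across VDI sessions."""

    def __init__(self, total_licences):
        self.available = total_licences
        self.lock = threading.Lock()

    def acquire(self):
        """Allocate a licence when a user starts an application; False if pool is empty."""
        with self.lock:
            if self.available > 0:
                self.available -= 1
                return True
            return False

    def release(self):
        """Return the licence to the pool when the user closes the application."""
        with self.lock:
            self.available += 1

# 100 named users sharing a pool of only 40 floating licences
pool = FloatingLicencePool(40)
granted = sum(pool.acquire() for _ in range(100))
print(granted)  # only 40 concurrent sessions hold a licence
```

The company pays for 40 licences rather than 100, on the assumption that not all users run the same program at the same time.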

As all applications reside in one environment, the process and associated cost of implementing new software upgrades is hugely reduced and simplified, as there is now no need to update software on every device - just do it in one place.

Challenges

This is where the real issues lie with regard to the slower-than-expected uptake of VDI over the past years. The most notable challenge behind this reluctance to implement VDI has been poor user experience, but this needs to be qualified.

Applications can be summarised into three classes:
* Client-server applications (15+ years old) - VDI worked well with these applications, although today most VDI solutions no longer support legacy applications that still run on Windows 2003.
* Web applications (built in the last 10 years) - VDI runs these more slowly, as they need an additional layer to make them functional.
* Multimedia real-time applications (built in the last few years) - VDI was challenged here, as solutions required the video player to be available locally on the device, going against the VDI principle.

Q&A

What happens when everything is offline?

This is an issue worth noting, but it is becoming less of one in today's connected world of alternative connectivity options such as 3G and 4G.

High cost of VDI: Expensive storage required?


Desktop applications by nature require high-speed disk access, and virtualising this can drive up costs, increasing the total cost of ownership (TCO). Data centre innovations are now available to address this challenge, but it remains an expensive proposition.

What if the data centre goes down - boot storm?

A 'boot storm' occurs when hundreds of virtual desktops reboot simultaneously within the VDI data centre, which takes far longer than each user rebooting a local laptop. The probability of this actually happening is low, but it is a consideration that requires additional architectural changes - and, of course, additional cost - to improve the boot process.
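One common shape such an architectural change takes is staggering reboots in batches, so the storage tier never absorbs all the boot I/O at once. A minimal sketch of that batching idea, with an assumed batch size of 50, not any vendor's implementation:

```python
def staggered_batches(desktop_ids, batch_size):
    """Split desktops into batches to be booted one after another,
    so the storage tier is never hit by all boot I/O simultaneously."""
    return [desktop_ids[i:i + batch_size]
            for i in range(0, len(desktop_ids), batch_size)]

# 300 virtual desktops rebooted in waves of 50 instead of all at once
desktops = [f"vd-{n:03d}" for n in range(1, 301)]
batches = staggered_batches(desktops, 50)
print(len(batches))  # 6 sequential boot waves
```

In practice a scheduler would wait for each wave to finish booting before starting the next; the trade-off is a longer total recovery window in exchange for a survivable I/O load.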

The cost of the thin client?

Unless a company can leverage existing end-point devices such as BYOD hardware and laptops, there is a cost associated with providing these devices to users, which can increase VDI expenses by a staggering 20% to 30%.

VDI 2.0 - the next generation

Yesterday was the desktop, today it's about the application. Like any technology, VDI is maturing and now into its next generation (VDI 2.0).

New products are proving to be game-changers: they encapsulate the complexity of the prior VDI 1.0 architecture, greatly reducing implementation complexity and removing the need for many of the underlying technologies VDI 1.0 required. Based on actual implementations, the reduction in cost per user exceeds 50%. One of the major advantages is the co-existence of applications from different operating systems, and even of different versions of the same application, where this is a requirement.

HCI appliances

Hyper-converged infrastructure (HCI) is a method of converging all the components (disk, operating system, and applications) into a single appliance used for implementing the VDI environment. Today, VDI vendors are creating these appliances with the following major benefits:
* Reduced TCO; and
* Simplified implementation of VDI.

VDI 2.0 - the cloud and on-demand VDI environments:
When a user logs on to their VDI environment, a virtual machine is created on demand. This has huge benefits from a TCO perspective, and it is fast.
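The on-demand idea can be sketched as a broker that only provisions a desktop at logon and destroys it at logoff, so no idle capacity is paid for. The class and names below are illustrative assumptions, not any vendor's API:

```python
class OnDemandBroker:
    """Hypothetical sketch: create a virtual desktop only when a user
    logs on, rather than keeping a pre-built machine per user."""

    def __init__(self):
        self.active = {}  # user -> running VM

    def logon(self, user):
        # Clone a fresh VM from a golden image, on demand
        vm = f"vm-for-{user}"
        self.active[user] = vm
        return vm

    def logoff(self, user):
        # Destroy the VM; resources return to the shared pool
        self.active.pop(user, None)

broker = OnDemandBroker()
broker.logon("alice")
broker.logon("bob")
broker.logoff("alice")
print(len(broker.active))  # only logged-on users consume resources
```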

Decentralisation of VDI

In environments where there are a number of branch offices, the way to avoid the challenges of VDI 1.0 is to bring VDI to the branch office. This distributes the VDI environment and moves it as close as possible to the end point, while retaining management of it from one central point.

There are a number of vendors pursuing this architecture to overcome some major application management and latency issues. It also allows users to work offline, which is one of the major benefits of this architecture.

Maturing process of evaluating TCO of VDI

The key change to the approach has been to better understand companies' requirements and then create an appropriate proof of concept. By using the next generation of VDI technologies, the appropriate architecture can be deployed. This differs from the initial VDI 1.0 approach, where it was assumed VDI suited all companies and, with the technologies available at the time, a 'one size fits all' approach was pervasive.

VDI 2.0 addresses the challenges identified in VDI 1.0, with reduced TCO and performance improvements as real and measurable business benefits. Today, vendors are providing innovative and less complex VDI solutions.

VDI 2.0 can be viewed as the coming of age of the virtualisation of the application infrastructure.
