
Strategies for successful virtualisation


Johannesburg, 15 Apr 2009

Virtualisation gives organisations the power to move resources in an instant, but can pose various problems and lead to increased costs if implemented incorrectly.

So says Justin Steinman, VP of solution and product marketing at Novell, who believes following certain steps will allow organisations to successfully deploy virtualisation in a mixed data centre environment.

Steinman suggests that organisations move slowly and take small steps first, monitoring performance and management issues before moving to more complex virtualisation opportunities.

“Evaluate current server workloads and determine which applications should be virtualised. The best candidates for virtualisation are applications running on Web, infrastructure or application servers. Also gather detailed data for all hardware and software assets across the data centre and analyse workload utilisation to develop optimal server consolidation plans,” he says.

Steinman adds that businesses should analyse complete workload life cycles and record utilisation data over a significant time period, to ensure all ebbs and flows in resource utilisation are captured accurately.

“You can then develop a workload profile that provides a clear picture of server utilisation trends and anomalies in CPU, disk, memory, and network utilisation rates.

“Also use 'what-if' modelling to find the best combination of hardware and virtual hosts to maximise utilisation, avoid resource contention, and forecast future workloads,” he says.
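The consolidation planning Steinman describes can be sketched as a simple "what-if" model. This is a hedged illustration, not any vendor's actual tool: the workload names, peak utilisation figures, and the first-fit-decreasing packing heuristic are all assumptions chosen to show the idea of packing workloads onto hosts while leaving headroom to avoid resource contention.

```python
def plan_consolidation(workload_peaks, host_capacity, headroom=0.2):
    """First-fit-decreasing packing of workloads onto virtual hosts.

    Leaves `headroom` (a fraction of capacity) free on each host to
    absorb utilisation spikes and avoid resource contention.
    """
    usable = host_capacity * (1 - headroom)
    hosts = []  # each host holds a list of (workload, peak) assignments
    loads = []  # running total load per host
    # Place the biggest workloads first; fill the first host with room left.
    for name, peak in sorted(workload_peaks.items(), key=lambda kv: -kv[1]):
        for i, load in enumerate(loads):
            if load + peak <= usable:
                hosts[i].append((name, peak))
                loads[i] += peak
                break
        else:
            hosts.append([(name, peak)])
            loads.append(peak)
    return hosts

# Hypothetical peak CPU utilisation per workload, in arbitrary CPU units
peaks = {"web1": 30, "web2": 25, "app1": 45, "infra1": 15, "app2": 40}
plan = plan_consolidation(peaks, host_capacity=100)
print(len(plan), "hosts needed")  # prints: 2 hosts needed
```

Varying `host_capacity` or `headroom` and re-running the packing is the essence of "what-if" modelling: it forecasts how many hosts a given hardware choice would require before anything is migrated.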

Managing act

Steinman suggests reviewing management tools, and says organisations that want to create and maintain a dynamic data centre, where they can easily move between physical and virtual machines, should use a single management tool.

“Use management tools to gain a clear view of the total virtual machines on each server, what workloads are being added, and at what rate. This vastly improves resource planning. Moreover, it gives companies the ability to track workloads and allocate IT charges to business units based on actual disk, CPU, and network usage,” he adds.
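The chargeback idea in that quote reduces to multiplying metered usage by per-resource rates. The sketch below is illustrative only: the rates, business units, and usage figures are hypothetical, standing in for what a management tool's metering data would supply.

```python
# Hypothetical rates in currency units per resource unit
RATES = {"cpu_hours": 0.05, "disk_gb": 0.10, "network_gb": 0.02}

def chargeback(usage_by_unit, rates=RATES):
    """Return the IT charge per business unit given metered usage."""
    return {
        unit: round(sum(rates[k] * v for k, v in usage.items()), 2)
        for unit, usage in usage_by_unit.items()
    }

# Hypothetical metered usage per business unit
usage = {
    "finance": {"cpu_hours": 1200, "disk_gb": 500, "network_gb": 300},
    "sales":   {"cpu_hours": 800,  "disk_gb": 200, "network_gb": 900},
}
print(chargeback(usage))  # {'finance': 116.0, 'sales': 78.0}
```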

According to Steinman, it's important to test virtualised servers before going live to ensure application performance doesn't slow down due to combining too many resource-intensive applications on the same physical server.

“This is where workload migration tools come in handy to decouple data, applications, and operating systems from the underlying hardware and stream them to any physical or virtual platform.”

Steinman also advises taking advantage of dynamic provisioning, like adjusting processing power on demand. “You could set a threshold so that when usage of a critical application hits 80%, a new server is automatically brought online. This type of intelligent resource management gives organisations the ability to design data centres that respond to their business needs.”
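The 80% rule Steinman gives can be sketched as a trivial control loop. Everything here is an assumption for illustration: `provision_server` stands in for whatever call the actual management platform exposes, and the server names are invented.

```python
THRESHOLD = 0.80  # bring a new server online at 80% utilisation

def check_and_scale(app, utilisation, active_servers, provision_server):
    """Add a server for `app` when its utilisation hits the threshold."""
    if utilisation >= THRESHOLD:
        # provision_server is a placeholder for the platform's real API
        active_servers.append(provision_server(app))
    return active_servers

servers = ["srv-1"]
servers = check_and_scale("crm", 0.85, servers,
                          lambda app: f"srv-{app}-auto")
print(servers)  # ['srv-1', 'srv-crm-auto']
```

In practice such a check would run periodically against live monitoring data rather than a single reading, but the threshold comparison is the core of the intelligent resource management described.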

Leveraging disaster recovery scenarios is Steinman's last recommendation. He says employing virtualisation solutions for business continuity gives organisations a way to get more out of their virtualisation investment.

“All server workloads, whether they're on physical or virtual machines, can be duplicated to virtual machine backups, and if an outage occurs, the workloads simply fail over automatically to the duplicate virtual machines,” he concludes.
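The failover behaviour in that closing quote can be sketched as a routing decision per workload. This is a hedged toy model: real solutions detect outages with heartbeats or health checks, which are simulated here by a simple status flag, and the host names are invented.

```python
def failover(workloads):
    """Route each workload to its primary host if healthy, else its VM backup."""
    routing = {}
    for name, (primary, backup, primary_healthy) in workloads.items():
        routing[name] = primary if primary_healthy else backup
    return routing

# (primary host, duplicate VM backup, is the primary healthy?)
workloads = {
    "mail": ("phys-01", "vm-backup-01", True),
    "erp":  ("phys-02", "vm-backup-02", False),  # simulated outage
}
print(failover(workloads))
# {'mail': 'phys-01', 'erp': 'vm-backup-02'}
```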
