
Power of one

By Lezette Engelbrecht, ITWeb online features editor
Johannesburg, 20 Apr 2010

In 2008, FirstRand Group, one of SA's largest banking institutions, faced a problem that had become too big to ignore: multiple operational inefficiencies in four different data centres, serving four separate divisions of the business, spread across three different buildings.

“The data centres had grown rapidly and were barely held together in a bubblegum and shoestring scenario. Added to this, the centres were now competing with valuable office space,” says Nick Smith, property infrastructure manager at FirstRand. “The old data centres were dinosaurs that sucked electricity, water, and generated a lot of heat and noise.”

Three of the server rooms had not been designed to function as data centres, resulting in significant efficiency, cooling and power consumption flaws. This meant operational costs were growing uncontrollably and the risk of systems failure was unacceptably high.

“It was a disaster waiting to happen,” notes Smith, who says mitigation of risk was a chief concern, due to Reserve Bank and South African Revenue Service demands around continuity of service. “If the data system doesn't work, we're hugely exposed.”

So when it became clear something had to be done, FirstRand took the opportunity to green its data centre operations as well. “There was a definite sentiment that the centres could be a lot greener in terms of cost and maintenance,” explains Smith. “Going forward, we wanted things to be more efficient.”

The group approached Dimension Data Advanced Infrastructure Solutions, which, in partnership with APC by Schneider Electric, planned and implemented a new, consolidated green data centre with room for business expansion.

According to Nicholas Shaw, GM of DiData Advanced Infrastructure, the company first gathered information on the various servers and their possible locations, the number of transformers feeding the building, and the possibility of tapping chilled water off the existing chiller units.

Step-by-step

The project, delivered in three stages, came with various challenges. First was the physical location - a basement parking lot of 500 square metres. “From a construction point of view, space was very limited,” explains Dewald Booysen, DiData Advanced Infrastructure CTO. “So we installed a high-density rack layout, which enabled us to save space and consolidate.”

According to Shaw, this helped trim the original count of more than 260 racks down to 113, with proper power, cooling, connectivity and manageability, in a Tier 4-type topology. “In the banking sector, you can't afford downtime, which is why this data centre is of the Tier 4-type,” he notes.

There's been a big change in focus, from capex to opex.

Dewald Booysen, DiData Advanced Infrastructure CTO

The company implemented scalable power and cooling systems, using close-coupled cooling technology, high-density computing, and security and access control. It also installed APC's InfraStruXure Central software, which provides centralised monitoring of the data centre's entire physical infrastructure.

Booysen says the different divisions of the group had different requirements, with a need to measure and monitor the various modules. “While the separate centres were consolidated into one, there was a need for independent control and security.”

Shaw points out that a big part of going green is controlling energy consumption, and measuring input power versus output power to determine power utilisation efficiency. “Traditionally, air conditioning uses a lot of power, but the APC technology reduced power consumption on the cooling side,” states Shaw.
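In rough terms, the metric Shaw describes works like this (a minimal Python sketch; the figures are illustrative assumptions, not readings from the FirstRand installation):

# Sketch of the efficiency metric Shaw describes: facility input
# power divided by the power delivered to the IT equipment.
# A ratio of 1.0 would mean every watt drawn goes to computing;
# cooling and distribution losses push it higher.

def power_utilisation_efficiency(facility_kw: float, it_kw: float) -> float:
    """Total facility input power over IT equipment output power."""
    return facility_kw / it_kw

# Hypothetical numbers only -- not FirstRand's actual readings.
print(power_utilisation_efficiency(facility_kw=1500.0, it_kw=1000.0))  # 1.5

The lower the ratio, the smaller the share of electricity spent on overheads such as air conditioning, which is why cutting power on the cooling side moves the number directly.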

Neill Schreiber, APC sales manager for southern Africa, says techniques like in-row cooling and hot aisle containment systems were used to provide efficient and innovative energy reduction solutions.

Numbers game

Measurement is another key element of running efficient systems, stresses Booysen, citing the old adage 'what you don't measure, you can't manage'. “It's all very well saying you have an energy-efficient data centre, but it's no use if you can't measure actual power utilisation,” he says.

According to Schreiber, the InfraStruXure Central software allows management teams to measure at any time how much energy is being used by the server racks and other components in the system. This intelligence can also be used when adding equipment in future, says Schreiber, as it will alert administrators to power and cooling constraints.

“So, if you want to add a server in a certain area, the system can tell you if there is adequate power and cooling available for this additional equipment. It thinks for you and advises about plans,” explains Schreiber.
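The placement check Schreiber describes might be sketched roughly as follows; this is not the InfraStruXure Central API, and the rack names, fields and figures are hypothetical:

# Illustrative placement check: does a rack have both the spare
# power and the spare cooling to absorb one more server's load?
from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    power_capacity_kw: float    # power available to the rack
    power_used_kw: float        # power already drawn
    cooling_capacity_kw: float  # heat the in-row coolers can remove
    cooling_used_kw: float      # heat load already present

def can_host(rack: Rack, server_kw: float) -> bool:
    power_ok = rack.power_used_kw + server_kw <= rack.power_capacity_kw
    # Nearly all server input power ends up as heat to be removed.
    cooling_ok = rack.cooling_used_kw + server_kw <= rack.cooling_capacity_kw
    return power_ok and cooling_ok

rack = Rack("B1-07", power_capacity_kw=12.0, power_used_kw=9.5,
            cooling_capacity_kw=12.0, cooling_used_kw=11.5)
print(can_host(rack, server_kw=0.8))  # False: power fits, cooling does not

The point of such a check, as Schreiber notes, is that power and cooling constraints are evaluated together before equipment is added, rather than discovered after a hot spot appears.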

Shaw says the project also focused on how the company could expand its data centre as needs grow in the future. “There's 30% capacity for future growth and this doesn't impact current power and cooling requirements.” Two multi-module redundant Galaxy 7000 UPS systems were installed to allow for uninterrupted future power upgrades to meet expected growth.

“There's lots of emphasis on scalability, and putting in just enough power and cooling for your needs today,” says Schreiber, adding that companies can then scale up when they need to, rather than making a huge infrastructure outlay upfront.
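As a hedged illustration of that pay-as-you-grow sizing (the module capacity and loads below are assumptions for the sake of the arithmetic, not Galaxy 7000 specifications):

# Sketch of modular "just enough for today" UPS sizing: provision
# modules for the current load plus a redundant spare (N+1), and
# add modules only as demand grows.
import math

MODULE_KW = 200.0  # assumed capacity of one UPS module

def modules_needed(load_kw: float, spares: int = 1) -> int:
    """Enough modules to carry the load, plus redundant spares."""
    return math.ceil(load_kw / MODULE_KW) + spares

print(modules_needed(750.0))  # 5 modules for today's load
print(modules_needed(975.0))  # 6 modules after roughly 30% growth

Because capacity is added a module at a time, the upfront spend tracks actual demand instead of a worst-case forecast.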

Mind shift

A big change has occurred in the way companies approach IT investment, notes Booysen, with focus shifting from capital costs to operational expenses. “With the Eskom rate hikes, the focus has changed from capex to opex. There's a lot more emphasis on investing in efficient operations, which pays for itself with the savings it brings.

“From an environmental perspective, and from a savings perspective, keeping running and maintenance costs low makes sense,” explains Booysen.

He adds that there's huge ignorance in the marketplace regarding the building of data centres; they have to be designed with every piece of equipment in mind. “In the past, the various disciplines in the server room concentrated on individual sections, creating gaps,” says Booysen. “Going forward, when data centres are designed, the power, cooling, and security elements will all be designed from a holistic perspective. If it's done upfront and done right, all components from day one will ensure the data centre meets all the requirements of the business.”

Network design is key, he says. All active devices have to be connected and part of the design. “You have to look at things holistically, and not have a cooling specialist here, and an electrician there. Lots of systems are underutilised because they're not designed with an integrated approach.”

Reaping the benefits

Smith says there's been a significant drop in overall electricity consumption in the building where the data centres used to be. He adds, however, that it's difficult to draw a baseline for energy use, because the four old data centres were part of the office space and it's hard to separate out how much power they consumed.

The group has seen greater efficiencies in cooling, power and energy consumption, which has reduced costs significantly, says Smith. There have been no hiccups: no electrical failures, leaks or system failures.

“We're learning fast; we used to do things our own way, but now it's a controlled environment with control systems in place. There are two full-time data centre managers who look at security, maintenance, and keep everything online.”

Smith has sound advice for those thinking of going the green route. “With a green data centre, the best way to go about it is with lots and lots of planning, and to surround yourself with knowledgeable consultants who really know their stuff, and can contribute to the success of the project.”
