They call it The Grid, or computing on-demand, or organic IT, or unbreakable IT. Whatever the terminology, utility computing is based on the idea of a virtual network of computing resources interconnected across hardware platforms and operating systems. These resources, in theory, should be shared, self-managed, efficient and always on.
But definitions are not clean cut. "Utility computing is often confused with grid computing, with on-demand computing, with service-level computing," says Jason Phippen, a Veritas Software director for the EMEA region.
As long as there's uncertainty about the meaning of these two related, but separate, concepts - grid computing and utility computing - customer adoption will be slow. While grid computing is about harnessing the combined power of many computers into a single virtual network, utility computing enables users to focus on their business without worrying about running out of computing power.
Today's technologies enabling on-demand or utility-based computing include Web services, blade servers, storage virtualisation and network route-optimisation.
"Instead of making capital investments in under-utilised servers, databases, applications, storage and network capacity to meet occasional maximum demand, businesses are looking to cut costs by paying only for the ICT resources actually consumed," explains Cor Fawell, executive consultant for strategic solutions at Comparex Africa.
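The economics Fawell describes can be sketched in a few lines: buying for peak demand means paying for capacity that sits idle most of the time, while utility pricing charges only for consumption. All figures below are illustrative assumptions, not vendor pricing.

```python
# Toy comparison of fixed-capacity capex vs pay-per-use pricing.
# Unit prices and workload shape are invented for illustration.

def capex_cost(peak_units: int, unit_price: float) -> float:
    """Buy enough capacity for the occasional peak, whether used or not."""
    return peak_units * unit_price

def utility_cost(hourly_usage: list, price_per_unit_hour: float) -> float:
    """Pay only for the units actually consumed, hour by hour."""
    return sum(hourly_usage) * price_per_unit_hour

# A workload that peaks at 100 units but runs at 20 most of the day:
usage = [20] * 20 + [100] * 4          # 24 hours: mostly quiet, brief peak
capex = capex_cost(100, 1000.0)        # own 100 units outright
utility = utility_cost(usage, 0.50)    # rent capacity by the unit-hour

print(f"capex: {capex:.2f}, utility per day: {utility:.2f}")
```

The break-even point depends entirely on how spiky the workload is, which is why the article's sources keep stressing "occasional maximum demand".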
Forrester Research predicts that this approach could help users cut IT costs in half over the next five years. But the research firm couples the promise with a warning about the problems dogging the grid computing concept, such as "too many moving parts, too hard to monitor, not enough standards and sometimes too many incompatible standards".
Utility or futility?
Big platform vendors such as IBM, Sun, Oracle and HP are all promoting solutions for grid and utility computing. But in reality, warns US-based Nucleus Research, "despite grandiose press releases, grid computing...is very much in its infancy".
"Grid computing is at this point little more than a marketing message from the major vendors looking to sell server management software," says one Nucleus report.
Of the 38 Fortune 500 IT managers Nucleus interviewed, none was using grid computing technologies and 80% had no plans to implement them in the next six months. Only 11% of the interviewees knew what problems they'd use grid computing to solve.
In December, the International Data Corporation (IDC) predicted that during 2004 utility computing would amount to "futility computing" - a phrase coined by Frank Gens, senior VP of research at the firm.
"Although a lot of marketing muscle will continue to promote this concept, real investments will be modest, and largely focused on infrastructure management, not on solving business problems," the report stated. "In the meantime, confusion about the meaning and value of utility computing will persist."
But IDC expects IBM, Sun and HP to present better-defined utility computing road maps by the end of 2004.
Open standards the key
IBM's grid computing platform is generally considered to be at the highest level of open standards compliance. Forrester, for instance, says IBM has the most realistic concept of the grid and is telling the truth about the complexity of getting different technology stacks to collaborate over the grid "on demand", and about the need for work on standards such as the Globus toolkit and a next generation of Web services.
"Operability is one of the key issues," says Dave Botha, an executive at IBM SA. "If we don't have open standards then the whole promise of on-demand computing doesn't come to bear. IBM used to be a proprietary company and we turned 180 degrees; we are now the most open company, we believe, in terms of open standards. A lot of other vendors are coming to the realisation they have to adopt open standards because of customer pressure."
<B>Long way to go</B>
Meta Group warns that utility computing models will not become real (ie, technically usable, financially viable, widely used) until after 2007, but vendors will attempt to create and repackage existing offerings, such as partial outsourcing and buy/lease models, as pseudo-utility offerings.
"Users will face tempting promises of instantaneous power, seamless integration and access to new technologies. However, users must be cautious of the limitations of current models as well as the architectural integration and process change efforts required," says Meta Group.
Wayne Hull, an IBM systems and infrastructure executive for SA and central Africa, explains that grid computing is one of the four foundation technologies for the company`s e-business on-demand strategy. The other three are e-utilities, autonomic computing and Web services.
"IBM views the grid as the foundation on which you build workload management and consolidation; the network on which to federate and distribute data globally; the fabric that ties together highly available resilient computing infrastructure; and a standard for manageability of that infrastructure."
<B>The next big thing: The Grid</B>
A number of trends are promoting the concept of grid computing:
* Blade servers offer the most cost-effective form of commodity clusters and are the most likely future architecture of computing.
* Linux, the commodity operating system, works well on commodity clusters.
* Virtualisation provides access to a diverse pool of resources and thus creates a "single system" illusion.
* Major vendors, such as IBM, HP, Sun, Dell, Oracle and CA, are backing the concept.
Source: Oracle and The Grid, An Oracle White Paper, November 2002
Despite very slow adoption, Hull believes grid computing has passed its infancy stage, pointing to a little-known fact that IBM has over 100 grid clients up and running worldwide. "The reason the world will be moving to grid computing is because it enables businesses to efficiently consolidate, pool, share and manage IT resources," he says.
"Very simply put, grid computing is a new services-oriented architecture that embraces heterogeneous systems and involves distributed computing via open standards. IT departments can aggregate disparate technology capabilities - computing resources, data storage, filing systems - to create unified systems. Grid computing will foster the creation of cost-effective, resilient IT infrastructure that is adaptable to change."
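The aggregation Hull describes - disparate machines registered in one pool, with work placed wherever spare capacity exists - can be sketched as a toy scheduler. The node names, platforms and capacities below are hypothetical; a real grid adds security, data movement and fault handling on top of this idea.

```python
# Minimal sketch of resource pooling: heterogeneous nodes join one
# pool, and jobs land on whichever node has free capacity.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    name: str
    platform: str      # heterogeneous systems: any OS or architecture
    free_cpus: int

class GridPool:
    def __init__(self):
        self.nodes: List[Node] = []

    def register(self, node: Node) -> None:
        """Add a machine's spare capacity to the shared pool."""
        self.nodes.append(node)

    def schedule(self, cpus_needed: int) -> Optional[str]:
        """Place a job on the first node with enough spare capacity."""
        for node in self.nodes:
            if node.free_cpus >= cpus_needed:
                node.free_cpus -= cpus_needed
                return node.name
        return None  # pool exhausted: the job queues or capacity is added

pool = GridPool()
pool.register(Node("mainframe-1", "z/OS", 4))
pool.register(Node("blade-7", "Linux", 16))
print(pool.schedule(8))   # lands on blade-7; the mainframe has too few CPUs
```

The point of the sketch is the consolidation argument: neither machine alone need be sized for the peak, because the pool as a whole absorbs it.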
Hull concedes, though, that there is a lot of scepticism around IT because of the legacy of "big" IT concepts and projects that have been oversold and have under-delivered. "Business executives are looking for much more variability in the way they deploy IT rather than big fixed costs with long-term delivery benefits. They want to see the results much more short-term. Grid computing has to deliver adaptability and flexibility."
Software as an enabler
Veritas's Phippen says the storage management software company's strategy is not to provide utility computing, but to enable utility computing through software. "When you look at some hardware vendors' utility computing strategies, what really comes across is almost a case of having to throw away what you've got - almost putting in an entirely new utility data centre, which will then give you this fluid data infrastructure, enable your IT to adapt. In reality, no customer is going to throw out their data centre."
Veritas enables customers to move towards a computing utility by building on the technology infrastructure they already have in place.
"We recognise that customer environments are heterogeneous and they are going to stay heterogeneous," says Phippen. "Customers want the ability to leverage investments they've already made and flexibility in how they make future purchasing decisions.
"Our value proposition for utility computing is to enable customers to reduce capital and operational expenditure, while maintaining and enhancing service levels. To help them to move away from a rigid IT environment to a more flexible one - and we do this through software."
Wait and see
Oracle SA solutions manager Mohamed Cassoojee says his company's approach is also to help customers leverage investment in existing infrastructure, to pool resources and share them across the enterprise.
"We developed a set of software that can manage all the provisioning and that creates a virtualised vast computer resource that can be used as and when required."
<B>Open standards</B>
The Globus Alliance conducts research to create the fundamental technologies behind The Grid. The aim of the alliance is to design, develop and support the Globus toolkit, an open source software toolkit for building grids.
The toolkit grows through this open source strategy - similar to the Linux operating system. It includes software for security, information infrastructure, resource and data management, communication, fault detection and portability.
Simply put, the toolkit allows users to access remote resources as if they were in their own machine room, while at the same time preserving local control over who can use resources and when.
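That balance - remote access that feels local, with policy enforced by the resource owner - can be illustrated with a toy model. The user names, resource name and time-window policy below are invented for illustration; they are not the Globus toolkit's actual API, which handles this through its security and resource-management components.

```python
# Toy sketch of the access pattern described above: callers use a
# remote resource as if it were local, while the owning site keeps
# control over who may use it and when. All names are hypothetical.
from datetime import time

class RemoteResource:
    def __init__(self, name, allowed_users, open_from, open_until):
        self.name = name
        self.allowed_users = allowed_users   # local control: owner's list
        self.open_from = open_from           # grid jobs allowed from...
        self.open_until = open_until         # ...until this time of day

    def run(self, user: str, now: time, job: str) -> str:
        # The resource owner, not the caller, enforces the policy.
        if user not in self.allowed_users:
            raise PermissionError(f"{user} is not authorised on {self.name}")
        if not (self.open_from <= now <= self.open_until):
            raise PermissionError(f"{self.name} is closed to grid jobs now")
        return f"{job} executed on {self.name}"

# The owning site donates evening hours to the grid:
cluster = RemoteResource("physics-cluster", {"alice"},
                         time(18, 0), time(23, 59))
print(cluster.run("alice", time(20, 0), "render-job"))
```

An unauthorised user, or a job submitted outside the donated window, is rejected by the owner's policy even though the caller never sees where the resource physically lives.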
The Globus Alliance is a non-profit initiative that has made a significant impact on the commercial environment. Companies like Fujitsu, HP, IBM and Oracle have all pursued grid strategies based on the Globus toolkit since 2000. - By Damian Clarkson
Some key technologies underpin this solution, such as low-cost blade servers and fast interconnect technologies between computers that let users share information. "With the convergence of these two technologies, we could develop software that could manage this hardware and make the investment in hardware a lot lower."
Cassoojee expects user adoption to pick up on the back of low-cost computing delivery.
"Economics will revolutionise it: as people see that the cost of infrastructure is coming down, and that they can use low-cost commodity components and get the same quality of service, we feel adoption will be much higher."
IBM's Hull says early adopters are mostly in the financial services industry. "IBM SA will be focusing on the banking and the government sectors. There are significant opportunities for grid computing applications as banks move to be more customer-centric. As for government, the challenge is to leverage the concept of e-government across the grid to give multiple departments flexibility and adaptability."
But do customers understand the value proposition of grid computing? "I don't think they do at all," says Hull. "IT people understand that if you pool IT capability, you can get better leverage. But business people are not interested in the 'how', they are interested in the results."
And there is a long way to go before the results of grid computing are defined and proven. In the meantime, the industry has a lofty promise to strive for: the "next big thing" of computing power available on tap.