SA scientists help build CERN grid

By Leon Engelbrecht, ITWeb senior writer
Johannesburg, 12 Jun 2007

Several South African scientists are involved in one of the planet's biggest science experiments - the quest for the "Higgs particle", the sub-atomic particle thought to give matter its mass.

The experiment gets under way later this year in the Large Hadron Collider (LHC), which lies 50m to 100m under a patch of farmland outside Geneva, Switzerland, and belongs to the European Organisation for Nuclear Research (CERN).

Scientists from the University of Cape Town are involved with data processing for the LHC's Alice (A Large Ion Collider Experiment), while the University of the Witwatersrand is supporting Atlas (A Toroidal LHC ApparatuS).

gLite toolkit

"The LHC experiments will come online at the end of the year and are foreseen to produce some of the biggest experimental data sets ever," says Bruce Becker, a researcher at UCT, CERN and an Italian university. CERN officials put the figure at 15 petabytes a year.

"The issue for the LHC was not only the size of the data, but the complexity - meaning a lot of processing has to be done, and very quickly, in order to get the raw data into a usable format for doing science and writing papers," he adds.
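The scale of the problem is easy to see with a back-of-envelope calculation. Taking CERN's 15 petabytes a year at face value (the SI definition of a petabyte and a non-leap year are assumptions here, used purely for illustration):

```python
# Back-of-envelope estimate of the LHC's average data rate, based on
# CERN's figure of 15 petabytes per year. The byte definition and
# year length below are illustrative assumptions, not CERN's numbers.

PETABYTE = 10**15               # bytes (SI definition)
SECONDS_PER_YEAR = 365 * 24 * 3600

annual_volume = 15 * PETABYTE
avg_rate_mb_s = annual_volume / SECONDS_PER_YEAR / 10**6

print(f"Average sustained rate: {avg_rate_mb_s:.0f} MB/s")
# → Average sustained rate: 476 MB/s
```

Sustaining hundreds of megabytes per second, around the clock, is what makes a single data centre impractical and a distributed grid necessary.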

A key concern is guaranteeing the delivery of data on deadline. "The scientist who gets the data first and writes it up in a report wins the Nobel Prize," he notes. "The theory goes that every member institute of the collaboration should have equal access to the data... So, the LHC experiments definitely need a worldwide data processing and storage grid."

Becker says the LHC computing committee recognised this several years back and developed what eventually became a set of grid services and configuration packages, now known as "gLite". All of this is based on the Globus Toolkit, SOAP and, indirectly, OGSA.

The gLite toolkit is already installed at several sites in Europe. "The interesting thing is that it is quite often installed in centres where multi-disciplinary research is done with high-performance computing, since it fulfils 'federated computing' goals quite well," says Becker.

By this he means that once agreement is reached on the level of service a certain group or project will pay for, gLite can enforce that agreement.

"For example, at the lab at which I [also] work in Cagliari (in Italy), there is an astronomy group, a materials research group, and the high-energy nuclear physics group among others, all using the same computing facility, belonging to different virtual organisations, managed by gLite," says Becker.

"So, the moral of the story is, if a certain party has a well-defined computing goal, it can create a virtual organisation and use gLite to manage its computing needs."
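The idea Becker describes can be sketched as a simple fair-share model: each virtual organisation is granted an agreed slice of one shared facility, and work is admitted only while the organisation stays within its slice. The VO names, core counts and shares below are illustrative assumptions, not gLite's actual configuration format:

```python
# Toy model of virtual organisations (VOs) sharing one computing
# facility. Each VO has an agreed share of the cluster; a job is
# admitted only if its VO would stay within that share. All names
# and numbers here are illustrative, not real gLite configuration.

TOTAL_CORES = 1000

# Agreed service level per VO, as a fraction of the site (assumed).
vo_shares = {"astronomy": 0.2, "materials": 0.3, "nuclear-physics": 0.5}

# Cores currently in use by each VO.
vo_usage = {vo: 0 for vo in vo_shares}

def admit(vo: str, cores: int) -> bool:
    """Admit a job if the VO's agreed share would not be exceeded."""
    limit = int(vo_shares[vo] * TOTAL_CORES)
    if vo_usage[vo] + cores <= limit:
        vo_usage[vo] += cores
        return True
    return False

assert admit("astronomy", 150)      # within astronomy's 200-core share
assert not admit("astronomy", 100)  # would exceed the share; rejected
```

A real gLite site expresses the same policy through its middleware and batch-system configuration rather than application code, but the principle - independent groups consuming one facility under negotiated limits - is the same.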

Spin-off

Becker's UCT CERN Research Centre is working closely with the Centre for High-Performance Computing. It is also forging links with the nuclear theory group at the iThemba Laboratory for Accelerator-Based Sciences, the Wits physics department, and the computer science departments at UCT and the University of Pretoria.

He expects this research to spread into the wider IT and business world, and is already seeing that happen in the bio-medical and earth sciences fields.

In addition, the technology behind the science, gLite, is finding commercial application. IBM and Sun have started to deploy grid technology in some of their product ranges, he points out.

Becker says banks, insurers and multinationals, especially in Europe, are using a grid approach to data processing, as are mobile telephone providers.
