
Rise of the super machines

By Lezette Engelbrecht, ITWeb online features editor
Johannesburg, 17 May 2011

When it comes to supercomputer specs, the scale is so vast one really needs some concrete form of comparison, or risks losing all sense of comprehension in a blur of zeroes. So here's a thought: if a person with a pocket calculator had to do 30 trillion calculations, it would take three million years of nonstop, 24/7 button-pushing... and very fast fingers.

Or, one could stop by the Centre for High-Performance Computing (CHPC) in Cape Town and get it done in, well, a second. Mira, the supercomputer IBM is building for the US Department of Energy, won't even take a full 'Mississippi'. It will perform more than 10 000 trillion calculations per second - making it four times more powerful than China's Tianhe-1A, currently considered the world's fastest.
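As a rough sanity check of those figures, the back-of-envelope sketch below redoes the arithmetic; the calculator operator's pace and Tianhe-1A's roughly 2.57-petaflop Linpack rating are assumptions added for the illustration, not figures from the article.

```python
# Back-of-envelope check of the figures above. The calculator operator's
# pace and Tianhe-1A's ~2.57 PFLOPS rating are assumptions for illustration.
SECONDS_PER_YEAR = 365 * 24 * 3600          # ~3.15e7 seconds

calculations = 30e12                        # 30 trillion calculations
hand_years = 3e6                            # quoted: ~3 million years by hand
secs_per_calc = hand_years * SECONDS_PER_YEAR / calculations
print(f"By hand: one calculation every ~{secs_per_calc:.1f} s")   # ~3 s each

chpc_flops = 30e12                          # ~30 trillion ops/s clears it in a second
mira_flops = 10_000e12                      # Mira: 10 000 trillion calculations/s
tianhe_flops = 2.57e15                      # Tianhe-1A (assumed ~2.57 PFLOPS)
print(f"CHPC time: ~{calculations / chpc_flops:.0f} s")
print(f"Mira vs Tianhe-1A: ~{mira_flops / tianhe_flops:.1f}x")     # ~3.9x, i.e. roughly four times
```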

It's just as well these mega minds are getting faster by the year, because the kinds of problems they're working on - climate predictions, energy security, medical uncertainties - are growing at an equally cracking rate.

Dr Happy Sithole, director of the CHPC, which houses the largest supercomputer on the continent, says while supercomputers may seem like the stuff of sci-fi legend, they're actually being used for practical, everyday applications.

“Take a simple example - crash tests. Usually, physical prototypes need to be individually built and tested for safety, which is time- and cost-intensive,” explains Sithole. “But computer simulations have become so accurate that one can run tests of virtual models, which cuts down the time needed to get to a solution.”

Not only are computers getting faster, adds Sithole, but the methods of using them are improving. “South African researchers are getting access to tools and software that help them to make the country more competitive.”

Researchers at the University of Limpopo, for example, are using the CHPC's supercomputing facilities to research ways of making more affordable, energy-efficient lithium-ion batteries.

This could help place the country on the map in terms of advanced battery technology, says Sithole, and further the development of electric cars and other clean technologies.

Apart from finding new forms of energy, researchers are also analysing the climatic effects of previous methods. The recent devastation in Japan served as a grim reminder of the importance of accurate climate data, and its role in disaster management - areas in which SA is ill-equipped, says Sithole.

He explains that while many other countries know how they'll respond to climate changes, if a disaster takes place in SA, there's little modelling available for effective contingency planning.

“It goes beyond just knowing about the disaster - we should be able to mitigate the effects, and organise evacuations if necessary.”

Dr Francois Engelbrecht and Mary Jane Bopape, from the CSIR's Natural Resources and Environment division, are using supercomputers to formulate climate and weather predictions over various time scales. The CSIR's Atmospheric Modelling Strategic Initiative aims to develop simulation and forecast systems that can provide seasonal forecasts and climate change projections over decades.

“It may be said that the current CHPC computational infrastructure and data storage support during operational model integration has made feasible the dawn of a new era in climate simulation in SA,” say the researchers.

According to Sithole, the CHPC's supercomputing facilities - a Blue Gene/P, Linux-based cluster platform, and Sun hybrid system - enable local researchers to use computer simulations for experiments not previously possible, such as those that are highly expensive or dangerous.

Projects that get allocated time on the supercomputers are aimed at addressing SA's developmental challenges, and aligned with national systems of innovation, including energy security, climate change, human language technologies, astronomy, and health, he adds.

New frontiers

While SA has set the pace, Sithole says there have been positive developments in other African countries, such as Egypt and Tanzania, where small systems are being built. He adds that the main message emerging from a recent SADC HPC workshop was the eagerness of other countries to develop HPC capacities, and that they were looking to SA for guidance on these initiatives.

Sithole is inspired by China's approach to the computing field, as its Top 500 Tianhe-1A machine incorporates nationally developed technology. He believes the supercomputing field could generate new types of jobs in SA and other African countries.

“Looking at HPC now, it's expensive, but we can reduce costs and open up the market to the rest of Africa by improving applications and driving innovation.”

In SA, he says, users sometimes lack sufficient training to perform large computations, which require an understanding of their applications and how to optimise them for large HPC architectures.

“This problem is primarily due to the lack of access to entry-level systems for most users, who then reach the national facility without prior knowledge of the scalability of their codes.”

According to Sithole, the country needs to develop institutional mid-range systems - typically a couple of hundred cores - on which users can start running their applications before they move to a national facility. Then they'll be better prepared to focus on applications that require on the order of hundreds to thousands of cores, he explains.
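To make 'scalability' concrete, here is a minimal sketch applying Amdahl's law - a standard rule of thumb, not a method attributed to the CHPC - with an assumed 5% serial fraction, showing why a code that runs well on a couple of hundred cores may gain little from thousands without further optimisation.

```python
# Minimal scalability sketch using Amdahl's law (a standard rule of thumb;
# the 5% serial fraction is an illustrative assumption, not a CHPC figure).
def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    """Ideal speedup on `cores` cores when `serial_fraction` of the work cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

serial = 0.05
for cores in (200, 1_000, 4_000):
    print(f"{cores:>5} cores: ~{amdahl_speedup(cores, serial):.0f}x speedup")
# A few hundred cores already capture most of the possible gain; moving to
# thousands helps little until the application itself is re-optimised.
```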

“We need more young people who think differently in computing.”

Sithole believes promoting awareness of the field among young people, as well as at industry level, will help update its image and foster interest.

“With any new career option, people look at role models within that career. We need to show this career as something sustainable - with resources not just being adopted in universities, but also in industry. If people start seeing supercomputing being used in industry, they know they can get a good job.”

Galaxy gazing

Another sector that's been appealing to young talent is the Square Kilometre Array (SKA), set to be the world's most powerful telescope. Preparations in the form of the precursor Karoo Array Telescope, MeerKAT, have gained much interest from the research community and the general public, as SA bids against Australia to be the SKA host country.

Whether SA wins the bid or not, the telescope facilities being built will need to process and disseminate data throughout SA, the continent, and the rest of the world - all of which will require significant computing resources.

Sithole notes that the SKA brings challenges to both the computer and energy supply industries. “The significant requirements brought about by its remote location (semi-desert conditions) and the large amounts of data that are captured and require pre-processing on-site require innovation on how to power and cool computer equipment and the telescopes...”.

There's also the issue of transporting big chunks of data, not only to different centres in the country, but around the globe.

“We need to think about connectivity to such a remote area, and electricity grid modification,” says Sithole. “Alternative methods of energy supply and cooling of computers are already pushed beyond boundaries by this challenge.”
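To give a feel for the scale of the data-movement problem, the sketch below works out how long a day's worth of pre-processed data would take to move over links of various speeds; the 100 TB daily volume and the link speeds are assumptions for illustration, not SKA specifications.

```python
# Illustration of the data-movement problem. The 100 TB daily volume and the
# link speeds are assumptions for the example, not SKA specifications.
daily_data_tb = 100                              # assumed pre-processed data per day
data_bits = daily_data_tb * 1e12 * 8             # terabytes -> bits

for gbps in (1, 10, 100):
    hours = data_bits / (gbps * 1e9) / 3600
    print(f"{gbps:>3} Gbps link: ~{hours:.1f} hours to move one day's data")
# On a 1 Gbps link the transfer takes over nine days, which is why dedicated
# high-bandwidth research networks matter.
```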

He adds that the technologies developed should extend to applications beyond astronomy, to justify the economy of scale.

The bid is also heavily focused on collaboration outside SA's borders, with the local proposal emphasising that other African countries will benefit from the research capacity, skills demand and international interest the SKA will bring.

Professor Catherine Cress, from the Department of Physics at the University of the Western Cape (UWC), explains the value of using simulations to conduct research in computational astrophysics and cosmology:

“In order to understand the universe as a whole, we need to be able to test theories with observations. Computational cosmology allows us to do this by simulating a theoretical universe, based on theoretical principles and then comparing the results with the observable universe.”

The field of computational cosmology is relatively young, notes Cress, as this type of research was not possible a few decades ago due to lack of computational resources.

“When computers first began to be used for scientific research, the simulations that were possible only consisted of a few hundred particles. Nowadays, due to the massive advances in computational power, our simulations consist of many billions of particles and contain complex physics.”
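For a sense of what 'particles' mean here, the toy sketch below advances a hundred gravitating particles by direct summation; production cosmological codes track billions of particles with tree or mesh solvers and an expanding background, none of which is attempted in this illustration.

```python
# Toy illustration only: a direct-summation gravitational N-body integration.
# Real cosmological simulations use billions of particles, tree/mesh solvers
# and an expanding background; none of that is modelled here.
import numpy as np

rng = np.random.default_rng(0)
n, G, dt, softening = 100, 1.0, 0.01, 0.05       # toy units, not physical
pos = rng.standard_normal((n, 3))                # random initial positions
vel = np.zeros((n, 3))                           # particles start at rest
mass = np.ones(n)
pos0 = pos.copy()

def accelerations(pos):
    # Pairwise separation vectors: diff[i, j] points from particle i to particle j.
    diff = pos[None, :, :] - pos[:, None, :]
    dist3 = (np.sum(diff**2, axis=-1) + softening**2) ** 1.5   # softened |r|^3
    return G * np.sum(diff * (mass[None, :, None] / dist3[:, :, None]), axis=1)

for _ in range(100):                             # simple kick-drift time stepping
    vel += accelerations(pos) * dt
    pos += vel * dt

print("RMS displacement after 100 steps:",
      np.sqrt(np.mean(np.sum((pos - pos0)**2, axis=1))))
```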

“It's opened up a whole new field of research,” says Daniel Cunnama, a PhD student at the UWC, who is working on large-scale cosmological simulations being run at the CHPC.

“The advantage is that simulations allow us to test out theories against the observable universe.”

His overall hope for the PhD project is that it will help to predict the kind of experiments possible with the SKA. “The SKA means a great deal for the country - a lot of effort and money is going into it, and it's going to develop a lot of expertise.”

Not only will the telescope require massive computational resources, but the machines will need to communicate very fast, which means high-speed, high-bandwidth connectivity.

To enable this speedy transfer, the CHPC has been linked to the South African National Research Network to ensure fast, uncapped broadband capacity.

“Access to supercomputing has allowed South African researchers to get involved in a new and growing field,” says Cunnama. He adds that the field is constantly advancing, with simulations done 10 years ago now hopelessly out of date.

“It's a rapidly moving field and it's good for SA to be on board and growing expertise in this field.”

Bigger brains

According to Sithole, there are plans to continue upgrading the supercomputing facilities at the CHPC, as the nature of computer technologies means systems generally become obsolete after six years.

“A sustainable way of ensuring relevance of technology without major costs is to develop a roadmap that is sensitive to these developments,” he says.

Sithole hopes new skills will be developed in the field of supercomputing, through the adoption of HPC by most scientific and engineering research areas.

“Supercomputing is key in driving the competitive industries and economy, especially when infrastructure is complemented by skilled people,” says Sithole.

“These skills should be drawn by the visibility of industry in absorbing these people in terms of tangible jobs, which in turn helps reduce concept-to-market time for our industry and improves competitiveness of the country in general.”

According to Sithole, advances in technology are no longer a problem - they're coming fast and furiously. The challenge now is to familiarise people with supercomputers - how to use them, and the ways they could change the world as we know it.

