Computing for the greater good
Wits University is developing a processing board for use at the Large Hadron Collider that it hopes to commercialise for wider application.
IBM's poster-boy of supercomputing, Watson, has often grabbed the headlines for demonstrating the potential of massive computing power - in Watson's case, this has focused on its artificial intelligence capabilities.
Similar ambitions are being explored at Wits University, where work is feverishly underway to produce a supercomputer that could propel South Africa onto the world stage.
That stage sits adjacent to the high-performance computing platform, as these home-grown ambitions focus on high-throughput computing. The former is about harnessing raw CPU power; the latter is about sustaining a constant flow of signal and data processing.
The significance of this work, and the potential for world recognition, is that the technology is being developed for use at the Large Hadron Collider (LHC), which is scheduled for a major overhaul in the early 2020s. The hardware will be used specifically in the ATLAS detector, which last year confirmed the existence of the Higgs boson.
South Africa's involvement in a global project of this magnitude and significance may come as a surprise to the uninitiated, but the country actually has a long-standing involvement with CERN, the European particle physics laboratory outside Geneva, Switzerland.
The Wits School of Physics, in collaboration with the electrical and information engineering faculties, is developing a 19-inch processing board able to sustain far greater data-processing rates than the detector's current hardware. While this in itself isn't groundbreaking, the underlying technology is.
"We're aiming at a board that should be able to sustain throughput of one terabit per second," says Professor Bruce Mellado of the Wits School of Physics. "The long-term goal is to provide hardware by the beginning of the 2020s, although, of course, we will be ready long before that."
Higher data rates
Particle physics as practised at the ATLAS Tile Calorimeter currently requires data transfer rates of 320MB/s during operation. The upgrade of the facility will result in significantly higher throughputs.
Mellado says that while the technology exists to produce boards that can handle these higher data rates, their cost rises steeply with those capabilities. The Wits team is therefore developing its platform on ARM processors, which are commonplace today in smartphones and tablets. These processors are known for delivering strong processing performance while consuming far less power and generating less heat.
These advantages, Mellado believes, will enable Wits to produce a board for around $10 000 that has the added benefits of lower power consumption and lower related infrastructure requirements.
"We've reported to the community and so far I think it's very promising. Right now, we have the basis to say we're going to make it. I can't say what the exact price will be by the time we're done, but the $10 000 price is achievable with today's technology," Mellado says.
Instead of focusing solely on the prestige that would come the way of the university and South Africa's scientific community, Mellado believes these advances can be commercialised to deliver high throughput computing on mid- to low-cost hardware.
This would have a natural audience in the research and science communities, but equally in industries with high processing requirements, such as those interrogating big data.
The Department of Science and Technology (DST) has already indicated an interest in how this kind of hardware could be applied in workstations or even laptops at around $250 a unit, Mellado says.
These boards would provide the processing power of a high-end laptop while handling data throughput of up to five gigabits per second.
"At the pace we're progressing and the amount of manpower we're able to put together, I'm optimistic we can have our first working prototype by the end of 2014," he says. "Then, within one to two years after that, we expect the first stage of this project to be completed by delivering a board with low to mid-end core technology that sustains a terabit per second.
"That kind of capability has the potential to be a product embedded in the big sciences and big industry. How that will be commercialised is not something I can predict, but I'm sure the DST and university will find way to make it happen."
These plans may seem overly ambitious, but then so did the concepts of the World Wide Web and cloud computing, both of which stem from ideas hatched at CERN to improve the facility's capabilities.
First published in the April 2014 issue of ITWeb Brainstorm magazine.