
The next frontier

Computing architecture inspired by the brain.

By Kathryn McConnachie, Digital Media Editor at ITWeb.
Johannesburg, 07 Mar 2013
IBM's SVP and director of research, Dr John Kelly, says the intention is not to replace humans, but rather to provide new tools for living in a world of "enormous data".

IBM's intelligent supercomputer, Watson, has come a long way since it won the popular quiz show Jeopardy.

In developing Watson into the ultimate supercomputer to help professionals make better-informed decisions in a world of big data, IBM has now turned its attention to building an entirely new underlying architecture for cognitive computing - one inspired by the human brain.

Speaking at an open lecture held at Wits University recently, senior VP and director of IBM Research Dr John Kelly said: "We think it is possible to build a very interesting architecture that will be more human-like and more biologically inspired than what we've built in the past with brute force."

According to Kelly, the intention behind Watson is not purely to try to recreate the human brain: "But we must produce computer systems that will allow human beings to live in a big data world. If we don't - if we don't extract this information - we're starting to leave a lot of knowledge on the floor, and I think we'll just be completely overwhelmed by data and actually start to make bad decisions."

Watson recently completed its "medical residency" and has produced its first commercially available applications for doctors and health insurance companies.

Despite Watson's achievements thus far, it still has a long way to go, as Kelly explains: "The brain consumes roughly 20W of power; it's a very efficient machine considering what it does. By comparison, Watson in Jeopardy consumed 85 000W - it is still leaps and bounds away from what the human mind and brain can do."
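
That gap is easy to quantify. A quick back-of-envelope check of the figures Kelly quotes (a purely illustrative Python snippet):

    # Power figures quoted by Kelly: human brain vs Watson at Jeopardy.
    brain_watts = 20
    watson_watts = 85_000

    # Watson drew over four thousand times the brain's power budget.
    print(f"Watson used {watson_watts / brain_watts:,.0f}x the power of a brain")
    # -> Watson used 4,250x the power of a brain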

Radically different

Dharmendra Modha, manager of cognitive computing systems and master inventor at IBM, says for over half a century, computers have been "little better than calculators with storage structures and programmable memory". On the other hand, the human brain remains the world's most sophisticated computer. "[The brain] can perform complex tasks rapidly and accurately using the same amount of energy as a 20W light bulb in a space equivalent to a two-litre soda bottle.

"Making sense of real-time input flowing in at a dizzying rate is a Herculean task for today's computers, but would be natural for a brain-inspired system," says Modha.


As a result, IBM's researchers (and a team of collaborators from multiple universities) have been working on a major cognitive computing project called Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE).

"We are trying to bring together neuroscience, supercomputing and nanotechnology to create a radically different computer architecture that mimics the function, low power, small size and real-time of the human brain," says Modha.

5in5

IBM's 2012 5in5 forecast of inventions that will change the world in the next five years focused on five sensory categories:

TOUCH - infrared and haptic technologies will enable a smartphone's touch screen and vibration capabilities to simulate the physical sensation of touching something.

SIGHT - IBM says "computer vision" will be able to analyse patterns to make sense of visuals in the context of big data. This could be applied in industries such as healthcare, retail and agriculture.

HEARING - sensors will be able to pick up sound patterns and frequency changes, and could be used to predict weakness in structures before they buckle, decipher the meaning behind a baby's cry, or advance dialogue across languages and cultures (see the sketch after this list).

TASTE - IBM is working on a way to "compute" perfect meals by using an algorithmic recipe of favourite flavours and optimal nutrition for any given individual.

SMELL - computers will be imbued with a sense of smell by virtue of sensors that will detect and distinguish odours, chemicals, biomarkers and even molecules in the breath that affect personal health.
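
The structural-monitoring idea under HEARING is, at its core, frequency analysis: a structure vibrates at characteristic frequencies, and a drift in those frequencies can flag developing weakness. A minimal sketch of that principle follows; the signals, sample rate and 10% threshold are invented for illustration, not drawn from IBM's work.

    import numpy as np

    def dominant_frequency(signal, sample_rate):
        # Strongest frequency component of a vibration reading (skip DC).
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        return freqs[np.argmax(spectrum[1:]) + 1]

    sample_rate = 1000.0  # Hz
    t = np.arange(0, 1.0, 1.0 / sample_rate)

    # Simulated readings: a healthy structure resonating at 120Hz, and a
    # weakened one whose resonance has drifted down to 95Hz.
    healthy = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(t.size)
    weakened = np.sin(2 * np.pi * 95 * t) + 0.1 * np.random.randn(t.size)

    baseline = dominant_frequency(healthy, sample_rate)
    current = dominant_frequency(weakened, sample_rate)

    # Treat a drift of more than 10% from baseline as a warning sign.
    if abs(current - baseline) / baseline > 0.10:
        print(f"Possible weakness: {baseline:.0f}Hz -> {current:.0f}Hz")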

Phase two of the SyNAPSE project has been awarded about $21 million by the Defense Advanced Research Projects Agency (DARPA). The project involves the use of advanced algorithms and silicon circuitry to enable cognitive computers to learn through experiences, find correlations, create hypotheses, and to remember and learn from the outcomes.
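The "find correlations" part of that description maps naturally onto Hebbian plasticity - the classic "neurons that fire together wire together" rule from neuroscience. The toy sketch below shows correlation-driven weight growth; it is a textbook learning rule, not SyNAPSE's actual circuitry.

    import numpy as np

    rng = np.random.default_rng(0)

    # Two inputs feeding one output neuron; weights start equal and small.
    weights = np.array([0.1, 0.1])
    learning_rate = 0.05

    for _ in range(500):
        signal = rng.integers(0, 2)  # this input drives the output
        noise = rng.integers(0, 2)   # this input is uncorrelated with it
        inputs = np.array([signal, noise], dtype=float)
        output = float(signal)

        # Hebbian update: a weight grows when its input fires with the
        # output; mild decay keeps the weights bounded.
        weights += learning_rate * inputs * output
        weights *= 0.98

    print(weights)  # the correlated input's weight ends up dominant
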

So far, Modha says they have successfully created breakthrough chips on the scale of a worm brain, and now they're on the path to create a new chip on the scale of the human brain. The chips are currently able to recognise simple images, numbers and letters, all through pure learning and zero programming.
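
Brain-inspired chips of this kind are typically built from spiking neurons rather than conventional logic. A minimal leaky integrate-and-fire neuron - the standard building block in neuromorphic designs - looks something like this (the leak and threshold values are arbitrary illustrations, not IBM's parameters):

    # Leaky integrate-and-fire neuron: the membrane potential accumulates
    # input current, leaks over time, and spikes on crossing a threshold.
    def lif_neuron(input_currents, leak=0.9, threshold=1.0):
        potential = 0.0
        spikes = []
        for current in input_currents:
            potential = potential * leak + current
            if potential >= threshold:
                spikes.append(1)
                potential = 0.0  # reset after firing
            else:
                spikes.append(0)
        return spikes

    print(lif_neuron([0.3] * 10))            # weak input: spikes as charge builds
    print(lif_neuron([0.3, 0.3, 1.2, 0.1]))  # a strong burst triggers a spike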

"In the quest for cognitive computing, we have mapped the largest long-distance wiring diagram from the monkey brain. Each bundle of wire represents a physical connection the brain makes over long distances. Taken as a whole, this is the most comprehensive glimpse that we've had into the structure of the brain and how the structure's modular dynamics give rise to behaviour," says Modha.

A new platform

Kelly says the intention is to try and understand the patterns within the brain: "Neurons and synapses are not a clear network compared to the way we structure today's computers with 1s and 0s on a bunch of layers of capabilities.

"We are trying to physically produce a new underlying structure for a computer system that will be truly cognitive. We will still be off by about three orders of magnitude in terms of the density of synapses and neurons in the human brain; we'll also still be off by about two orders of magnitude in terms of power consumption.

"But this will give us, literally by the end of this year, a new platform upon which to experiment in learning techniques and new architectures that are based on these massive networks rather than traditional computing architectures."

IBM envisions being able to package the computational power of a human brain in a container the size of a shoebox. Kelly says, by the end of the year, the company will have an extremely powerful chip. "We will put hundreds of these devices together into something like a box and hopefully we will have a device that will be a truly learning system, and one that is the next step in the era of cognitive computing."

According to Kelly, the next-generation Watson is "a really big deal". The first-generation Watson in Jeopardy took a single question and presented a single answer. But that is not the way most complex problems present themselves - and it is certainly not the way they present themselves in healthcare.

Learning machine

The supercomputer is named after IBM co-founder Thomas Watson and is a project of IBM's research labs. Watson 1.0 was the first generation of a cognitive computing system, with the ability to exercise judgement in game play and some perception of its environment.

According to IBM, Watson's performance has improved by 240% since it rose to prominence by beating the reigning human champions at the popular quiz show, Jeopardy, two years ago. The supercomputer can take a question in natural language and search all of the data that has been fed into its system and find the correct answer through statistical ranking.
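
That last step - candidate answers scored against the ingested evidence, then sorted by confidence - can be pictured in a few lines. The features and weights below are invented for illustration; Watson's underlying DeepQA pipeline combined hundreds of learned scorers.

    # Toy evidence-based answer ranking in the spirit of DeepQA.
    candidates = {
        # answer: (passage_support, type_match, popularity) - invented features
        "Toronto": (0.40, 0.20, 0.70),
        "Chicago": (0.85, 0.90, 0.60),
    }
    weights = (0.5, 0.3, 0.2)  # illustrative; the real system learned these

    def confidence(features):
        return sum(w * f for w, f in zip(weights, features))

    ranked = sorted(candidates, key=lambda a: confidence(candidates[a]), reverse=True)
    for answer in ranked:
        print(f"{answer}: {confidence(candidates[answer]):.2f}")
    # Watson answers with the top-ranked candidate and reports its confidence.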

Watson, at its core, is a learning machine. Kelly says: "It literally gets smarter the more it's used. One thing the other Jeopardy contestants didn't realise is that Watson actually knew more about them than they knew about themselves. It had studied what they knew and their game behaviour. So in a sense, it had some perception of the environment it was in."

"In healthcare, as in many situations, you are presented with many different pieces of information. Some of it contradictory and some of it incomplete; what you want to do is get those different pieces of information down to a set of possible causes and some statistical weighting of those. This new technology does that."

Kelly says images will be the next big thing for Watson in the medical field: "The goal here is to produce a system that will be as accurate, or more accurate, than a radiologist.

"Where we want to go with Watson is not just question and answer, and not just using paths to find things. We have very dense research going on in complex analysis and complex interactions with Watson," says Kelly.

Rise of the machines

When asked if he foresees supercomputers such as Watson ever actually replacing humans in certain situations, Kelly says: "There will be an explosion in machine-to-machine interactions, and I can see a Watson-like machine taking over decision-making in that type of non-critical situation where the risk level is reasonable. Where I do not see Watson or cognitive systems going is in replacing the final human judgement - that is a very difficult thing to do."

Kelly says an important factor is that, thus far, they have found no evidence to suggest computers are capable of being creative. "Humans have this ability that, no matter how much data and experience we have, we can still create outside of our knowledge. While Watson has surprised us and we've wondered how it knew something, we've always been able to trace back to where it found the information, we've never found it being creative."

Dr Watson

Watson 1.0 was put to work at WellPoint and Memorial Sloan-Kettering Cancer Center for over a year, and earlier this month, IBM announced the launch of the first two commercial applications that have been created as a result of Watson's "residency" at these institutions.

This is a result of the supercomputer ingesting more than 600 000 pieces of medical evidence, two million pages of text from medical journals and clinical trials, and 1.5 million patient records, while also being taught to analyse and interpret the information.

One of the commercial applications helps to assess treatments for lung cancer while the other helps to manage health insurance decisions and claims. IBM says: "In both applications, doctors or insurance company workers will access Watson through a tablet or computer. Watson will quickly compare a patient's medical records to what it has learned and make several recommendations in decreasing order of confidence. In the cancer program, the computer will be considering what treatment is most likely to succeed. In the insurance program, it will consider what treatment should be authorised for payment."

In healthcare, Kelly says Watson will never replace a doctor: "You will always want a doctor as a final decision-maker. But we will reach a point where, as a patient, you will demand that your doctor has access to a Watson, because the amount of information that is out there is simply beyond what a human being can possibly know.

"We're not trying to replace humans, we're trying to bring a new set of tools to the party that will allow humans to be much more effective in this world of enormous data. As long as in those critical situations that require judgement and creativity, we have a human being involved, it will be just fine," says Kelly.

According to Kelly, best-practice medical protocols are currently produced in large medical institutions by groups of highly experienced doctors and experts sitting around a table, discussing and sharing different protocols and outcomes, and then deciding on best practice on that basis.

"Think about having a very intelligent Watson at that table, as not only a resource to search massive amounts of data and statistical rankings of different protocols, but actually being able to say: 'You decided that this was the best protocol, but let me tell you why you would probably want to reconsider that'.

"It might actually be able to debate and present new information to decision-makers. That will have a profound impact on the role of IT in literally every industry."

Think like humans

Chief innovation officer at IBM Bernard Meyerson says Watson marks a turning point in computing. "But while Watson can understand all manner of things and learns from its interactions with data and humans, it is just a first step into a new era of computing that's going to produce machines that are as distinct from today's computers as those computers are from the mechanical tabulating devices that preceded them. A host of technologies is coming that will help us overcome our limitations and transform the way we interact with machines and with each other."

The SyNAPSE project is seeking to accurately map and understand the patterns of neurons and synapses in the human brain, in order to use them as a blueprint for next-generation cognitive computing architectures.

Meyerson says that over the coming years, computers will become increasingly adept at dealing with complexity: "Rather than depending on humans to write software programs that tell them what to do, they will program themselves so they can adapt to changing realities and expectations. They'll learn by interacting with data in all of its forms - numbers, text, video, etc. And, increasingly, they'll be designed so they think more like humans.

"This isn't about replacing human thinking with machine thinking," says Meyerson, adding that such a thing is unnecessary. "Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results - each bringing their own superior skills to the partnership. The machines will be more rational and analytic. We'll provide the judgement, empathy, morale compass and creativity.

"In my view, cognitive systems will help us overcome the 'bandwidth' limits of the individual human."
