
Cognitive computing on the rise

By Staff Writer, ITWeb
Johannesburg, 03 Jan 2013

Technological innovations are heading into an era of cognitive systems whereby machines will learn, adapt, sense and begin to experience the world as it really is.

This is according to IBM's seventh annual "IBM 5 in 5" - a list of innovations the company believes have the potential to change the way people work, live and interact during the next five years.

The company says this year's predictions focus on one element of the new era: the ability of computers to mimic human senses in their own way - to see, smell, touch, taste and hear.

These sensing capabilities will help us become more aware and more productive, and will help us think - but not think for us, the computing giant explains.

Cognitive computing systems will help us see through complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives, and break down all kinds of barriers - including geographic distance, language, cost and inaccessibility, it adds.

"IBM scientists around the world are collaborating on advances that will help computers make sense of the world around them," says Bernie Meyerson, IBM Fellow and VP of innovation.

"Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges."

Below are IBM's five predictions that it says will define the future.

Touch

According to IBM, scientists are developing applications for the retail, healthcare and other sectors using haptic, infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric, as a shopper brushes his or her finger over the image of the item on a device screen.

It notes that, utilising the vibration capabilities of the phone, every object will be given a unique set of vibration patterns that represents the touch experience: short, fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material, it points out.
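The idea of encoding each material as its own vibration pattern can be illustrated with a minimal sketch. The fabric names and the pattern values below are invented for illustration; the article does not specify how such patterns would actually be defined.

```python
# Hypothetical mapping from fabric type to a vibration pattern,
# expressed as (duration_ms, intensity) pulses. Values are illustrative only.
FABRIC_PATTERNS = {
    # silk: short, fast, gentle pulses
    "silk":   [(20, 0.2), (20, 0.2), (20, 0.2)],
    # linen: longer, stronger pulses suggesting a coarser weave
    "linen":  [(80, 0.8), (60, 0.7)],
    # cotton: medium pulses in between
    "cotton": [(50, 0.5), (50, 0.5)],
}

def pattern_for(fabric):
    """Return the vibration pattern for a fabric, defaulting to one medium pulse."""
    return FABRIC_PATTERNS.get(fabric, [(40, 0.5)])

print(pattern_for("silk"))  # three short, gentle pulses
```

On a real handset these pulse lists would be handed to the platform's haptics API; here the lookup alone conveys the idea of one distinct tactile signature per material.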

IBM says current uses of haptic and graphic technology in the gaming industry take the end-user into a simulated environment.

However, it states that the opportunity and challenge is to weave the technology so seamlessly into everyday experiences that it brings greater context to our lives. This technology will become ubiquitous, IBM adds, turning mobile phones into tools for natural and intuitive interaction with the world around us.

Sight

In the next five years, says IBM, systems will not only be able to look at and recognise the contents of images and visual data, they will turn pixels into meaning, beginning to make sense of visual data much as a human views and interprets a photograph.

In the future, it adds, 'brain-like' capabilities will let computers analyse features such as colour, texture, patterns or edge information and extract insights from visual media. The company believes this will have a profound impact for industries such as healthcare, retail and agriculture.

"Within five years, these capabilities will be put to work in healthcare by making sense out of massive volumes of medical information such as MRIs, CT scans, X-Rays and ultrasound images to capture information tailored to particular anatomy or pathologies.

"What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images - such as differentiating healthy from diseased tissue - and correlating that with patient records and scientific literature, systems that can 'see' will help doctors detect medical problems with far greater speed and accuracy."
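The training step IBM describes - teaching a system to discriminate healthy from diseased tissue - can be caricatured with a toy sketch. This is not IBM's method; the patches, pixel values and single brightness feature below are invented to show the shape of the idea: extract a feature, compare it against labelled examples, output a label.

```python
# Toy classifier: label a grayscale image patch "healthy" or "diseased"
# by comparing a simple brightness feature against labelled training patches.
# All data below is invented for illustration.

def mean_brightness(patch):
    """Average pixel value of a flat list of 0-255 grayscale pixels."""
    return sum(patch) / len(patch)

# Hypothetical training patches: diseased tissue happens to image darker here.
training = [
    ([200, 210, 190, 205], "healthy"),
    ([195, 205, 200, 210], "healthy"),
    ([60, 70, 65, 80], "diseased"),
    ([75, 55, 60, 70], "diseased"),
]

def classify(patch):
    """Label a patch by the training example with the closest brightness."""
    nearest = min(training,
                  key=lambda t: abs(mean_brightness(t[0]) - mean_brightness(patch)))
    return nearest[1]

print(classify([190, 200, 195, 205]))  # -> healthy
print(classify([65, 70, 60, 75]))      # -> diseased
```

A real system would learn far richer features (colour, texture, edges, as the article notes) and correlate them with patient records, but the learn-from-labelled-examples loop is the same.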

Hearing

IBM believes that, within five years, a distributed system of clever sensors will detect elements of sound, such as sound pressure, vibrations and sound waves, at different frequencies. The system will interpret these inputs to predict, for example, when trees will fall in a forest or when a landslide is imminent, says IBM. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.

"Raw sounds will be detected by sensors, much like the human brain. A system that receives this data will take into account other 'modalities', such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and the ability to recognise patterns," the computing giant explains.
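The classify-by-previous-knowledge step can be sketched minimally: represent each learned sound category as a prototype of sensor features, then assign a new reading to the nearest one. The categories and feature values here are invented assumptions, not anything specified in the article.

```python
# Toy sound classifier: readings are (dominant frequency in Hz, sound
# pressure level in dB); each category is a learned prototype reading.
# Prototype values are invented for illustration.
import math

PROTOTYPES = {
    "tree_cracking": (150.0, 70.0),
    "rockfall":      (40.0, 85.0),
    "background":    (500.0, 40.0),
}

def classify_sound(freq_hz, level_db):
    """Return the label of the prototype closest to the reading."""
    return min(
        PROTOTYPES,
        key=lambda label: math.dist((freq_hz, level_db), PROTOTYPES[label]),
    )

print(classify_sound(45.0, 80.0))  # -> rockfall
```

A deployed system would fuse many sensors and modalities, as the article says, but nearest-prototype matching against learned examples captures the basic pattern-recognition idea.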

"For example, 'baby talk' will be understood as a language, telling parents or doctors what infants are trying to communicate. Sounds can be a trigger for interpreting a baby's behaviour or needs. By being taught what baby sounds mean - whether fussing indicates a baby is hungry, hot, tired or in pain - a sophisticated speech recognition system would correlate sounds and babbles with other sensory or physiological information, such as heart rate, pulse and temperature."

IBM also points out that, in the next five years, by learning about emotion and being able to sense mood, systems will pinpoint aspects of a conversation and analyse pitch, tone and hesitancy to help us have more productive dialogues that could improve customer call centre interactions, or allow us to seamlessly interact with different cultures.
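One of the conversational cues mentioned, hesitancy, lends itself to a crude numeric sketch: measure what fraction of a call is spent pausing. The segment data and the 0.25 threshold are invented assumptions for illustration, not a real measure of emotion.

```python
# Crude "hesitancy" measure: each segment is (seconds_of_speech,
# seconds_of_pause_after). Data and threshold are invented for illustration.

def hesitancy_ratio(segments):
    """Fraction of total call time spent in pauses."""
    speech = sum(s for s, _ in segments)
    pauses = sum(p for _, p in segments)
    return pauses / (speech + pauses)

call = [(4.0, 0.5), (2.0, 2.5), (3.0, 1.0)]
ratio = hesitancy_ratio(call)
print(round(ratio, 2))                          # 4.0 / 13.0 -> 0.31
print("hesitant" if ratio > 0.25 else "fluent")
```

Pitch and tone analysis would require actual signal processing; this only shows how a single conversational signal might be reduced to a number a call-centre system could act on.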

Taste

IBM says its researchers are developing a computing system that actually experiences flavour, to be used by chefs to create tasty and novel recipes.

"It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavours and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavour combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar and dry-cured ham. A system like this can also be used to help us eat healthier, creating novel flavour combinations that will make us crave a vegetable casserole instead of potato chips."

According to IBM, the computer will be able to use algorithms to determine the precise chemical structure of food and why people like certain tastes. These algorithms will examine how chemicals interact with each other, the molecular complexity of flavour compounds and their bonding structure, and use that information, together with models of perception, to predict the taste appeal of flavours.
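One simple way such pairing predictions are sometimes framed is by the flavour compounds two ingredients share: the article does not specify IBM's actual algorithm, so the sketch below, with invented compound sets, is only an illustration of the shared-compound idea using Jaccard similarity.

```python
# Toy pairing score: ingredients are represented by invented sets of
# flavour compounds; pairs sharing more compounds relative to their
# combined total (Jaccard similarity) score higher.

COMPOUNDS = {
    "roasted_chestnut": {"furaneol", "pyrazine_a", "vanillin"},
    "cooked_beetroot":  {"geosmin", "furaneol", "pyrazine_a"},
    "potato_chips":     {"pyrazine_b", "methional"},
}

def pairing_score(a, b):
    """Jaccard similarity of two ingredients' compound sets (0.0 to 1.0)."""
    ca, cb = COMPOUNDS[a], COMPOUNDS[b]
    return len(ca & cb) / len(ca | cb)

print(pairing_score("roasted_chestnut", "cooked_beetroot"))  # 2 shared of 4 -> 0.5
print(pairing_score("roasted_chestnut", "potato_chips"))     # 0 shared -> 0.0
```

IBM's description adds models of human perception on top of the chemistry; a score like this would be only one input to such a system.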

Smell

During the next five years, says IBM, tiny sensors embedded in computers or cellphones will detect if one is coming down with a cold or other illness.

By analysing odours, biomarkers and thousands of molecules in someone's breath, and by detecting which odours are normal and which are not, doctors will be helped to diagnose and monitor the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy, it adds.
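The normal-versus-abnormal screening step can be sketched as a range check over measured molecule levels. The molecule names, units and ranges below are invented for illustration and are not clinical values.

```python
# Toy breath screen: flag molecules whose measured level falls outside
# a hypothetical normal range. All values are invented, not clinical data.

NORMAL_RANGES = {          # parts per billion, illustrative only
    "acetone":  (300, 900),
    "ammonia":  (200, 750),
    "isoprene": (50, 200),
}

def abnormal_markers(sample):
    """Return the molecules (and levels) outside their normal range."""
    flags = {}
    for molecule, level in sample.items():
        lo, hi = NORMAL_RANGES[molecule]
        if not (lo <= level <= hi):
            flags[molecule] = level
    return flags

reading = {"acetone": 1500, "ammonia": 400, "isoprene": 120}
print(abnormal_markers(reading))  # -> {'acetone': 1500}
```

A real diagnostic aid would weigh thousands of molecules jointly rather than checking each against a fixed range, but the flag-the-outliers step is the intuition the article describes.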
