
Hello Watson

Cognitive computing is changing the world, one Watson at a time.

By Jessie Rudd, Technical business analyst at PBT Group
Johannesburg, 04 Apr 2016

A few years back, big data landed in the world of analytics with a rather unflattering and unstructured thump, very nicely hash-tagged with phrases like 'the next big thing', 'powerful', 'unprecedented insight', etc. The sheer volume, velocity and variety had data analysts frothing at the mouth.

Fast-forward a few years and it has become increasingly apparent that the volume, variety and velocity are increasing exponentially, with absolutely no signs of slowing down or tapering off. If anything, it is going to get worse. The more IoT-connected 'things' are invented or added, the larger, faster, more disparate and more 'uncontextualised' big data is going to get. This is a very large and fast-moving problem.

Already, there is far too much information, with not enough talent to manage it or time to sift through it. The longer it sits untouched and unused, the more context is lost and the more of it succumbs to data decay.

For one moment, consider the following:

According to IBM Watson [1], unstructured data accounts for 80% of all data generated today. The majority of it is noisy, 'uncontextualised' and stored in formats that traditional systems cannot read, and this dirty data is expected to grow to over 93% of the total by 2020 [1].

Fuel - oil platforms can have more than 80 000 sensors in place [1]. A single platform can produce more than 15 petabytes of data in its lifetime [1]. Tools like Watson could help companies avoid drilling in the wrong place and assist with flow optimisation (the volume and rate at which oil is pumped).

Healthcare - in your lifetime, you will generate 1 million GB of health-related data [1]. That is the equivalent of 300 million books. Imagine what a computer that can collate and predict quickly and accurately could do with that much information.

Transportation - by 2020, 75% of the world's cars will be connected [1]. They will collectively produce approximately 350MB of data per second to be assessed and acted on. Self-driving and self-learning cars will soon be the norm, and by their very nature they will need to be able to learn and apply reasoning, because governments are not going to re-grid their entire road infrastructure around them. (The sketch below sanity-checks the healthcare and transport figures.)
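
Taken at face value, those numbers are easy to sanity-check. The following back-of-the-envelope sketch is purely illustrative arithmetic over the figures quoted above; nothing in it comes from IBM beyond the inputs themselves.

```python
# Back-of-the-envelope checks of the healthcare and transport figures.
health_bytes = 1e6 * 1e9   # 1 million GB of lifetime health data [1]
books = 300e6              # "the equivalent of 300 million books"
print(f"{health_bytes / books / 1e6:.1f} MB per book")  # ~3.3 MB each

car_stream = 350e6         # ~350MB per second from connected cars [1]
seconds_per_day = 86_400
print(f"{car_stream * seconds_per_day / 1e12:.0f} TB per day")  # ~30 TB
```

At roughly 3.3MB per book, the comparison is plausible for a typical digitised book, and a 30TB daily stream from cars alone gives a feel for why traditional pipelines struggle.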

Added to these scaling volumes is a huge shortfall of talented analysts and data scientists. Those who are around simply can't keep up with the ever-growing volumes of data. This shortfall presents a massive problem for business, because even the most advanced data platforms are useless without experienced professionals to operate and manage them.

Answers, please

So, then, what is the solution? More training and better academic programmes? Possibly, but the exponential nature of big data means users are always going to be playing catch-up. So another solution needs to be found: a scalable, fast solution that can extract insight at close to the speed at which big data is collated and collected; a solution that preserves as much of the original context as possible. Say hello to Watson [2] and Coseer [3].

The future of big data is finding, scripting and training computers to do the work for people. Computers that think the way humans think, that use context to flesh out meaning, and that can reason outside of rigid decision-tree logic.

What will result is limitless possibility.

Computers that can cognitively make decisions and learn from each and every interaction, at speeds humans can only dream of.

What, exactly, is cognitive computing?

Simplified, cognitive computing is the creation of self-learning systems that use data mining, pattern recognition and natural language processing (NLP) to mirror the way a human brain works - how it derives meaning, contextualises and applies logic. The purpose of cognitive computing is to create systems that can solve complicated problems without constant human oversight, and in the process far surpass the speed at which humans can do so.
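
To make that concrete, the sketch below wires together two of the ingredients named above - NLP (tokenisation and TF-IDF term weighting) and pattern recognition (a naive Bayes classifier) - into a tiny system that learns from labelled examples. It uses scikit-learn purely as an illustration; Watson's and Coseer's internals are proprietary, so this is a toy stand-in, not their actual method.

```python
# A toy "learn from examples, then classify new text" loop.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# A tiny labelled corpus standing in for training data.
texts = [
    "pump pressure dropping on platform sensor",
    "flow rate stable, output nominal",
    "patient blood pressure elevated, irregular heartbeat",
    "routine checkup, all vitals normal",
]
labels = ["oil", "oil", "health", "health"]

# NLP (TF-IDF weighting) feeding pattern recognition (naive Bayes).
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(texts, labels)

# The fitted model now routes unseen, unstructured text on its own.
print(model.predict(["sensor reports abnormal pump flow"]))  # ['oil']
print(model.predict(["patient vitals irregular"]))           # ['health']
```

The point is not the classifier but the shape of the workflow: the system improves by example rather than by hand-written rules, which is what lets it keep pace with volumes no team of analysts could read.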

This is perhaps as close to true agile computing as it will ever get.

Cognitive computing is all about changing the world and entire industries: seeing things that were lost in the volume, and finding insight that people have not been able to grasp before.

Today, 2.5 quintillion bytes of data are created every day [1] - that is 2 500 000 000 000 000 000 bytes, or 2.5 exabytes. Every person on this planet will add around 1.7MB of data to that statistic, every second of today.
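
For a sense of scale, here is the same arithmetic in code - again purely illustrative, using only the figures quoted above:

```python
# Converting the quoted figures into more familiar units.
daily_bytes = 2.5e18               # 2.5 quintillion bytes per day [1]
print(f"{daily_bytes / 1e18:.1f} exabytes created per day")

per_person_daily = 1.7e6 * 86_400  # 1.7MB per second, over a full day
print(f"{per_person_daily / 1e9:.0f} GB per person per day")  # ~147 GB
```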

Human intelligence simply cannot scale the way data is scaling, and cognitive computing enables people to deal with these massive amounts of data. Don't get me wrong - cognitive computing can never replicate what the human brain does. It is simply a system that can handle massive amounts of unstructured data quickly and accurately.

The insight that could be provided is immeasurable.

[1] https://developer.ibm.com/bluemix/2015/11/23/future-of-cognitive-computing
[2] http://www.ibm.com/za-en/
[3] https://coseer.com/#top
