
Watson's sensory boundaries stretched

By Admire Moyo, ITWeb's news editor.
Las Vegas, 24 Feb 2016
Watson's new APIs are pushing the sensory boundaries of how humans and machines interact, says IBM.

IBM has added new and expanded cognitive application programming interfaces (APIs) for developers that enhance Watson's emotional and visual senses.

This was revealed at the IBM InterConnect 2016 conference taking place in Las Vegas this week. Watson is IBM's artificially intelligent computer system capable of answering questions posed in natural language.

IBM notes the new additions to the supercomputing system will extend the capabilities of Watson's set of cognitive technologies and tools.

Three APIs - Tone Analyser, Emotion Analysis and Visual Recognition - are now available in beta. Additionally, Text to Speech (TTS) has been updated with new emotional capabilities and is being re-released as Expressive TTS for general availability.

According to IBM, these APIs are pushing the sensory boundaries of how humans and machines interact, and they are designed to improve how developers embed these technologies to create solutions that can think, perceive and empathise.

"We continue to advance the capabilities we offer developers on IBM's Watson platform to help this community create dynamic AI infused apps and services," said David Kenny, GM of IBM Watson. "We are also simplifying the platform, making it easier to build, teach and deploy the technology. Together, these efforts will enable Watson to be applied in many more ways to address societal challenges."

IBM is also adding tooling capabilities and enhancing its SDKs (Node, Java, Python, and newly introduced iOS Swift and Unity) across the Watson portfolio, and adding Application Starter Kits to make it easy and fast for developers to customise and build with Watson. All APIs are available through the IBM Watson Developer Cloud on Bluemix.
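To give a sense of the calling pattern, the sketch below reaches the Tone Analyser beta directly over REST using Python's requests library rather than one of the SDKs. The service URL, version date and credential placeholders are illustrative assumptions, not documented values; real credentials come from the service instance on Bluemix.

    import requests

    # Hypothetical credentials; in practice these come from the service
    # instance's "Service Credentials" entry on Bluemix.
    USERNAME = "your-service-username"
    PASSWORD = "your-service-password"
    BASE_URL = "https://gateway.watsonplatform.net/tone-analyzer-beta/api"  # assumed endpoint

    def analyse_tone(text):
        """Send text to the Tone Analyser beta and return its JSON analysis."""
        response = requests.get(
            BASE_URL + "/v3/tone",
            auth=(USERNAME, PASSWORD),        # Watson services use HTTP Basic auth
            params={"version": "2016-02-11",  # assumed beta version date
                    "text": text},
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        print(analyse_tone("I am delighted with how this launch went."))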

In a statement, IBM says Tone Analyser has deepened its analysis capabilities in this beta release to give users better insights about their own tone in a piece of text. Adding to its previous experimental understanding of nine traits across three tones - emotion (negative, cheerful, angry), social propensities (open, agreeable, conscientious) and writing style (analytical, confident, tentative) - Tone Analyser now analyses new emotions, including joy, disgust, fear, and sadness, as well as new social propensities, including extraversion and emotional range.

Also new to the beta version, Tone Analyser is moving from analysing single words to analysing entire sentences. This analysis is helpful in situations that require nuanced understanding, says IBM. For example, in speech writing, it can indicate how different remarks might come across to the audience, from exhibiting confidence and agreeableness to showing fear, it adds.

In customer service, it can help analyse a variety of social, emotional and writing tones that influence the effectiveness of an exchange.
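Continuing the sketch above, a caller could scan those sentence-level results for the strongest tone in each sentence. The response shape assumed here (a sentences_tone list containing tone_categories) is modelled on the beta documentation of the period and should be treated as an assumption:

    def strongest_tone_per_sentence(analysis):
        """Print the highest-scoring tone for each sentence in a Tone
        Analyser response; the sentences_tone/tone_categories shape is
        an assumption based on the v3 beta of the period."""
        for sentence in analysis.get("sentences_tone", []):
            scores = [(tone["score"], tone["tone_name"])
                      for category in sentence["tone_categories"]
                      for tone in category["tones"]]
            if not scores:
                continue
            score, name = max(scores)
            print("%s -> %s (%.2f)" % (sentence["text"], name, score))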

IBM has added Emotion Analysis as a new beta function within the AlchemyLanguage suite of APIs. The company explains that Emotion Analysis uses sophisticated natural language processing techniques to analyse external content and help users better understand the emotions of others.
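Because Emotion Analysis ships inside AlchemyLanguage, it would be called through the AlchemyAPI-style REST interface rather than Bluemix Basic auth. A minimal sketch, assuming the TextGetEmotion endpoint name, the apikey parameter and a docEmotions response field, none of which are confirmed by the article:

    import requests

    ALCHEMY_API_KEY = "your-alchemy-api-key"  # hypothetical credential

    def analyse_emotions(text):
        """Return per-emotion scores (anger, disgust, fear, joy, sadness)
        for a piece of text via AlchemyLanguage's Emotion Analysis beta.
        Endpoint and response field names are assumptions."""
        response = requests.post(
            "https://gateway-a.watsonplatform.net/calls/text/TextGetEmotion",
            data={"apikey": ALCHEMY_API_KEY,
                  "text": text,
                  "outputMode": "json"},
        )
        response.raise_for_status()
        return response.json().get("docEmotions", {})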

IBM notes Visual Recognition is available now in beta and can be trained to recognise and classify images based on training material.

While other visual search engines can tag images with a fixed set of classifiers or generic terms, Visual Recognition allows developers to train Watson around custom classifiers for images - the same way users can teach Watson natural language classification - and build apps that visually identify unique concepts and ideas.

This means Visual Recognition is now customisable, with results tailored to each user's specific needs, says IBM. For example, a retailer might create a tag specific to a style of pants in its new spring line, so it can identify when an image of someone wearing those pants appears on social media.
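A hedged sketch of what training such a custom classifier could look like over REST: zipped sets of positive and negative example images are posted to a classifiers endpoint. The URL, version date and multipart field names here are assumptions modelled on the beta documentation of the period, not values given in the article.

    import requests

    USERNAME = "your-service-username"   # hypothetical Bluemix credentials
    PASSWORD = "your-service-password"
    VR_URL = "https://gateway.watsonplatform.net/visual-recognition-beta/api"  # assumed

    def train_classifier(name, positive_zip, negative_zip):
        """Create a custom Visual Recognition classifier from zipped
        positive and negative example images (field names assumed)."""
        with open(positive_zip, "rb") as pos, open(negative_zip, "rb") as neg:
            response = requests.post(
                VR_URL + "/v2/classifiers",
                auth=(USERNAME, PASSWORD),
                params={"version": "2016-02-24"},  # assumed version date
                data={"name": name},
                files={"positive_examples": pos,
                       "negative_examples": neg},
            )
        response.raise_for_status()
        return response.json()

    # For the retail example above:
    # train_classifier("spring-pants", "spring_pants.zip", "other_clothing.zip")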
