
IBM's 3D-printed self-driving bus hits the road

By Admire Moyo, ITWeb's news editor.
Johannesburg, 21 Jun 2016
Passengers will be able to interact conversationally with Olli while travelling from point A to point B.

Computing giant IBM and Local Motors, a vehicle technology integrator and creator of the world's first 3D-printed cars, have partnered to unveil the first self-driving vehicle to integrate the advanced cognitive computing capabilities of IBM Watson.

The electric vehicle, dubbed 'Olli', can carry up to 12 people and is equipped with some of the world's most advanced vehicle technology, including IBM Watson Internet of Things (IoT) for Automotive, to improve the passenger experience and allow natural interaction with the vehicle.

Olli will be used on public roads in Washington DC, and later in 2016 in Miami-Dade County and Las Vegas.

Watson is an artificially intelligent computer system capable of answering questions posed in natural language. The supercomputer shot to fame after it beat expert "Jeopardy" quiz show contestants in 2011, and is named after legendary IBM president Thomas Watson.

The supercomputer has so far been applied across a range of industries, including healthcare, banking, contact centres and education.

"Olli offers a smart, safe and sustainable transportation solution that is long overdue," says John B Rogers, Local Motors CEO and co-founder. "Olli with Watson acts as our entry into the world of self-driving vehicles, something we've been quietly working on with our co-creative community for the past year.

"We are now ready to accelerate the adoption of this technology and apply it to nearly every vehicle in our current portfolio and those in the very near future. I'm thrilled to see what our open community will do with the latest in advanced vehicle technology."

Cloud-based cognitive computing

IBM says Olli is the first vehicle to use the cloud-based cognitive computing capability of IBM Watson IoT to analyse and learn from the high volumes of transportation data produced by more than 30 sensors embedded throughout the vehicle. Using Local Motors' open vehicle development process, sensors will be added and adjusted continuously as passenger needs and local preferences are identified.
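Neither IBM nor Local Motors has published Olli's telemetry code, but a vehicle-side sensor feed to a cloud IoT platform of this kind typically amounts to a small publisher streaming readings over MQTT, the protocol Watson IoT and similar platforms accept. The sketch below is illustrative only; the broker hostname, topic layout, credentials and sensor fields are assumptions, not details released by either company.

```python
# Illustrative sketch: stream mock vehicle sensor readings to an MQTT broker,
# the transport commonly used by cloud IoT platforms such as IBM Watson IoT.
# Hostname, topic and credentials are placeholders (paho-mqtt 1.x API).
import json
import random
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "example-org.messaging.example-iot-cloud.com"  # assumed endpoint
BROKER_PORT = 1883
EVENT_TOPIC = "vehicles/olli-001/events/telemetry"           # assumed topic layout


def read_sensors():
    """Stand-in for the 30-plus on-board sensors; returns one telemetry sample."""
    return {
        "timestamp": time.time(),
        "speed_kmh": round(random.uniform(0, 40), 1),
        "battery_pct": round(random.uniform(20, 100), 1),
        "occupancy": random.randint(0, 12),  # Olli carries up to 12 people
    }


def main():
    client = mqtt.Client(client_id="olli-001")
    # client.username_pw_set("use-token-auth", "DEVICE_TOKEN")  # real deployments authenticate
    client.connect(BROKER_HOST, BROKER_PORT)
    client.loop_start()
    try:
        while True:
            sample = read_sensors()
            client.publish(EVENT_TOPIC, json.dumps(sample), qos=1)
            time.sleep(1)  # one sample per second
    finally:
        client.loop_stop()
        client.disconnect()


if __name__ == "__main__":
    main()
```

On the cloud side, a platform of this kind aggregates such event streams so that analytics and machine learning models can be trained on them, which is what allows sensors and behaviour to be adjusted as usage patterns emerge.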

Furthermore, the platform leverages four Watson developer APIs - Speech to Text, Natural Language Classifier, Entity Extraction and Text to Speech - to enable seamless interactions between the vehicle and passengers.
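The Olli integration itself is not public, but the way those four services fit together can be sketched as a simple voice loop: audio in, intent classification and entity extraction in the middle, synthesised speech out. The four helper functions below are hypothetical stand-ins that return canned values; a real integration would replace each body with a call to the corresponding Watson service, with credentials and trained models not shown here.

```python
# Hypothetical sketch of the voice-interaction loop implied by the four APIs:
# Speech to Text -> Natural Language Classifier / Entity Extraction -> Text to Speech.
# The helpers are stand-ins returning canned values, not real Watson calls.

def transcribe(audio_bytes: bytes) -> str:
    """Stand-in for Speech to Text: audio -> transcript."""
    return "can you take me downtown"


def classify_intent(text: str) -> str:
    """Stand-in for Natural Language Classifier: transcript -> intent label."""
    return "destination_request" if "take me" in text else "smalltalk"


def extract_entities(text: str) -> list[str]:
    """Stand-in for Entity Extraction: transcript -> places of interest."""
    return [w for w in ("downtown", "airport", "museum") if w in text]


def synthesise(text: str) -> bytes:
    """Stand-in for Text to Speech: reply text -> audio."""
    return text.encode("utf-8")  # a real call would return synthesised audio


def handle_utterance(audio_bytes: bytes) -> bytes:
    """One conversational turn: hear, understand, answer."""
    transcript = transcribe(audio_bytes)
    intent = classify_intent(transcript)
    entities = extract_entities(transcript)

    if intent == "destination_request" and entities:
        reply = f"Okay, heading towards {entities[0]}."
    elif intent == "destination_request":
        reply = "Where would you like to go?"
    else:
        reply = "I'm not sure I understood. Could you say that again?"
    return synthesise(reply)


if __name__ == "__main__":
    print(handle_utterance(b"<microphone audio>"))
```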

At its InterConnect 2016 event in February this year, IBM added new and expanded cognitive application programming interfaces (APIs) for developers that enhance Watson's emotional and visual senses.

Passengers will be able to interact conversationally with Olli while travelling from point A to point B, discussing topics such as how the vehicle works, where they are going, and why Olli is making specific driving decisions.

Watson enables Olli to understand and respond to passengers' questions as they enter the vehicle, including questions about specific vehicle functions or even "are we there yet?", IBM says.

Passengers can also ask for recommendations on local destinations such as popular restaurants or historical sites based on analysis of personal preferences. These interactions with Olli are designed to create more pleasant, comfortable, intuitive and interactive experiences for riders as they journey in autonomous vehicles.
