Seeing AI: Helping blind and low-vision people navigate the world with an app
Having lost his eyesight at the age of 7, Microsoft software engineer Saqib Shaikh understands this firsthand.
He faced daily challenges such as identifying products in a grocery store, reading restaurant menus, and knowing who is around him when walking down the street. He began using talking computers at a school for the blind.
Seeing the positive impact technology can have, Shaikh envisioned using computers to improve life for the low-vision community.
Shaikh and his colleagues have been exploring how artificial intelligence (AI) can empower the blind and low-vision community to experience their surroundings more vividly.
Using Microsoft Cognitive Services APIs, and AI technologies such as machine learning, the engineers have built "Seeing AI," an app that reveals the visual world to blind and low-vision people. The technology can read text out loud, recognise people and their emotions, and even describe everyday scenes, such as a skateboarder performing a trick.
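The scene descriptions mentioned above are the kind of result the Cognitive Services Computer Vision service returns from its image-description endpoint. As a minimal sketch only (the endpoint URL, API version, and key below are placeholders, not values from this article), a caller might build such a request like this:

```python
import json
import urllib.request

def build_describe_request(endpoint, key, image_url):
    """Construct (but do not send) a Computer Vision 'describe image'
    request. The path and header name follow the public Cognitive
    Services REST documentation; endpoint and key are placeholders."""
    url = f"{endpoint}/vision/v3.2/describe?maxCandidates=1"
    headers = {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    body = json.dumps({"url": image_url}).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers)

# Sending the request requires a real Azure resource and key:
# with urllib.request.urlopen(req) as resp:
#     caption = json.load(resp)["description"]["captions"][0]["text"]
#     # e.g. "a person riding a skateboard"
```

The response body contains ranked natural-language captions, which an app can then read aloud with a screen reader or text-to-speech.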
With Seeing AI, blind and low-vision people can use their iPhone or Pivothead SMART eyeglasses camera to better navigate their surroundings.
The free app can identify a product by its barcode and call out the name. It can read documents, including headings, paragraphs, and lists, allowing users to skim through to find the text they need.
It can also recognise people based on their faces, and provide a description of their appearance, including their gender and facial expression.
Microsoft's goal is to get Seeing AI technology into the hands of as many blind and low-vision people as possible.
Watch how Seeing AI is helping blind and low-vision people.