In a heartfelt demo that rounded off the Microsoft
Build conference keynote this year, software engineer Saqib Shaikh
outlined an ongoing research project that uses machine learning and
artificial intelligence to help blind and visually impaired people
better 'see' the world around them.
London-based Shaikh, who has been blind since the age of seven, said
that talking computer technology inspired him to develop the application
– titled SeeingAI – which is built on Microsoft Intelligence APIs to
translate real-world events into audio messages.
The application is intended to work on both smartphones and PivotHead
smartglasses. The video demonstration, below, shows Shaikh taking a
picture with his glasses, which then describe to him exactly what they
'see' – from business meetings to teenagers skateboarding on the streets
of London to a woman throwing a Frisbee in a park. Another scene
demonstrates how the smartphone app uses a device's camera to take a
picture of a menu and then translate the text into audio.
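The keynote did not detail the API surface, but as a rough sketch of the
plumbing involved, an image-description request to a Cognitive
Services-style vision endpoint might look like the following. The
endpoint URL, subscription key, file name and response shape here are
illustrative assumptions, not the actual SeeingAI implementation.

```python
# Minimal sketch: send a camera frame to a Cognitive Services-style
# vision endpoint and return the caption it generates. Endpoint, key
# and response shape are placeholders for illustration.
import requests

ENDPOINT = "https://example.cognitive.microsoft.com/vision/describe"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder

def describe_image(image_bytes: bytes) -> str:
    """Return a one-line natural-language caption for an image."""
    resp = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()
    # Assumed response shape: {"description": {"captions": [{"text": ...}]}}
    captions = resp.json()["description"]["captions"]
    return captions[0]["text"] if captions else "No description available."

if __name__ == "__main__":
    with open("menu.jpg", "rb") as f:  # hypothetical camera capture
        print(describe_image(f.read()))
```

In an app like the one demonstrated, the returned caption would be
passed to a text-to-speech engine rather than printed.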
"I love making things that improves people's lives and one of the things
I've always dreamt of since I was at university was this idea of
something that could tell you at any moment what's going on around you,"
Shaikh said in the presentation.
"Years ago this was science fiction. I never thought it would be
something that you could actually do but artificial intelligence is
improving at an ever-faster rate and I'm really excited to see where we
can take it. As engineers we are always standing on the shoulders of
giants and building on top of what went before. For me it's about taking
that far-off dream and building it one step at a time."
The keynote gave no indication of when, or even if, the project will be
released as a commercial app; however, if it comes to fruition, it will
sit comfortably beside the other futuristic technology announced during
Build – from the cutting-edge HoloLens to enhanced Bot integration.