Eyes in Your Pocket: “BlindTool” App Represents the New Frontier of Assistive Technology

Image: A screen capture of the BlindTool app identifying a banana, with less likely predictions listed below.

By Brian Klotz

Fast Company calls it “a peek at an inevitable future of accessibility”: BlindTool, a new app that lets users identify objects in real time using only their phone, was created right here in Boston. Developed by Joseph Paul Cohen, a Ph.D. candidate at the University of Massachusetts Boston and a Bay State native, the app aims to increase the independence of people who are blind or visually impaired by putting an extra set of eyes in their pocket.

“I’ve had a desire to do this for a while,” says Cohen, whose interest in assistive technology began during an internship at the U.S. Naval Research Laboratory in Washington, D.C., where working with a colleague who was blind inspired him to think about how modern technology could improve the lives of people with vision impairment.

The app runs on Android devices and identifies objects it is pointed at in real time, using a “convolutional neural network” that can recognize 1,000 “classes” of objects. While the technology behind it is complex, using it is simple: wave your phone around, and it will vibrate as the app focuses on an object it recognizes; the more it vibrates, the more confident the app is. Once it is fairly certain, it speaks the object’s name aloud.
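To make that description concrete, here is a minimal sketch of the classification step. The PyTorch/torchvision stack, the MobileNetV2 model, and the standard 1,000-class ImageNet label set are assumptions chosen for illustration; this shows the general technique, not BlindTool’s actual code.

```python
# Minimal sketch of classifying a camera frame with a pretrained
# 1,000-class CNN. PyTorch/torchvision and MobileNetV2 are assumptions
# for illustration; this is not BlindTool's actual implementation.
import torch
from torchvision import models
from PIL import Image

weights = models.MobileNet_V2_Weights.DEFAULT   # trained on ImageNet (1,000 classes)
model = models.mobilenet_v2(weights=weights).eval()
preprocess = weights.transforms()               # resize, crop, normalize

def classify(frame: Image.Image) -> tuple[str, float]:
    """Return the most likely class label and the model's confidence."""
    batch = preprocess(frame).unsqueeze(0)      # one-image batch
    with torch.no_grad():
        probs = torch.softmax(model(batch)[0], dim=0)
    conf, idx = probs.max(dim=0)
    return weights.meta["categories"][idx.item()], conf.item()
```

Because the softmax layer always produces a full probability distribution over all 1,000 classes, a network like this “always has a prediction,” which is why a separate confidence signal matters.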

“It always has a prediction,” Cohen explains, no matter where the phone is pointed, so the vibration lets the user zero in on the objects the app identifies with the most confidence.
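A feedback loop of that shape might look like the following sketch. The 0.30 cutoff and the vibrate()/speak() helpers are hypothetical stand-ins, not BlindTool’s real API.

```python
# Hypothetical sketch of the vibrate-then-speak feedback described above.
# The threshold and both helper functions are illustrative stand-ins.

def vibrate(strength: float) -> None:
    # Stand-in for a platform vibration API; prints instead of buzzing.
    print(f"buzz at strength {strength:.2f}")

def speak(text: str) -> None:
    # Stand-in for a text-to-speech call.
    print(f"speaking: {text}")

SPEAK_THRESHOLD = 0.30  # assumed cutoff; the article only says "fairly certain"

def give_feedback(label: str, confidence: float) -> None:
    vibrate(strength=confidence)       # stronger buzz = higher confidence
    if confidence >= SPEAK_THRESHOLD:
        speak(label)                   # announce only confident predictions
```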
