Wednesdays With Dean: A Volunteer’s Story

[Photo: Dean posing for a photo, wearing a suit and sunglasses and holding a cane]

Post by Stephanie Ross – Public Relations Assistant Account Executive for Eric Mower & Associates, VP Communications for the Boston Alumnae Chapter of Delta Gamma, freelance writer

[Stephanie is a one-on-one volunteer through the Massachusetts Association for the Blind and Visually Impaired’s Volunteer Program, which matches volunteers with individuals in their community who are blind or visually impaired.]

“Hello!” the deep voice echoes as I climb up the four stories to his Brookline apartment.

“Helloooo,” I say, mocking the voice.

Suddenly, friendly laughter warms the cold stairwell.

This is how it starts every week.

While it’s only been a few months, my bond with Dean is profound. Two years ago, I left everything familiar in Texas and moved to Boston. I came with zero regrets, but something was missing. As a member of Delta Gamma at the University of North Texas, I was immersed in endless volunteer opportunities for the Service for Sight program. But being away from all of that – my sorority sisters, my family, etc. – I felt empty. I was stuck in the loop of working 9 to 5 and going home just to do it all over again.

Until I met Dean.

Eyes in Your Pocket: “BlindTool” App Represents the New Frontier of Assistive Technology

[Image: A screen capture of the BlindTool app identifying a banana, with less likely predictions listed below]

By Brian Klotz

FastCompany calls BlindTool “a peek at an inevitable future of accessibility.” The new app allows users to identify objects in real time using only their phone, and it was created right here in Boston. Developed by Joseph Paul Cohen, a current Ph.D. candidate at the University of Massachusetts Boston and a Bay State native, it aims to increase the independence of individuals who are blind or visually impaired by putting an extra set of eyes in their pocket.

“I’ve had a desire to do this for a while,” says Cohen, whose initial interest in assistive technology came from working with a colleague who was blind during an internship at the U.S. Naval Research Laboratory in Washington D.C., inspiring him to think of the ways modern technology could improve the lives of those with vision impairment.

The app runs on Android devices and identifies objects it is pointed at in real time using a “convolutional neural network” that can recognize 1,000 “classes” of objects. While the technology behind it may be complex, its usage is simple: wave your phone around, and the app will cause it to vibrate as it focuses on an object it recognizes – the more it vibrates, the more confident it is. Once it’s fairly certain, it will speak the name of the object aloud.

“It always has a prediction,” Cohen explains, regardless of where it is pointed, so the vibration function allows the user to zero in on objects the app has more confidence in identifying.
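
For readers curious about what that loop looks like in practice, here is a minimal sketch, in Python, of the vibrate-then-speak behavior Cohen describes. This is not BlindTool’s actual code: classify_frame, vibrate, and speak are hypothetical stand-ins for the neural network, haptic feedback, and text-to-speech, and the 0.30 “fairly certain” threshold is an assumed value for illustration only.

    import random

    SPEAK_THRESHOLD = 0.30  # assumed cutoff for "fairly certain"; not BlindTool's real value

    def classify_frame(frame):
        """Stand-in for the convolutional neural network: returns the most likely
        of roughly 1,000 object classes plus a confidence score between 0 and 1."""
        return "banana", random.random()

    def vibrate(strength):
        """Stand-in for the phone's haptic feedback."""
        print("vibrating at {:.0%} strength".format(strength))

    def speak(label):
        """Stand-in for text-to-speech output."""
        print("speaking: " + label)

    def process_frame(frame):
        # The network always has a prediction, no matter where the camera points.
        label, confidence = classify_frame(frame)
        # Vibration scales with confidence, so the user can feel when the camera
        # is centered on something the network recognizes.
        vibrate(strength=confidence)
        # Only announce the label aloud once the prediction is fairly certain.
        if confidence >= SPEAK_THRESHOLD:
            speak(label)

    process_frame(frame=None)  # one simulated camera frame

The real app runs this kind of loop continuously against live camera frames, but the idea is the same: a constant stream of guesses, filtered by confidence before anything is spoken aloud.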