Variant Vision

Smart Glasses for the Blind

5 years, 7 months ago | SKILLS: CAD / electronics / programming



How can we design a wearable device to empower independence for the visually impaired? 

Impact

Disability | Smart Wearables

Design Skills

Electronics | Programming | CAD | Outreach | User Research

Team Size

Two – I partnered with a wonderful engineer named Oyinda Alliyu

According to dozens of community groups, special care centers, and disability rights organizations, one of the most pervasive challenges for the blind and visually impaired is achieving independence. More than 70% of adults with significant vision loss are not employed full-time, largely due to mobility barriers. Only 2-8% are able to use a white cane; the rest rely on a guide dog, a sighted aide, or stay at home. Existing assistive technology that deviates from the traditional cane is often difficult to use, overly expensive, or amplifies disability by drawing too much attention to the user. For the blind, independent navigation remains inaccessible. Variant Vision was a summer-long effort to work directly with blind disability activists to develop low-cost, wearable navigation technology built on open-source hardware. During this project I met Phillip, a former firefighter whose lifelong degenerative retinal disease pivoted his career toward blind and disability advocacy. Phillip learned to navigate with a white cane without expensive training and, alongside his visually impaired partner Honorata, provided key insights and firsthand experiences that informed the iterative development of the Variant Vision Smart Glasses.
On our two-person team, I led the research into existing low-cost, unobtrusive electronic hardware for object detection. Ultrasonic rangefinding, adaptable and programmable with Arduino, proved to be a reliable form of hazard detection. Rapid prototyping with clay modeling materials informed the form and function of haptic and aural feedback delivery. I produced multiple working electronic prototypes, and the desirability of features such as volume control and detection sensitivity was evaluated during live user testing. In later prototypes I reduced the electronic footprint by incorporating smaller microcontrollers and batteries. Our final presented prototype encased all electronics in a 3D-printed glasses frame with embedded buzzer motors. Every prototype design change was informed by feedback from Phillip and Honorata, who tested the smart glasses in their neighborhoods. Here, Honorata confirms the existence of a street sign previously detected by the smart glasses.
When the state of the art isn't good enough.

During our user research, we met Phillip, a former Connecticut volunteer firefighter who was diagnosed with a degenerative retinal disease at age 13. In 2009 he became fully blind and dedicated his new career to advocating for blind and disability rights. Phillip, who stubbornly taught himself to use a white cane, gave us incredible insight into the problems he faced while navigating his home city of Stamford. His white cane helped him navigate generally by detecting obstacles in his path, but it provided no help with obstacles above his waist. Phillip had broken his nose six times by walking into open cabinets or overhanging signs he could not see. His solution? Wear a hat. With the longest-brimmed baseball cap he could find, Phillip reduced the severity of his crashes, since the brim would usually hit an obstacle before his face did.

To quickly evaluate the viability of low-cost electronic hardware, I tested the hazard-detection responses of various ultrasonic, infrared, and time-of-flight sensors. Ultrasonic sensing was selected for its ease of implementation and wide detection cone, which mimics the sweep of a traditional white cane. While my team partner, Oyinda Alliyu, developed CAD models of early prototype frames, I experimented with different ways of delivering feedback to the user. With visual feedback not an option, I focused on integrating audio and haptic feedback into the electronic circuit. In both cases, vibration intensity and feedback volume depended on the proximity of the closest hazard.
A lifelong user of lenses designed for sensitive eyes, Honorata assessed the form and comfort of our early prototypes, helping us make design changes that carried into the final 3D-printed prototype frame and into product concept modeling. Phillip evaluated the detection capabilities of the glasses against his white cane. A key issue he raised early on was the cane's inability to detect obstacles above the waist, such as open windows and wide street signs. His suggestions also encouraged us to move away from audio feedback, which could be muffled in an outdoor environment; instead, we focused on delivering haptic buzzing at the temple, above the ear.
A solution that works in the real world.

The final presented prototype of the Variant Vision project was modeled to mimic the size of conventional glasses frames, shown here alongside my own. I combined my CAD model of the eyeglass temples, which included a cavity to house the electronics, with my partner Oyinda's model of the front frames. Created in SolidWorks, the smart glasses CAD model included dimensioned holes to integrate with the outward-facing electronic hardware: the ultrasonic rangefinder, the battery charger, and the on/off button.
Phillip and Honorata stressed that UV protection, comfort, and shape are often deciding factors for anyone with a degenerative eye disease choosing a pair of sunglasses. After prototype evaluation, I developed digital concept renderings of the detection sensors integrated into conventional and athletic UV-protecting eyewear.