To help visually impaired people navigate a world designed predominantly around a sighted population, I joined an interdisciplinary team of computer scientists, engineers, and designers at Disney Research, Pittsburgh, and initiated the project “Tactile Display for the Visually Impaired” using TeslaTouch, a technology that adds tactile sensation to touch screens based on the electrovibration principle.
Through interviews and user studies with early prototypes, conducted at the Library for the Blind and Physically Handicapped in Pittsburgh, Pennsylvania, we identified key applications that would benefit visually impaired users. Quantitative and qualitative studies narrowed these down to three key problems to solve with TeslaTouch:
/Navigate touch interfaces on phones, tablets, and ATMs.
/Read maps and independently find the way to a new location.
/Identify photos and share experiences with sighted people.
Users feel tactile feedback and textures on the 3M touchpad while viewing the visuals on the display. The application is built in C++ with the openFrameworks libraries and Pure Data.
Besides iterating on the software, I am also redesigning the physical device based on observations of blind users interacting with everyday objects and unfamiliar handheld devices.
This assistive technology was demonstrated at the ACM SIGCHI Conference in Vancouver, May 7-12, 2011.