Prostheses and Making
As making meets assistive technology, we were interested in how people with upper limb loss incorporate that experience, and prostheses, into their identities and negotiations of social norms. One example of the two communities meeting is e-NABLE, which coalesces volunteers who partner with a user to adapt open-source hand designs from Thingiverse to the user's needs and 3D-print the hands at no cost. We interviewed 14 people with upper limb loss: some used prostheses prescribed by a prosthetist, some used e-NABLE hands, and others did not use prostheses. We learned that, as with other objects, participants incorporate prostheses, objects appropriated as assistive technologies, and limb loss itself into their identities. Participants also negotiated normalcy, at times striving to appear as someone with two arms and at other times explicitly drawing attention to limb loss or prostheses. From our participants, we see the potential for communities like e-NABLE to make prostheses low cost and customizable. Yet with that comes a need to step back and consider potential pitfalls: none of the participants who had tried an e-NABLE hand still used it, and design and assembly were not accessible to our participants. The HCI community should consider what agency users have as making and assistive technology further entwine, and should prioritize thought and action that account for the intimate, personal relationships people develop with assistive objects. Watch the CHI video preview and download our 2016 paper.
Empowering Blind Students in Science and Engineering
I organized the Empowering Blind Students in Science and Engineering workshop, held in June 2014, which brought together successful blind and low vision STEM professionals and blind and low vision college students. The two-day workshop allowed students to learn advocacy skills and alternative techniques for doing STEM class work, and to receive mentoring. This work was also presented at the Engineering4Society Conference in June 2015.
StopInfo
During the first phase of this project, we learned that vision impaired people have difficulty finding bus stops. Participants told stories of not standing precisely at a bus stop sign and being passed by buses whose drivers were unaware they were waiting. We brainstormed with participants and identified several bus stop landmarks they would like to know about ahead of time. In our first paper, we report these interview findings.
We then developed StopInfo, an extension to OneBusAway, a popular transit app in the Seattle area that reports real-time bus arrival times. We added a page to each bus stop's arrival-times page that tells users what landmarks they can expect to find near the stop: for example, which side of the intersection the stop is on, the shape of the bus stop sign, and whether the stop has a shelter. We deployed StopInfo to the public, and anyone using OneBusAway could verify and add to our original data set, which came from our project partner, King County Metro Transit. Six vision impaired participants used StopInfo in their daily lives and reported on their transit trips. We learned that StopInfo was accessible and that the information was helpful; some participants tried bus trips to places they hadn't been before. Watch a video, read a press release, and download our 2013 ASSETS best paper and our 2014 ASSETS paper.
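The landmark page described above can be sketched as a small per-stop record rendered into a verbal description. This is an illustrative sketch only; the field names and the `StopLandmarks` type are assumptions, not the actual OneBusAway or StopInfo schema.

```python
# Hypothetical sketch of a per-stop landmark record; names are assumptions,
# not the real StopInfo/OneBusAway data model.
from dataclasses import dataclass

@dataclass
class StopLandmarks:
    stop_id: str
    intersection_side: str   # e.g., "near side" or "far side"
    sign_shape: str          # e.g., "rectangular pole sign"
    has_shelter: bool

    def describe(self) -> str:
        """Render a short description suitable for screen-reader output."""
        shelter = "has a shelter" if self.has_shelter else "has no shelter"
        return (f"Stop {self.stop_id}: on the {self.intersection_side} "
                f"of the intersection, {self.sign_shape}, {shelter}.")

stop = StopLandmarks("1_575", "near side", "rectangular pole sign", True)
print(stop.describe())
```

A flat record like this also makes it easy for riders to verify or correct individual fields, as StopInfo allowed the public to do.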
Tactile Graphics with a Voice
Vision impaired people find raised-line drawings helpful for succeeding in math and science. Typically, these drawings, called tactile graphics, are labeled with braille, but not all vision impaired people read braille, and often the braille necessary for a label is too large for the available space. Considering storing label information in QR codes adhered to tactile graphics, we first interviewed vision impaired people to learn how they use smartphone cameras and about their experiences with tactile graphics in general.
We used the formative data to develop Tactile Graphics with a Voice, an iOS app that provides multiple modes of feedback to help the user photograph a QR code. Users can hear verbal feedback, receive no feedback, or use a finger to identify the QR code they wish to photograph, in which case the app captures the QR code closest to the finger. In a longitudinal study, we asked vision impaired participants to use Tactile Graphics with a Voice to photograph QR codes on a variety of tactile graphics. Over time, participants photographed QR codes more quickly, and no mode was more accurate than the others. All participants appreciated having multiple modes of feedback available.
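The finger-guided mode above reduces to picking, among the QR codes detected in the camera frame, the one whose center is nearest the touch point. A minimal sketch of that selection step, assuming hypothetical `codes` and `touch` representations rather than the app's actual API:

```python
# Illustrative sketch of "capture the QR code closest to the finger";
# the detected-code dicts and touch tuple are assumptions, not the app's API.
import math

def closest_code(codes, touch):
    """Return the detected code whose center is nearest the touch point."""
    tx, ty = touch
    return min(codes, key=lambda c: math.hypot(c["center"][0] - tx,
                                               c["center"][1] - ty))

codes = [
    {"payload": "label-a", "center": (120, 340)},
    {"payload": "label-b", "center": (400, 90)},
]
print(closest_code(codes, (110, 330))["payload"])  # → label-a
```

Euclidean distance in image coordinates is enough here because only the relative ordering of candidates matters, not the absolute distance.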
Download our ASSETS 2014 best student paper.