How Teens Take, Edit, and Share Photos on Social Media
As a blind photography enthusiast, I was incredibly excited to explore how visually impaired teens interact with emergent social media like Instagram and Snapchat during an internship at Microsoft Research with Ed Cutrell and Merrie Morris. Most accessibility-related social media research focuses on older people who are totally blind, but visually impaired people have distinct accessibility needs: they want to use the vision they do have to engage with photos. Since teens are super users of newer social media like Instagram and Snapchat, we interviewed 16- to 19-year-olds with visual impairments to learn about their browsing and posting behaviors and challenges. We found that visually impaired teens engage with these social media like their sighted peers, editing photos according to trends. But they also edited photos to make them more visible, raising questions about expanding research on color visibility and vision impairment, since most of that work optimizes text and background colors rather than the complex color palettes found in photos.
Additionally, reading ephemeral content was difficult for our participants. They also grappled with the social implication that screenshotting snaps invades privacy, since screenshotting gave them the additional viewing time they needed to understand snaps.
Our research confirms that visually impaired people are engaging with social media as much as they can, but they still encounter frustrating barriers. We hope researchers and designers of social media prioritize increasing accessibility for this demographic, which wants to engage with these social media more. The linked paper provides more insights into participants' photography and social media use and offers design recommendations for increasing accessibility.
Using a Design Workshop to Explore Accessible Ideation
Some Human-Centered Design activities, like brainstorming, may not be accessible as they are popularly taught. For example, many people learn to brainstorm by hand-drawing ideas in a short amount of time, which might not be feasible for people with vision and motor impairments. We conducted a workshop with 40 engineering professionals, including 7 with disabilities. They first ideated in teams on a design challenge. Second, they reflected on access barriers. Finally, they ideated and rapidly prototyped potential solutions to alleviate those access barriers.
In summary, offering a multitude of ideation materials (for example, tactile and 3D materials or computer-based brainstorming) can better include people with vision and motor impairments; taking turns sharing ideas can help ensure that people with hearing impairments, those who are less outgoing, and remote participants have a say; and organizing ideas can help people with visual and cognitive impairments better follow along. One team found that dividing tasks according to strengths helped involve everyone, recommending that people take learning style or strengths assessments before diving into design challenges.
The linked poster abstract offers more details about this project and demonstrates that a workshop can be a fruitful method for quickly trying potential accessible solutions. In conclusion, if ideation is done more flexibly, it could allow all team members to brainstorm with methods that suit their strengths and abilities.
Prostheses and Making
As making meets assistive technology, we were interested in how people with upper limb loss incorporate that experience and prostheses into their identities and negotiations of social norms. One example of the two communities meeting is e-NABLE, which coalesces volunteers who partner with users to adapt open-source hand designs on Thingiverse to fit each user's needs and 3D-print the hands at no cost. We interviewed 14 people with upper limb loss: some used prostheses prescribed by a prosthetist, some used e-NABLE hands, and others did not use prostheses. We learned that, as with other meaningful objects, participants incorporate prostheses, objects appropriated as assistive technologies, and limb loss itself into their identities. Participants also negotiated normalcy, at times striving to appear as someone with two arms and at other times explicitly drawing attention to limb loss or prostheses. From our participants, we see the potential for communities like e-NABLE to make prostheses low cost and customizable. Yet with that comes a need to step back and consider potential pitfalls: none of the participants who had tried an e-NABLE hand still used it, and design and assembly were not accessible to our participants. The HCI community should consider what agency users have as making and assistive technology further entwine, and prioritize thought and action that account for the intimate, personal relationships people develop with assistive objects.
This project was advised by Daniela Rosner and Kat Steele, and I collaborated with Keting Cen. Watch our CHI video preview and download our 2016 paper.
Helping Blind and Low Vision Riders Find Bus Stops
During the first phase of this project, we learned that vision impaired people have difficulty finding bus stops. Participants told stories of not standing precisely at a bus stop sign and being passed by buses whose drivers were unaware they were waiting. We brainstormed with participants and identified several bus stop landmarks they would like to know about ahead of time. In our first paper, we report these interview findings.
We then developed StopInfo, an extension to OneBusAway, a popular transit app in the Seattle area that reports real-time bus arrival times. To each stop's arrival-times page, we added a page that tells users what landmarks to expect near the stop. For example, we report which side of the intersection the stop is on, the shape of the bus sign, and whether the stop has a shelter. We deployed StopInfo to the public, and anyone using OneBusAway could verify and add to our original data set, which came from our project partner, King County Metro Transit. Six vision impaired participants used StopInfo in their daily lives and reported on their transit trips. We learned that StopInfo was accessible and its information helpful; some participants tried taking the bus to places they hadn't been before.
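To make the landmark information concrete, here is a minimal sketch of the kind of per-stop record StopInfo surfaces (the class, field names, and example values are illustrative assumptions, not StopInfo's actual schema):

```python
from dataclasses import dataclass

@dataclass
class StopLandmarks:
    """Hypothetical sketch of landmark data associated with one bus stop."""
    stop_id: str
    intersection_side: str   # e.g. "near side, northeast corner"
    sign_shape: str          # e.g. "rectangular sign on a metal pole"
    has_shelter: bool
    verified_count: int = 0  # times users have verified this entry

    def describe(self) -> str:
        # Compose a short description a screen reader could announce.
        shelter = "has a shelter" if self.has_shelter else "no shelter"
        return (f"Stop {self.stop_id}: {self.intersection_side}; "
                f"{self.sign_shape}; {shelter}.")

stop = StopLandmarks("1_431", "near side, northeast corner",
                     "rectangular sign on a metal pole", True)
print(stop.describe())
```

A record like this pairs naturally with the public verification flow described above: each user confirmation could simply increment `verified_count`.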
StopInfo received a 2014 Ford College Community Challenge award. It was led by Meg Campbell and advised by Alan Borning. Caitlin Bonnar collaborated.
Watch our video, read a press release, and download our 2013 ASSETS best paper and our 2014 ASSETS paper.
Tactile Graphics with a Voice
Vision impaired people find raised-line drawings helpful for succeeding in math and science. Typically, these drawings, called tactile graphics, are labeled with braille, but not all vision impaired people read braille, and often the braille needed for a label is too large for the available space. Envisioning labels stored in QR codes adhered to tactile graphics, we first interviewed vision impaired people to learn how they use smartphone cameras and about their experiences with tactile graphics in general.
We used the formative data to develop Tactile Graphics with a Voice, an iOS app that provides multiple modes of feedback to help the user photograph a QR code. Users can hear verbal feedback, receive no feedback, or use a finger to identify the QR code they wish to photograph, in which case the app captures the code closest to the finger. In a longitudinal study, we asked vision impaired participants to use Tactile Graphics with a Voice to photograph QR codes on a variety of tactile graphics. Over time, participants photographed QR codes more quickly, no mode was more accurate than the others, and all participants appreciated having multiple modes of feedback.
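The finger-pointing mode boils down to a nearest-neighbor choice: among the QR codes detected in the camera frame, pick the one whose center is closest to the fingertip. This is an illustrative sketch of that selection step, not the app's actual code; the function name, coordinate format, and example values are assumptions:

```python
import math

def closest_qr(qr_centers, finger_xy):
    """Return the index of the detected QR code center nearest finger_xy.

    qr_centers: list of (x, y) centers of QR codes found in the frame.
    finger_xy:  (x, y) position of the detected fingertip.
    """
    fx, fy = finger_xy
    return min(range(len(qr_centers)),
               key=lambda i: math.hypot(qr_centers[i][0] - fx,
                                        qr_centers[i][1] - fy))

# Example: three detected codes; the finger is near the middle one.
centers = [(120, 80), (300, 310), (520, 95)]
print(closest_qr(centers, (290, 300)))  # selects index 1
```

In a real pipeline, the detected-code centers and fingertip position would come from image processing on each camera frame; the selection logic itself stays this simple.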
This project was led by Catherine Baker, advised by Richard Ladner, and other collaborators include Lauren Milne, Jeffry Scofield, and Ryan Drappeau.
Download our 2014 ASSETS best student paper.