In the wake of yesterday’s iPhone event, I’ve been doing a lot of thinking about one of the iPhone 5S’s marquee features, the fingerprint scanner. Dubbed “Touch ID”, the scanner’s intended purpose (for now, at least) is to help users quickly unlock their phone and make purchases across Apple’s digital storefronts.
A thought that has persisted in my mind is how, theoretically, Touch ID could serve as an accessibility tool. The security aspect of the fingerprint sensor is an obvious one, but I can imagine it also benefiting users with visual and/or motor impairments (e.g., difficulty seeing the keypad or lacking the dexterity to tap its keys) who have trouble touching their iPhone’s screen. (Apple has anticipated such a dilemma by including the AssistiveTouch feature in iOS.) What I see Touch ID doing is helping people with the aforementioned acuity and motor issues by allowing them to simply use their thumbprint (or another finger) to unlock their phone, password-free. More specifically, Touch ID would free users from the struggle of manually entering their passcode.
My idea here is not so much about convenience (which is nice) but rather about usability. I know many folks with vision- and motor-related issues who bemoan iOS’s passcode prompt because not only does it take time, but entering said code isn’t necessarily an easy task. In fact, more than a few lament this so often that they forgo a passcode altogether because it’s time-consuming and a pain (sometimes literally) to enter.
The Touch-ID-as-an-accessibility-tool idea has roots in my prior life as a Special Education Paraeducator. I worked for two years in a moderate-to-severely handicapped special day class at a junior high school. Every Friday, one of the district’s AAC (Augmentative & Assistive Communication) specialists would come into the classroom to lead a large-group cooking activity. Several of our students at the time were non-verbal and had severe motor issues, so the specialist would run a cable from whatever appliance we were using (e.g., a blender) to a bigMACK communication device. This did two things: (1) tapping the bigMACK gave auditory feedback of whatever a staff member had spoken into it (e.g., “I’m turning on the blender!”); and (2) tapping the switch also turned the blender on. It was a very cool two-for-one deal.
The same two-for-one concept, I think, can apply to Touch ID. Not only is the fingerprint scanner quicker and more secure, but it is also just plain easier to use for those with disabilities. Everyone’s needs are different, of course, but it is for these reasons that I believe Touch ID will be greatly appreciated by certain subsets of the accessibility community. Moreover, Touch-ID-as-accessibility has the potential to be the “happy coincidence”, accessibility-wise, that Markdown was for me. I can see many iPhone 5Ss sold on this alone, in a way not dissimilar to why I bought my trusty 4S in 2011: Siri.
iOS 7 represents a huge adjustment for those in the accessibility community and, from what I hear, has some great new accessibility features. I strongly believe Touch ID, however unintentionally, is a true dark horse feature for these users.