I have a confession: I don’t like touchscreens. My fingertips have thick calluses from years of playing music. Some smartphones and tablets do better than others, but I often can’t make them work without pressing hard or tapping with a bent knuckle.
Other people find smartphones and tablets even more frustrating. Imagine not being able to hear audio cues or unlock a device. A two-fingered swipe or pinch isn’t effortless with osteoarthritis or with a finger in a cast. The World Health Organization estimates 285 million people worldwide are visually impaired, and nearly 40 million of those are blind. Ever answer a call or check Twitter without seeing what you’re doing?
Mobile devices don’t have to leave those people behind. We’re still far from making mobile devices equally useful to everybody, but assistive technology has come a long way. Here’s how today’s latest mobile devices use clever software to overcome a lack of sight.
Back in the day
My first tech accessibility experience came as a teenager by way of Bill Navarte, a blind ham radio operator. I’d sometimes help him set up new gear and mark the knobs and switches, usually by notching them with a file or applying bumps of enamel paint. He had a system, and we documented it all in Braille. He could find and operate almost everything by touch.
Today, that approach doesn’t work because most devices have just a few multifunctional controls; it might be great industrial design, but it locks many people out. Bill is retired now, but he minces no words. “I can still run my ham station,” he told me via telephone, “but I can’t work the new stereo. Over Christmas I spent an hour trying to change the radio. It got into some ‘CD’ mode, but there was no way I could tell.”
Smartphones and tablets are worse: they have only a few physical buttons, and any part of a touchscreen can do almost anything. Mobile devices could be designed around accessibility, but it’s difficult: What might work for the deaf doesn’t work for the blind, and what works for the blind may not help folks who have difficulty with touch. So, accessibility focuses on adapting devices designed for the fully abled.
Apple and iOS
Right now, Apple is the undisputed leader in mobile accessibility.
“Apple wants to make its products accessible and can,” noted Toronto-based author and accessibility expert Joe Clark. “Accessibility is built into every single item Apple sells that remotely resembles a computation device, save for the iPod Classic. Apple values user experience. Not being able to use a product makes for a shitty experience.”
VoiceOver is the flagship accessibility feature in iOS. It’s easy in theory: just drag a finger around the screen and VoiceOver speaks the items it touches. Double-tapping activates buttons, and simple gestures let users move between objects. VoiceOver has a learning curve (and any screen reader is usually a mind-bend for sighted users), but it’s remarkably effective.
VoiceOver may be at its most stunning on the Web. Users can drag a finger around a page to pick up layout and contextual cues, and VoiceOver’s innovative “rotor” gesture (think of turning a knob) narrows navigation to headings or lists, or zooms in to read (or edit) character by character. Pop in an earbud and VoiceOver is silent to anyone nearby, and a three-fingered triple-tap hides the display: no worries about someone reading over your shoulder.
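To make that concrete, here’s a minimal, platform-neutral sketch of the core idea behind touch exploration. None of this is Apple’s code; the types and names are invented for illustration. The screen reader hit-tests the finger’s position against a tree of labeled elements and speaks whatever it lands on, leaving activation to a separate gesture such as a double-tap.
```java
import java.util.ArrayList;
import java.util.List;

/** A minimal sketch of touch exploration, the core idea behind VoiceOver
 *  and similar screen readers. Hypothetical types; not Apple's API. */
class ScreenReaderSketch {

    /** A UI element: a labeled rectangle that may contain children. */
    static class Element {
        final String label;
        final int x, y, width, height;
        final List<Element> children = new ArrayList<Element>();

        Element(String label, int x, int y, int width, int height) {
            this.label = label;
            this.x = x; this.y = y; this.width = width; this.height = height;
        }

        boolean contains(int px, int py) {
            return px >= x && px < x + width && py >= y && py < y + height;
        }
    }

    /** Find the deepest element under the finger; that's what gets spoken. */
    static Element hitTest(Element root, int px, int py) {
        if (!root.contains(px, py)) return null;
        for (Element child : root.children) {
            Element hit = hitTest(child, px, py);
            if (hit != null) return hit;
        }
        return root;
    }

    public static void main(String[] args) {
        Element screen = new Element("Home screen", 0, 0, 320, 480);
        Element button = new Element("Send button", 10, 400, 100, 40);
        screen.children.add(button);

        // Dragging a finger across (50, 420) lands on the button, so the
        // screen reader announces its label instead of activating it.
        Element focused = hitTest(screen, 50, 420);
        System.out.println("Speak: " + focused.label);
        // A double-tap anywhere would then activate the focused element.
    }
}
```
The key design choice is that touching never activates anything by itself; exploration is safe, and a deliberate second gesture commits the action.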
“My eyes are fine,” says Katherine Macquire, an office manager at a Seattle-area consulting firm, “but at work I use VoiceOver with a headset and the blank screen to keep in touch with my [homebound] mother. I love it. It’s totally private.”
VoiceOver works with the onscreen keyboard, speaking each character as it’s touched; touching again confirms it. Entering text with VoiceOver takes time, but (with an Internet connection) users can also dictate into most apps.
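That speak-first, confirm-second pattern is easy to sketch as a tiny state machine. Again, this is a hypothetical illustration, not Apple’s implementation: the first touch of a key only announces it, and only touching the same key again enters it.
```java
/** Sketch of screen-reader-style typing: the first touch of a key only
 *  speaks it; touching the same key again enters it. Hypothetical class,
 *  not Apple's implementation. */
class SpokenKeyboardSketch {
    private char previewKey = 0;           // key spoken but not yet entered
    private final StringBuilder text = new StringBuilder();

    void touch(char key) {
        if (key == previewKey) {
            text.append(key);              // second touch confirms the key
            System.out.println("Entered: " + key + " (text: " + text + ")");
            previewKey = 0;
        } else {
            previewKey = key;              // first touch just previews
            System.out.println("Speak: " + key);
        }
    }

    public static void main(String[] args) {
        SpokenKeyboardSketch kb = new SpokenKeyboardSketch();
        kb.touch('h');  // spoken only
        kb.touch('h');  // confirmed
        kb.touch('j');  // spoken only; user moves on without confirming
        kb.touch('i');  // spoken only
        kb.touch('i');  // confirmed; text is now "hi"
    }
}
```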
Apple’s iOS also offers zoom and color inversion that work in any app. Text can also be enlarged, but only in Apple’s core apps like Mail, Contacts, and Messages. For people (like me) who struggle with gestures, AssistiveTouch enables one-finger pinching, multi-finger gestures, and the equivalent of programmable gesture macros.
Google and Android
Google has been working on Android accessibility almost as long as Apple, with the first features appearing in Android 1.6 “Donut.” The results have been mixed, and when Google adds new features (as in Android 4.x) there’s no knowing if device makers or carriers will make them available.
Out of the box, Android accessibility was minimal before version 4.0 (“Ice Cream Sandwich”). Forget the touchscreen: low-vision users need a physical keyboard and a pointing device (like a directional pad) to get around. Most Android devices include three main accessibility features: TalkBack uses speech synthesis to describe events and actions, KickBack vibrates gently when a user finds a control (useful for lock screens), and SoundBack plays audio feedback when moving to a new control. Google also offers Eyes-Free Shell for accessing home-screen functions.
These tools are available via Google Play, but, oddly, they’re easier to get via the IDEAL Accessibility Installer. Many Android users prefer third-party tools like the Eyes-Free Keyboard (it has an on-screen directional pad, and while it chokes on menus and dialogs, it works for navigation) or Spiel, an alternative to TalkBack. Folks with serious text-to-speech needs usually turn to third-party apps like SVOX’s Classic Text to Speech.
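For a sense of how this fits together under the hood, here’s a bare-bones sketch of TalkBack- and KickBack-style feedback. AccessibilityService, TextToSpeech, and the event types are real Android APIs, but the service itself is a simplified stand-in, not Google’s actual source.
```java
import android.accessibilityservice.AccessibilityService;
import android.content.Context;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;
import android.view.accessibility.AccessibilityEvent;

/** A toy accessibility service: speech like TalkBack, buzz like KickBack. */
public class MiniTalkBackService extends AccessibilityService
        implements TextToSpeech.OnInitListener {

    private TextToSpeech tts;
    private boolean ttsReady = false;

    @Override
    protected void onServiceConnected() {
        tts = new TextToSpeech(this, this);   // engine initializes async
    }

    @Override
    public void onInit(int status) {
        ttsReady = (status == TextToSpeech.SUCCESS);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // KickBack-style: a short buzz confirms the user landed on a control.
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_FOCUSED) {
            Vibrator vibrator =
                    (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
            vibrator.vibrate(40);
        }

        // TalkBack-style: speak whatever text the event carries.
        if (ttsReady && !event.getText().isEmpty()) {
            String spoken = event.getText().get(0).toString();
            tts.speak(spoken, TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    public void onInterrupt() {
        if (ttsReady) tts.stop();
    }
}
```
A real service would also be declared in the app’s manifest and switched on by the user in the device’s accessibility settings.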
With Android 4.0, Google introduced Explore by Touch, which identifies onscreen elements as users drag a finger over them. There’s also gesture-based linear navigation, and, like VoiceOver, TalkBack in Android 4.x works with Web content. This version of Android also makes the on-screen keyboard accessible: users can move over letters, then lift when they find the one they want. Android 4 aids low-vision users with options for larger type, inverted colors, and pinch-to-zoom magnification in some apps.
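One practical consequence: Explore by Touch can only announce what apps bother to label. Developers supply those labels through each view’s content description, either in layout XML via android:contentDescription or in code. Here’s a minimal example for an icon-only button; the activity, layout, and IDs are invented for illustration.
```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.ImageButton;

public class ComposeActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.compose);  // hypothetical layout resource

        // Without this, TalkBack has nothing to say when the user's finger
        // crosses the button; with it, it announces "Send message".
        ImageButton send = (ImageButton) findViewById(R.id.send_button);
        send.setContentDescription("Send message");
    }
}
```
Controls with visible text get spoken automatically; it’s the image-only controls that fall silent without a description.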
Sounds great, but Android 4.x represents only about 40 percent of active Android devices today; more than half run Android 2.3 or earlier, including brand-new devices still being sold. There’s no guarantee any of these tools work with Samsung TouchWiz, HTC Sense, or other skins, and for older versions of Android there’s no strong screen reader for browsing the Web.
“Android could have the same desire Apple has to make products accessible,” noted Clark, “but even that would not result in accessibility due to device fragmentation and a seriously outdated user base.”
How to cope
Voice commands and speech recognition (like Siri and Google Now) would have been considered accessibility features a few years ago; today they’re mainstream. Similarly, many hearing-impaired mobile users rely on FaceTime, Google Hangouts, and Skype video chat for signing or reading lips, yet no one considers those services adaptive technology.
Stevie Wonder famously proclaimed “there is nothing that you can do on the iPhone or iPad that I can’t do.” That’s not strictly true: thousands of iOS apps aren’t accessible, and plenty of Android users get by just fine with the options available to them. But, right now, Apple is the out-of-the-box leader for general mobile accessibility.
Users who prefer Android are probably best advised to look at Google’s Nexus phones and tablets. Their unadulterated Android works best with Google’s accessibility options and doesn’t depend on carriers for updates; it’s also the best bet for third-party accessibility apps.
In the meantime, take a look at what mobile accessibility can do for you — it should be a pleasant surprise.