Apple Says Mouse Support for iOS 13, iPadOS Will Have Widespread Appeal Beyond Accessibility

Apple’s newest accessibility-related features appeal to many more users than those they were actually designed for.

Sarah Herrlinger, director of Global Accessibility Policy & Initiatives at Apple, tells TechCrunch‘s Steven Aquino that the company realizes mouse/pointer support in iPadOS and iOS 13 will appeal to a wide range of folks, even though it’s primarily implemented as an accessibility feature:

Mouse support lives in the AssistiveTouch menu, the suite of options designed for users with physical motor delays who can’t easily interact with the touchscreen itself. Apple says it works with both USB and Bluetooth mice, although the company doesn’t yet have an official compatibility list. It’s telling how mouse functionality is purposely included as an accessibility feature — meaning, Apple obviously sees its primary value as a discrete assistive tool. Of course, accessibility features have far greater relevance than simply bespoke tools for disabled people […]

People without disabilities will use this feature, regardless of its actual intended utility, and Apple recognizes that. No one will stop you from plugging a mouse into your iPad Pro. It’s no different from someone using Magnifier to get up close on a finely printed restaurant menu or using Type to Siri in order to quietly give commands in a Messages-like environment.

“Accessibility features can benefit more than the original community they were designed to support,” Herrlinger said. “For example, many people find value in closed captions. Our goal is to engineer for specific use cases so that we continue to bring the power of our devices to more people.”

So not only will many of the new accessibility-focused features be incredibly useful to the users who physically need them, but they will also appeal to many more.

“One of the things that’s been really cool this year is the [accessibility] team has been firing on [all] cylinders across the board,” Herrlinger said. “There’s something in each operating system and things for a lot of different types of use cases.”

The interview focuses primarily on the effort that Apple has put into these new accessibility features. The Cupertino company, for example, knows how important its Voice Control function is for people who cannot physically interact with devices:

Herrlinger told me Voice Control, while conceptually fairly straightforward, is designed in such a way as to be deep and customizable. Furthermore, Herrlinger added that Apple has put in a ton of work to improve the speech detection system so that it can more adeptly parse users with different types of speech, such as those who stutter. Over time, Voice Control should improve at this.

Read Aquino’s worthwhile article over at TechCrunch.