Apple to Expand Accessibility Options in 2023
From Apple Newsroom:
Apple today previewed software features for cognitive, vision, hearing, and mobility accessibility, along with innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. These updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple’s long-standing commitment to making products for everyone.
It’s well worth reading Apple’s press release for all the details (and see Shelly Brisbin’s commentary on Six Colors), but the feature I’m most interested to see is Assistive Access, which “distills experiences across the Camera, Photos, Music, Calls, and Messages apps on iPhone to their essential features in order to lighten cognitive load” for users. Along with helping those with intellectual and developmental disabilities, I could see these alternative interfaces as a boon for older people (and others) with cognitive impairments. Possibly also for the very young, though I’m not a fan of giving toddlers digital devices.
Apple also says that Made for iPhone hearing devices can be paired with select M1-based Macs and all M2-based Macs, which is good news for hearing aid users. Voice Control will also add phonetic suggestions so users can choose the desired word from among those that sound similar, such as “there,” “their,” and “they’re.” With luck, Apple will extend this and other text-editing features to the standard dictation capabilities.
Adam said, “With luck, Apple will extend this and other text editing features to the standard dictation capabilities.”
I transcribe very old handwritten documents for our history museum website, and would find this quite useful. Dictation has gotten better over the years, and this would make it even better.
Correcting the constant “it’s”/“its” error that almost everyone makes?
Hopefully they’ll make their odious emoji (絵文字) optional rather than so grotesquely in your face. That would improve accessibility and usability, for me anyway.
Sigh. No mention of the one accessibility feature I want, which I’ve mentioned here before and submitted feedback to Apple about: the ability to deny specific apps access to sound output. Games that automatically open a connection to Bluetooth audio output, even when their in-game volume is set to zero, wreak havoc with the battery life of Bluetooth hearing aids.
I very much believe that just as with camera and microphone access, apps should have to ask for access to the speaker or Bluetooth audio output. It’s not just an accessibility issue—games that can’t be silenced annoy many of my friends as well—but it’s particularly significant to those using hearing-assistance devices.
I certainly hope that Apple keeps improving voice control and other dictation capabilities. Microsoft paid $16 billion for Nuance, the maker of Dragon NaturallySpeaking, a few years back, and it is in the process of screwing up the software. And overcharging, of course. Better voice control and dictation capabilities would allow me, and I suspect others, to jump over to Apple devices full-time.
Me? I just want them to fix bugs, especially in the one accessibility feature I can’t live without, VoiceOver. Please. Pretty please. I can’t recommend Macs to potential new blind Mac users, as things stand, because the rot has gotten unacceptably bad. Apple gets a lot right, really, but accessibility isn’t just there for collecting PR brownie points; it’s about improving lives. And maybe Apple’s new glasses change the equation, but Door Detection and People Detection aren’t really practical on an iPhone, particularly because they need the LiDAR scanner found only in the Pro models. So while I’m always intrigued to see what Apple’s doing, and some of these features seem interesting, in truth the balance of probabilities puts a lot of it into the “glimpse into the future” bucket, and I think I’d rather they just stopped adding stuff and fixed what’s already there.
It’s commendable how Apple is working on some seriously nifty ideas around A11Y, such as the Point and Speak feature in the Magnifier application. Then again, some of the features aimed at users with low vision essentially amplify visual cues about which items on screen are interactive, i.e., tappable. And it is those exact visual cues that were removed from the default iOS screens starting with iOS 7.
It’s true that accessibility features don’t just allow users with disabilities to use a product at all, but also (at least typically) improve usability for all users.
It’s well worth wondering, though, how much more accessible and usable Apple’s OSs would be by default these days, if the design powers that be in Cupertino hadn’t succumbed to Alan Dye’s misguided aesthetic preference to hide so many useful details from the GUI across Apple’s platforms.
I think iOS 6 and Snow Leopard were rockin’ great releases in no small part because of a strong emphasis on underlying foundational technology, including accessibility, yes. Generalised, well-thought-out accessibility came very much from the bottom up, rather than as post-hoc accommodations. This is not to say it was perfect, of course—it wasn’t—but I think it notable that accessibility enjoyed the benefit of a solid technical foundation, something I think we can agree is less evident now. Also, Apple’s opinionated designs aren’t spared by the accessibility imperative—for a long time, people with motor difficulties complained that something as simple as answering the phone by voice or automatically wasn’t allowed, and made the case for jailbreaking for that reason. In any event, I think there need to be higher-ups at Apple willing to advocate for accessibility, not merely as a technical requirement, but as a human obligation. I don’t know what the situation is now, but notwithstanding the recent releases, my personal feeling is that there is work to be done.