Apple has announced a new set of accessibility features built to help people with cognitive disabilities. The new features, which could be part of iOS 17 and will arrive later this year, include Assistive Access, Live Speech, Personal Voice, and Point and Speak in Magnifier, along with a couple of additional features.
“At Apple, we’ve always believed that the best technology is technology built for everyone,” Apple CEO Tim Cook said at the announcement. “Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love.”
Apple’s Personal Voice feature has been designed for people who are at risk of losing their ability to speak, letting them create “a synthesized voice that sounds like them” to talk with friends or family members.
The company explains that users can create a Personal Voice by reading a set of text prompts aloud to record 15 minutes of audio on iPhone or iPad. “This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.”
Assistive Access is another new feature that lets users customise their iPhone or iPad experience to make it easier to use. It offers a distinct interface with high-contrast buttons and large text labels, as well as tools to help tailor the experience. “For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and trusted supporters can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for users who prefer text.”
The Live Speech feature lets users hear text spoken aloud as they interact with their device. With Live Speech, users can type what they want to say and have it spoken out loud during phone and FaceTime calls as well as in-person conversations.
Apple has introduced a Point and Speak feature in Magnifier to help users with visual impairments. For example, when interacting with a home appliance, Point and Speak uses the camera, LiDAR Scanner, and on-device machine learning to audibly announce the text on each button as users slide their finger across the keypad. Built into the Magnifier app on iPhone and iPad, the feature works seamlessly with VoiceOver and can be combined with other Magnifier functionalities such as People Detection, Door Detection, and Image Descriptions, helping users navigate their physical surroundings more effectively.
Apple has also added a couple of additional features for various disabilities across its devices. Deaf or hard-of-hearing users can now connect their Made for iPhone hearing devices directly to Mac and personalise them for their hearing comfort. Voice Control now offers phonetic suggestions for text editing, helping users who type with their voice select the correct word from similar-sounding options. Switch Control enables users with physical and motor disabilities to turn any switch into a virtual game controller, allowing them to enjoy their favourite games on iPhone and iPad. Adjusting text size across various Mac apps has been made easier for individuals with low vision. And users sensitive to rapid animations can have moving elements, such as GIFs, automatically paused in Messages and Safari.