Ahead of Global Accessibility Awareness Day on Thursday, May 19, Apple announced several upcoming features designed to make life easier for users with disabilities. These built-in innovations are meant to be simple to use, most notably a new Door Detection capability that helps guide visually impaired people to their destination.

Accessibility matters more than ever at Apple, which considers it "a basic human right". For more than 35 years, the company has had a team dedicated to innovating for people with disabilities and to designing the tools, built into iPhones, iPads, Macs and Apple Watches, that make life easier for millions of users.
Building with people with disabilities, not just for them, has been the Apple mantra for decades. "We would rather take the time during design so that accessibility tools are ready and easy to use the day a product launches," the company says in Cupertino. Each year therefore brings its share of new features to Apple products, and Accessibility Day, held on May 19, is often the occasion to introduce them.
Guidance down to the last steps
Ahead of their launch in the coming months, most likely with iOS 16, which will be presented in June at the WWDC developer conference, Apple has unveiled several features that will make life easier for people who are blind or have low vision, are hard of hearing, or have limited mobility. As is often the case over time, these innovations will not be reserved for users with disabilities but will benefit all owners of Apple devices.
Drawing on the capabilities of its A-series chips, machine learning and its Neural Engine, Apple is notably strengthening its Magnifier app, an important aid for blind and visually impaired people that can enlarge an object or describe an image.
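To give a sense of the kind of on-device machine learning involved, here is a minimal Swift sketch using Apple's public Vision framework to produce labels that could feed a spoken image description. It is not Apple's Magnifier code, just an illustration of the principle:

```swift
import UIKit
import Vision

// A minimal sketch, not Apple's Magnifier internals: on-device image
// classification with the public Vision framework, the kind of result
// that could feed a spoken description of a scene.
func describeImage(_ image: UIImage) {
    guard let cgImage = image.cgImage else { return }
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
        // Keep only reasonably confident labels.
        let labels = (request.results as? [VNClassificationObservation] ?? [])
            .filter { $0.confidence > 0.5 }
            .map { $0.identifier }
        print("This image may contain: \(labels.joined(separator: ", "))")
    } catch {
        print("Vision request failed: \(error)")
    }
}
```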

At the end of 2020, the app gained a Person Detection function, which tells you the distance separating you from the person in front of you, particularly useful in queues. It relies on the LiDAR sensor on the back of the iPhone 12 Pro and 13 Pro, as well as the latest iPad Pro models equipped with it, to indicate distance and direction both vocally and on screen.
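The underlying recipe can be approximated with public APIs. The following is a rough sketch, assuming a LiDAR-equipped device, of how a developer might detect a person with Vision and then read the distance at that spot from the LiDAR depth map; it is not Apple's actual Person Detection implementation:

```swift
import ARKit
import Vision

// A rough sketch, assuming a LiDAR-equipped device (iPhone 12/13 Pro,
// recent iPad Pro). Not Apple's Person Detection code; it only shows
// the principle: detect a person with Vision, then sample the LiDAR
// depth map at that location to get the distance in meters.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth   // LiDAR depth, Float32 meters
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let request = VNDetectHumanRectanglesRequest { request, _ in
            guard let person = request.results?.first as? VNDetectedObjectObservation else { return }
            // Vision uses a bottom-left origin; flip y before sampling the depth map.
            let point = CGPoint(x: person.boundingBox.midX, y: 1 - person.boundingBox.midY)
            let meters = Self.depth(at: point, in: depthMap)
            print(String(format: "Person about %.1f m ahead", meters))
        }
        try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:]).perform([request])
    }

    // Samples one Float32 depth value at a normalized point in the buffer.
    static func depth(at point: CGPoint, in buffer: CVPixelBuffer) -> Float {
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
        let w = CVPixelBufferGetWidth(buffer), h = CVPixelBufferGetHeight(buffer)
        let x = min(max(Int(point.x * CGFloat(w)), 0), w - 1)
        let y = min(max(Int(point.y * CGFloat(h)), 0), h - 1)
        let row = CVPixelBufferGetBaseAddress(buffer)!
            .advanced(by: y * CVPixelBufferGetBytesPerRow(buffer))
            .assumingMemoryBound(to: Float32.self)
        return row[x]
    }
}
```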
In the coming weeks, the app will gain a Door Detection function. Based on the same principle, it lets you, by sweeping the device in front of you, learn how far away a door is, along with its color, material, shape and even the type of handle fitted, so you know how to open it.
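Apple has not published a developer API for Door Detection, so the sketch below is purely illustrative of the recipe the article describes: an object-detection model combined with the LiDAR depth sampling shown above. "DoorDetector" is a hypothetical Core ML model, not something Apple ships:

```swift
import CoreML
import Vision

// Illustrative only: Door Detection has no public API. "DoorDetector" is a
// hypothetical Core ML object-detection model used to show the general recipe.
func detectDoors(in pixelBuffer: CVPixelBuffer) throws {
    let visionModel = try VNCoreMLModel(for: DoorDetector(configuration: MLModelConfiguration()).model)
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        for case let door as VNRecognizedObjectObservation in request.results ?? [] {
            // Each bounding box could be sampled against the LiDAR depth map
            // (see the person-detection sketch) to announce a distance, while
            // further classifiers report color, material and handle type.
            let label = door.labels.first?.identifier ?? "door"
            print("\(label) detected at \(door.boundingBox)")
        }
    }
    try VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
}
```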

The strength of the Magnifier app is its ability to combine all of these functions (person detection, door detection and image description) to help a visually impaired person find their way to a store, for example. The user will know exactly where the entrance is and what any posted text says, and will be guided along the way. The app will also report whether the door is open or closed and how to proceed (even for sliding doors such as bay windows). Apple says it will be possible to define specific situations and the information the app should provide in each.

To combine the talents of its applications, Apple adds that Maps will also work with Magnifier and will offer haptic and sound feedback when you use VoiceOver to identify the starting point of a walking route.
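This kind of paired feedback is already achievable with public APIs. Here is a minimal sketch, assuming a hypothetical point in an app where the user reaches the start of a route, of combining a VoiceOver announcement with a haptic tap:

```swift
import UIKit

// A minimal sketch of the kind of feedback described above: when VoiceOver
// is running, pair a spoken announcement with a haptic tap, for instance
// when the user locates the starting point of a walking route.
func announceRouteStart() {
    if UIAccessibility.isVoiceOverRunning {
        UIAccessibility.post(notification: .announcement,
                             argument: "Starting point of your walking route")
    }
    // Haptic confirmation alongside the spoken cue.
    UINotificationFeedbackGenerator().notificationOccurred(.success)
}
```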
How to use the Magnifier app?
- Go to Settings/Accessibility/Magnifier
- Enable Magnifier
- The magnifying glass icon appears on the Home Screen, most often next to the Camera app
- Open the app to use it.

- A simplified camera interface appears. Among the icons, Person Detection currently sits on the right; this is where Door Detection will also live.
- Point the device in front of you.
- The gear wheel then lets you adjust the feedback to your needs (sounds, speech, vibrations), comparable to the spoken output sketched after this list.
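For developers curious about the speech side of that feedback, here is an illustrative sketch, not Magnifier's own code, of voicing a detection result with Apple's public speech synthesis API:

```swift
import AVFoundation

// Illustrative only: how an app could voice a detection result, comparable
// to the speech feedback offered in the Magnifier's detection settings.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}

speak("Door, two meters ahead, handle on the left")
```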