New iPhone tech lets people who are blind or low-vision ‘see’ when others are coming
Apple’s newest iPhone models are expected to be a huge hit this year. Not only do they have the typical upgrades like a better camera and a faster processor, but the new iPhones also come with 5G capabilities.
Conspiracy theorists aside, we’ve all been waiting for 5G technology to become a reality for quite a while, and it’s finally here. Tap or click here to find out what you should know before buying an iPhone 12.
But 5G is only the beginning. An upcoming update to iOS brings a handful of impressive new features and a ton of AR functionality. One such feature helps people with visual impairments detect people and objects physically near them. Let’s dive into how it works and what it means for the visually impaired.
The Magnifier app gets an impressive update
Nestled within the Magnifier app in the iOS 14.2 beta is a feature called People Detection. It uses augmented reality (AR), machine learning and the LiDAR sensor found on the iPhone 12 Pro models to detect where people and objects are in the physical space around you. The feature can tell you how many people are in line at a store, help you find an empty table or simply tell you how close you are to the nearest person.
In the age of COVID-19, this feature is the ultimate social distancing tool. Using People Detection, you can set a minimum distance for alerts, say six feet, and get a notification or haptic feedback whenever someone comes within that distance. That takes the hassle out of constantly gauging how close other people are when you’re out in public, since your phone can simply do it for you.
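If you’re curious about the nuts and bolts, here’s a rough sketch of how a developer could approximate the same idea using tools Apple already ships: ARKit’s scene depth (available on LiDAR-equipped iPhones) combined with the Vision framework’s person detection. This is purely an illustration, not Apple’s Magnifier code, and names like PersonDistanceMonitor and distanceThreshold are invented for the example.

```swift
import ARKit
import Vision
import UIKit

// Illustrative sketch only: one way to approximate a "people detection" alert
// using ARKit's LiDAR scene depth plus the Vision framework. Not Apple's
// actual Magnifier implementation.
final class PersonDistanceMonitor: NSObject, ARSessionDelegate {

    let session = ARSession()

    // Hypothetical alert threshold: roughly six feet, expressed in meters.
    var distanceThreshold: Float = 1.83

    private let haptics = UIImpactFeedbackGenerator(style: .heavy)

    func start() {
        // Scene depth requires a LiDAR-equipped device (e.g. iPhone 12 Pro).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame while the AR session runs.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Ask Vision to find people in the current camera image.
        let request = VNDetectHumanRectanglesRequest { [weak self] request, _ in
            guard let self = self,
                  let people = request.results as? [VNHumanObservation] else { return }

            for person in people {
                // Center of the detected person's bounding box (normalized
                // coordinates; Vision's origin is bottom-left, the depth
                // buffer's is top-left, hence the flipped y value).
                let center = CGPoint(x: person.boundingBox.midX,
                                     y: 1.0 - person.boundingBox.midY)
                if let meters = self.depth(at: center, in: depthMap),
                   meters < self.distanceThreshold {
                    // Someone is closer than the threshold: buzz the user.
                    DispatchQueue.main.async { self.haptics.impactOccurred() }
                }
            }
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
    }

    // Reads the LiDAR depth value (in meters) at a normalized image point.
    private func depth(at point: CGPoint, in depthMap: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let col = min(max(Int(point.x * CGFloat(width)), 0), width - 1)
        let row = min(max(Int(point.y * CGFloat(height)), 0), height - 1)

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
        let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
        // The scene depth buffer stores one 32-bit float per pixel, in meters.
        let rowPtr = base.advanced(by: row * bytesPerRow)
            .assumingMemoryBound(to: Float32.self)
        return rowPtr[col]
    }
}
```

In plain English: the camera spots a person, the LiDAR depth map reports how many meters away they are, and the phone buzzes whenever that number drops below the threshold you set.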
Your iPhone can do a lot for you if you have the know-how to set it up. Tap or click for 10 new iPhone tricks you’ll use all the time.
What’s the purpose of People Detection?
People Detection wasn’t built with the pandemic in mind; it was designed as an accessibility feature. It’s meant to help those who are blind or otherwise visually impaired navigate complex, busy public spaces.
Not only can you choose haptic, visual, or audio feedback for People Detection alerts, but the feature also works with AirPods and Apple’s screen reader, VoiceOver. Keep in mind that People Detection doesn’t work in the dark, since the camera can’t pick out people for the system to identify.
As of now, People Detection is in beta testing, but it’s expected to arrive with the next operating system update, iOS 14.2.
iOS 14.2 aside, iOS 14 itself brought a ton of huge new features. Tap or click here to see how you can now change the default browser and email app on your iPhone.
Tags: accessibility, Apple, Apple AirPods, Apple iPhone, Apple iPhone 12, augmented reality, COVID-19, features, functionality, machine learning, operating systems, pandemic, social distancing, technology, upgrades