iOS 14 has already given us some really cool accessibility features. People Detection, however, is on another level. The feature is only available on the iPhone 12 Pro, iPhone 12 Pro Max, and the iPad Pro models equipped with a LiDAR scanner, but it shows a serious commitment to pushing the boundaries of the smartphone as an accessibility device.
What is People Detection?
People Detection is… what it sounds like. It uses machine learning and Apple’s new LiDAR scanner to identify nearby people and help users maintain a safe distance. The camera uses ARKit’s “People Occlusion” capability to detect whether any people are in the frame, and the LiDAR sensor then measures the distance between them and the user in real time.
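For developers, the two capabilities described above map onto ARKit’s frame semantics: person segmentation (the basis of People Occlusion) and LiDAR scene depth. Here is a minimal sketch of how an app could opt into both; the class name and delegate comments are illustrative assumptions, not Apple’s implementation of People Detection:

```swift
import ARKit

// Hypothetical sketch: enables person segmentation and LiDAR scene depth,
// the two ARKit capabilities People Detection is built on.
final class PeopleDistanceSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // People Occlusion is driven by person-segmentation frame semantics.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }
        // The LiDAR scanner exposes per-pixel depth (in meters) via scene depth.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.segmentationBuffer flags the pixels that belong to people;
        // frame.sceneDepth?.depthMap gives per-pixel distance from the device.
        // Sampling the depth map where the segmentation buffer marks a person
        // yields an approximate person-to-user distance.
    }
}
```

Both frame semantics require supported hardware, which is why the feature is limited to LiDAR-equipped devices.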
Users can customize the output to best fit their needs. They can choose among several audio output options, haptic feedback, and a visual readout, and can enable any combination of the three.
What is People Detection Used For?
Much like the Apple Watch Series 6’s Blood Oxygen feature, People Detection is being marketed as a way to keep users safe from COVID-19. Users with low vision can rely on it to maintain social distancing protocols while navigating public spaces. Beyond that, though, anyone could use the feature to check that they are actually adhering to social distancing guidelines. After all, the LiDAR scanner’s measurement of six feet will be far more accurate than our own perception of it.
There are plenty of other uses for this accessibility feature as well. The way People Detection dynamically tracks people in the space around a user makes it a hugely valuable tool for people with low vision navigating busy environments such as grocery stores or public transit.
Looking Toward the Future
Apple has opened this technology up to developers, so someone could build an app that detects cars instead, for example. One thing is clear, though: AR has huge potential when it comes to accessibility, and it’s great to see big companies like Apple putting it to work in those applications. It will be exciting to see how developers use this technology moving forward.