iOS 14 will allow us to interact with the iPhone without touching the display

With iOS 14 and macOS Big Sur, developers will be able to detect human body poses and hand poses in photos and video within their apps, using Apple's updated Vision framework.
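
As a rough sketch of what this looks like in code (assuming an app already has a CGImage to analyze), the snippet below runs a body-pose request and reads back a single joint; the detectBodyPose name, the 0.3 confidence cutoff, and the choice of the right wrist are illustrative choices rather than Apple sample code.

```swift
import Vision
import CoreGraphics

// Minimal sketch (iOS 14 / macOS Big Sur): run a body-pose request on a still
// image and read back one joint. The function name, the 0.3 confidence cutoff,
// and the choice of the right wrist are illustrative, not Apple sample code.
func detectBodyPose(in image: CGImage) throws {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let body = request.results?.first as? VNHumanBodyPoseObservation else { return }

    // Joint locations are normalized (0...1) image coordinates, each with a confidence score.
    let joints = try body.recognizedPoints(.all)
    if let rightWrist = joints[.rightWrist], rightWrist.confidence > 0.3 {
        print("Right wrist at \(rightWrist.location)")
    }
}
```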

This capability will let apps analyze people's poses, movements, and gestures, enabling a wide variety of potential features. Apple offers a few examples, including a fitness app that automatically tracks a user's exercise, a safety training app that helps employees use correct postures, and a media editing app that finds photos or videos with similar poses.

Hand pose detection, in particular, promises to offer a new way of interacting with apps. In its demonstration, Apple showed a person holding their thumb and index finger together and drawing in an iPhone app without touching the display.
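
A simple way that pinch check could be built on the new hand-pose request is sketched below; the detectPinch name, the 0.05 distance threshold, and the 0.3 confidence cutoff are assumptions for illustration, not values from Apple's demo.

```swift
import Vision
import Foundation

// Sketch of the pinch gesture from Apple's demo: run a hand-pose request on a
// camera frame, then test whether the thumb tip and index fingertip are close.
// The 0.05 distance threshold and 0.3 confidence cutoff are assumed values.
func detectPinch(in pixelBuffer: CVPixelBuffer) throws -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first as? VNHumanHandPoseObservation else { return false }

    let thumbTip = try hand.recognizedPoint(.thumbTip)
    let indexTip = try hand.recognizedPoint(.indexTip)

    // Ignore low-confidence joints; coordinates are normalized to 0...1.
    guard thumbTip.confidence > 0.3, indexTip.confidence > 0.3 else { return false }

    let distance = hypot(thumbTip.location.x - indexTip.location.x,
                         thumbTip.location.y - indexTip.location.y)
    return distance < 0.05
}
```

In a drawing app like the one Apple showed, a check along these lines would run on each camera frame, with the fingertip position acting as the pen while the pinch is held.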

Apps could also use the framework to overlay emoji or graphics on a user's hands to mirror a specific gesture, such as a peace sign.

Another example is a camera app that automatically triggers image capture when it detects the user making a specific hand gesture.

The framework can detect multiple hands or bodies in a scene, but the algorithms may not work as well on people who are wearing gloves, bending over, facing upside down, or wearing loose, robe-like clothing. The algorithms may also struggle when a person is close to the edge of the frame or partially occluded.
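
For the multi-hand case, a short sketch follows, assuming pixelBuffer holds a camera frame; the limit of four hands and the countHands name are arbitrary choices for illustration.

```swift
import Vision

// Sketch of detecting more than one hand per frame; the limit of 4 is arbitrary.
func countHands(in pixelBuffer: CVPixelBuffer) throws -> Int {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 4

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    // Each observation carries the joint set for one detected hand.
    let hands = (request.results ?? []).compactMap { $0 as? VNHumanHandPoseObservation }
    return hands.count
}
```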

Similar capabilities are already available through ARKit, but they are limited to augmented reality sessions and work only with the rear camera on compatible iPhone and iPad models. With the updated Vision framework, developers have many more options.