New iPhone to improve facial recognition feature so it can see its owner better, report claims

Latest technology could add other features to the 2019 phone, like 3D scans of people's faces

Andrew Griffin
Monday 05 November 2018 13:57 GMT
A customer sets up Face ID on his new iPhone X at the Apple Store Union Square on November 3, 2017, in San Francisco, California (ELIJAH NOUVELAGE/AFP/Getty Images)


Apple's next iPhone could bring important updates to its flagship feature, according to a new rumour.

The phone could vastly improve the Face ID facial recognition system housed in the top of the handset.

New technology will allow the invisible lights that are used as part of the system to illuminate people's faces far better, allowing the phone to recognise its owner more quickly, according to a report from reliable Apple analyst Ming-chi Kuo.

Face ID was first introduced in the iPhone X last year. It works by projecting an array of 30,000 invisible infrared dots onto the holder's face; a sensor in the camera can then see where they fall and check whether the right person is holding the phone.

But when ambient light is especially bright, on a sunny day for instance, those invisible infrared dots can get washed out and the facial recognition sensor can have trouble seeing its owner.

The new feature will improve that by better illuminating people's faces. The new sensor and its stronger emitters will "lower the impacts from visible lights of environment in order to improve the Face ID user experience", according to the report, first spotted by MacRumors.

In addition to those new features, the updated version of the Face ID sensor could also allow it to use "time of flight" calculations to make 3D models of the things it is looking at. That technology works by emitting a pulse of light, measuring how long it takes to bounce back, and using that time to work out the distance, in a way similar to radar.
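To illustrate the principle, the distance is simply half the round-trip time multiplied by the speed of light. The sketch below is a hypothetical example of that calculation, not Apple's implementation:

    // Hypothetical sketch of time-of-flight ranging (not Apple's code).
    // A pulse of light is emitted, reflects off the subject and returns;
    // distance = (speed of light x round-trip time) / 2.
    import Foundation

    let speedOfLight = 299_792_458.0  // metres per second

    // Returns the distance in metres for a measured round-trip time in seconds.
    func distance(forRoundTripTime seconds: Double) -> Double {
        return speedOfLight * seconds / 2.0
    }

    // A reflection arriving after roughly 3.3 nanoseconds implies a subject
    // about half a metre from the sensor.
    print(distance(forRoundTripTime: 3.3e-9))  // ~0.49 metres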

If Apple were to integrate that, the phone could use the resulting 3D models to power augmented reality features and improve tools such as Animoji.
