This new feature was discovered by Mike Rundle, who quickly posted about it on Twitter. He noted that the third developer beta for iOS 13 includes a feature called “FaceTime Attention Correction,” which uses augmented reality to make it appear as though you are looking into the camera during a FaceTime call.
How is this achieved? Using ARKit, Apple maps the depth and features of a user’s face and repositions their eyes accordingly. The correction even seems to work when a user is wearing glasses or sunglasses.
There are a few questions that need to be answered before iOS 13 is released this autumn. First, will the feature still function if there are multiple people in one shot? And although it appears to work quite well, will it spawn glitches straight out of a horror film? No one wants to end up looking like the Pale Man from Pan’s Labyrinth.
Guys - "FaceTime Attention Correction" in iOS 13 beta 3 is wild.
— Will Sigmon (@WSig) July 2, 2019
Second, will this feature be available on all Apple devices? So far, FaceTime Attention Correction appears to work only on the iPhone XS and iPhone XS Max. Some have theorized that the feature is only possible because of those devices’ A12 processor.
It is important to note that this feature is currently only available in the developer beta. It could be changed or even removed entirely before iOS 13 is released to the public. It would be great if Apple kept the feature and perhaps inspired other companies to do the same. It is a small but meaningful correction.
Apple also recently fixed a FaceTime bug with its iOS 12.1.4 update. A teenager, Grant Thompson, discovered a bug that allowed anyone to eavesdrop through an iPhone’s microphone by initiating a FaceTime call. The update fixed the bug, and Thompson even received a bug bounty. Let’s hope that iOS 13 does not contain any similar bugs.