Apple Vision Pro eye-tracking used to expose text input


Researchers have demonstrated that the eye-tracking technology in the Apple Vision Pro can expose what users type. By observing the device's virtual avatars, a group of researchers managed to decipher what people entered on the headset's virtual keyboard.

Two Apple Vision Pro features exposed what users typed

Apple launched the Vision Pro earlier this year. It is a revolutionary “spatial computer” that allows users to perform nearly every task without any external, handheld controllers.

One of the features Apple Vision Pro has is “eye tracking”. As the name implies, it tracks eye movements to judge users’ intentions and actions. Among other capabilities, the eye-tracking feature allows users to type on a virtual keyboard.

Apple Vision Pro users rely on avatars during calls on Zoom, Teams, Slack, Reddit, Tinder, Twitter, Skype, and FaceTime. By combining these two features, eye tracking and shared avatars, a group of researchers reportedly managed to work out what users were typing on the Apple Vision Pro's virtual keyboard, exposing sensitive information.

In research shared with Wired, the GAZEploit attack exploits eye-tracking data to guess the passwords, PINs, and other text people typed, with a concerning degree of accuracy. Specifically, the exploit observes what these virtual avatars are doing and where their eyes are looking, then uses that data to infer sensitive text.

Apple Vision Pro eye tracking will no longer expose text input

It is important to note that the Apple Vision Pro’s hardware and software are not compromised. In other words, the researchers did not gain access to Apple’s headset. Hence, they couldn’t see what users were viewing.

According to Hanqiu Wang, one of the lead researchers on the work, GAZEploit guesses the characters that users' avatars appear to be typing on the VR headset.

“Based on the direction of the eye movement, the hacker can determine which key the victim is now typing. [GAZEploit] identified the correct letters people typed in passwords 77 percent of the time within five guesses and 92 percent of the time in messages.”
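The core idea described above, mapping a gaze direction to the nearest key within a handful of guesses, can be illustrated with a toy sketch. This is not the researchers' code: the keyboard geometry, function names, and fixed key size here are all hypothetical, and it assumes gaze fixations have already been projected onto the virtual keyboard's 2D plane.

```python
# Hypothetical illustration of gaze-to-key inference, NOT GAZEploit itself.
# Assumes gaze fixations are already expressed in the keyboard's 2D plane.
QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def build_key_centers(key_size=1.0):
    """Assign each letter an (x, y) center on a simplified QWERTY grid."""
    centers = {}
    for row_idx, row in enumerate(QWERTY_ROWS):
        for col_idx, key in enumerate(row):
            # Stagger each row slightly, as on a physical keyboard.
            x = col_idx * key_size + row_idx * 0.5 * key_size
            y = row_idx * key_size
            centers[key] = (x, y)
    return centers

def rank_keys(gaze_point, centers, n_guesses=5):
    """Return the n keys whose centers lie closest to the gaze fixation."""
    gx, gy = gaze_point
    dist = lambda k: (centers[k][0] - gx) ** 2 + (centers[k][1] - gy) ** 2
    return sorted(centers, key=dist)[:n_guesses]

centers = build_key_centers()
# A fixation at the center of 'g' ranks 'g' first among the five guesses.
print(rank_keys(centers["g"], centers))
```

A nearest-key ranking like this is what makes the "within five guesses" framing in the quote natural: even a noisy gaze estimate usually places the true key among the few closest candidates.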

The researchers have confirmed that they alerted Apple about the vulnerability in April. The iPhone maker tagged the vulnerability as CVE-2024-40865 and issued a patch at the end of July. Essentially, the Apple Vision Pro now suspends sharing of a user's avatar while they type on the virtual keyboard.

The researchers trained a recurrent neural network on recordings of just 30 people's avatars. Users need not worry about GAZEploit itself, since it was developed and responsibly disclosed by security researchers, and the flaw has since been patched. However, the exploit shows just how predictable eye movements become when users focus on entering sensitive information.
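To give a feel for what "a recurrent neural network over gaze data" means in practice, here is a toy forward pass over a sequence of gaze-movement features. This is purely illustrative and not the researchers' model: the architecture, weights, and input features are invented, and the fixed weights stand in for what a real network would learn from the 30 recordings.

```python
import math

# Toy recurrent cell, NOT the GAZEploit model. It scores a sequence of
# hypothetical (gaze_dx, gaze_dy) features and emits a probability that
# the sequence looks like typing rather than ordinary gazing.
def tanh_vec(values):
    return [math.tanh(v) for v in values]

class TinyRNN:
    def __init__(self, n_in=2, n_hidden=4):
        # Fixed small weights for reproducibility; a real model learns these.
        self.w_in = [[0.1 * (i + j + 1) for j in range(n_in)]
                     for i in range(n_hidden)]
        self.w_h = [[0.05 if i == j else 0.0 for j in range(n_hidden)]
                    for i in range(n_hidden)]
        self.w_out = [0.2] * n_hidden
        self.n_hidden = n_hidden

    def forward(self, seq):
        h = [0.0] * self.n_hidden  # hidden state carried across timesteps
        for x in seq:
            pre = []
            for i in range(self.n_hidden):
                s = sum(self.w_in[i][k] * x[k] for k in range(len(x)))
                s += sum(self.w_h[i][k] * h[k] for k in range(self.n_hidden))
                pre.append(s)
            h = tanh_vec(pre)
        # Sigmoid over a linear readout: probability the sequence is "typing".
        logit = sum(w * hi for w, hi in zip(self.w_out, h))
        return 1 / (1 + math.exp(-logit))

model = TinyRNN()
score = model.forward([(0.1, 0.0), (0.3, -0.2), (0.0, 0.1)])
print(score)  # a value between 0 and 1
```

The point of the recurrence is that typing produces characteristic short, repetitive gaze movements over time, and a model that carries state across timesteps can pick up that temporal pattern from surprisingly little training data.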

2024-09-13 15:08:39
