Meet EyeTell, the next threat to touch screen passwords
Above: Photo courtesy of Unsplash
If you’re smart, you change your passwords every six months and avoid using “password123” to secure your information. That should be enough to protect you, right? It might not be, say Arizona State University researchers trying to stay on top of the latest cybersecurity threats.
Passwords are the keys to our most personal information, and hackers are constantly coming up with new ways to unmask them. Some install keylogging software that records every keystroke. More ambitious hackers watch the hand movements of a victim typing a password, or analyze nearby reflections, such as those in eyeglasses or windows. Now, there’s a new, stealthier method that could make typing passwords on a touch screen even more vulnerable.
When you type in your passcode to unlock your smartphone, or any touch screen, your eyes naturally follow your fingers’ movements, and hackers can use this to their advantage.
Using this new tactic, hackers can record the victim and extract a “gaze trace” of where their eyes are moving across the screen. With less than one minute of video analysis, the program can decipher your passcode, PIN or lock pattern based solely on where your eyes moved.
This is EyeTell – a digital tool that infers passwords from eye movements alone – and it’s the next potential threat to mobile users’ cybersecurity.
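The core idea, mapping each inferred gaze fixation to the nearest on-screen key, can be illustrated with a minimal sketch. The keypad layout, coordinates and function names below are assumptions for illustration only, not the researchers’ actual pipeline, which recovers the gaze trace from video:

```python
import math

# Approximate centers of a standard 3x4 phone PIN pad in normalized
# screen coordinates (x, y), origin at top-left. This layout is an
# assumption for illustration; real devices differ.
PIN_PAD = {
    "1": (0.25, 0.2), "2": (0.5, 0.2), "3": (0.75, 0.2),
    "4": (0.25, 0.4), "5": (0.5, 0.4), "6": (0.75, 0.4),
    "7": (0.25, 0.6), "8": (0.5, 0.6), "9": (0.75, 0.6),
                      "0": (0.5, 0.8),
}

def nearest_key(point):
    """Return the PIN-pad key whose center is closest to a gaze fixation."""
    return min(PIN_PAD, key=lambda k: math.dist(PIN_PAD[k], point))

def infer_pin(gaze_fixations):
    """Map a sequence of estimated gaze fixations to candidate PIN digits."""
    return "".join(nearest_key(p) for p in gaze_fixations)

# Example: four noisy fixations roughly over the 1, 3, 7 and 0 keys.
print(infer_pin([(0.26, 0.19), (0.73, 0.22), (0.24, 0.61), (0.52, 0.79)]))  # 1370
```

Even this toy version shows why the attack is worrying: the gaze points don’t need to be exact, only closer to the right key than to any other.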
Luckily, ASU researchers anticipated this style of security breach before it could be deployed on a wide scale.
“As cybersecurity researchers, our main goal is to defend against various hackers,” said Yanchao Zhang, a professor of electrical engineering in the School of Electrical, Computer and Energy Engineering in ASU’s Ira A. Fulton Schools of Engineering, whose team developed the software. “We have to stand in the shoes of the hackers and report attacks to spread awareness.”
Zhang and his team deliberately sought out this password-theft tactic to research the threat. Zhang worked with Yimin Chen and Tao Li, graduate students from his Cyber and Network Security Group; Terri Hedgpeth, director of ASU’s Educational Outreach and Student Services Technology Team; and Rui Zhang, an assistant professor at the University of Delaware.
Often when a hacker finds a hole in existing cybersecurity measures, experts scramble to find a way to fix it. More recently, however, information security researchers have started to take a more proactive stance, seeking out cybersecurity vulnerabilities themselves. This approach affords them more time to address problems before they are abused.
The team’s conference paper, “EyeTell: Video-assisted touch-screen keystroke inference from eye movements,” was presented at the IEEE Symposium on Security and Privacy this summer. The venue is among the most selective in cybersecurity research, accepting only about 11 percent of submissions: 63 of 549 papers in 2018.
“Keystroke inference is a critical threat against computer and mobile devices,” Zhang said. “We demonstrated the efficacy of our technique through comprehensive experiments on both iOS and Android devices.”
Through this research, the team also identified a few simple habits that hinder EyeTell and similar video-based attacks that hackers could use to steal password information: wearing tinted eyeglasses, typing without looking at the screen and typing faster.
The next step for the team is more research. Now that experts know about the potential risk, they’re one step closer to finding technical solutions to curb the threat.
“Mobile touch-screen devices, such as tablets and smartphones, have penetrated into everyday life,” Zhang said. “Cyberattacks are also becoming more and more advanced. Our research can inspire new research methods to identify security risks associated with mobile touch-screen devices and develop effective countermeasures.”