Oh Dear: The Sound of Your Keystrokes Could Leave You Wide Open to a Cyberattack

A simple video call could open your laptop to hackers, experts warn…

University researchers have created an AI system to decipher words from the sound of typing — with more than 90 percent accuracy.


That remarkable achievement comes with an obvious downside: it means that just typing your password while chatting over a video platform like Zoom could open the door to a cyberattack.

Industry experts say that as video conferencing tools have grown in use, and devices with built-in microphones become ubiquitous, the threat is real, and rising.

Researchers have created a system that can work out which keys are being pressed on a laptop keyboard, according to a report in The Guardian.

“I can only see the accuracy of such models, and such attacks, increasing,” said Dr Ehsan Toreini, co-author of the study at the University of Surrey. The research was published as part of the IEEE European Symposium on Security and Privacy Workshops.

Toreini and colleagues used machine learning algorithms to identify pressed keys based on sound alone. It’s an approach that researchers recently deployed on the Enigma cipher device.

The process began by pressing each of 36 keys on a MacBook Pro, including all letters and numbers, 25 times in a row, using different fingers and with varying pressure. The sounds were recorded over a Zoom call and on a smartphone near the laptop.
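The paper's full pipeline is not reproduced here, but the basic idea — record many presses per key, turn each recording into spectral features, and match new presses against what was learned — can be sketched in a toy form. Everything below is illustrative: the synthetic "keystroke" sounds, the sample rate, the six-key alphabet, and the nearest-centroid classifier are all stand-ins for the study's real recordings and deep-learning model.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 16_000            # sample rate in Hz (assumption for illustration)
KEYS = list("abc123")  # small stand-in for the study's 36 keys

def synth_press(key, n=1024):
    """Synthesize a toy keystroke: a key-specific tone burst plus noise.
    In the real attack these would be recordings from a phone or Zoom call."""
    f = 500 + 300 * KEYS.index(key)   # pretend each key rings at its own pitch
    t = np.arange(n) / SR
    click = np.sin(2 * np.pi * f * t) * np.exp(-40 * t)
    return click + 0.05 * rng.standard_normal(n)

def features(wave):
    """Log-magnitude spectrum: a crude stand-in for the spectrogram
    features a real system would feed to its classifier."""
    return np.log1p(np.abs(np.fft.rfft(wave)))

# "Train": average the features of 25 presses per key (mirroring the
# study's 25 presses per key), giving one centroid per key.
centroids = {k: np.mean([features(synth_press(k)) for _ in range(25)], axis=0)
             for k in KEYS}

def classify(wave):
    """Assign a recording to the key whose centroid is nearest."""
    f = features(wave)
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))

# Evaluate on fresh presses the model never saw.
hits = sum(classify(synth_press(k)) == k for k in KEYS for _ in range(10))
print(f"accuracy: {hits / (10 * len(KEYS)):.0%}")
```

On clean synthetic audio this toy classifier is near-perfect; the striking result in the study is that comparable accuracy survives real microphones and Zoom's audio compression.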

Suid Adeyanju, CEO of IT firm RiverSafe, said it was “a wake-up call about the true risks” posed by artificial intelligence. “Far too many organisations are rushing to adopt the technology without conducting even the most basic due diligence tests,” he said.

“Over-enthusiastic executives should take note that AI may look like Barbie, but it could turn out to be Oppenheimer if the necessary cyber protections and regulatory procedures aren’t in place.”

While it is not clear which clues the system used to identify specific keystrokes, Joshua Harrison, first author of the study, from Durham University, said proximity of the keys to the edge of the keyboard could be a factor. “This positional information could be the main driver behind the different sounds,” he said.

The system was then tested on the remaining data. It proved accurate 95 percent of the time when the recording was made over a phone call, and 93 percent of the time during a Zoom call.

The study was co-authored by Dr Maryam Mehrnezhad of Royal Holloway, University of London. The researchers say the work is a proof-of-principle study, and has not been used to crack passwords. But they warn of a need for vigilance, and say using a laptop in public places could present a risk.

The risk of such acoustic “side channel attacks” can be mitigated, experts say, by opting for biometric passwords or activating two-step verification. Failing that, it’s a good idea to use the shift key and create a password using upper and lower cases, numbers, and symbols. “It’s very hard to work out when someone lets go of a shift key,” said Harrison.
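The mixed-case advice above is easy to follow in practice. As a hedged illustration (the function name and length are arbitrary choices, not anything from the study), Python's standard `secrets` module can generate a password that is guaranteed to mix upper and lower case, digits, and symbols:

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def strong_password(length=16):
    """Draw random characters until the result mixes cases, digits and
    symbols — the shift-key-heavy style the researchers recommend."""
    while True:
        pw = "".join(secrets.choice(ALPHABET) for _ in range(length))
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in string.punctuation for c in pw)):
            return pw

print(strong_password())
```

`secrets` uses the operating system's cryptographic random source, which is why it is preferred over `random` for passwords.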

Feng Hao, a professor at the University of Warwick, said people shouldn’t type sensitive messages, including passwords, during a Zoom call. “Besides the sound,” he said, “the visual images about the subtle movements of shoulder and wrist can reveal side-channel information about the keys being typed on the keyboard, even though the keyboard is not visible from the camera.”
