A new study by a team of British researchers shows that artificial-intelligence models can identify, with very high accuracy, what users type on a keyboard (such as a password) simply by listening to and analyzing the sound of the keystrokes.
The study, presented at a workshop of the IEEE (Institute of Electrical and Electronics Engineers) European Symposium on Security and Privacy, warns that the technique poses a significant threat to user security, because attackers can steal data through the microphones built into the electronic devices we use throughout the day.
But how does this technology work, what are the risks, and how can they be minimized?
First: How does this technology work?
Researchers developed an artificial-intelligence model that can recognize the sound of typing on the keyboard of an Apple MacBook Pro. After the model was trained on keystrokes recorded by a nearby smartphone, it could determine which key was pressed with up to 95% accuracy based on the sound alone.
The researchers found that when the sound-classification model was trained on keystrokes recorded through the computer's microphone during Zoom calls, prediction accuracy dropped to 93%, still alarmingly high, and a record for attacks using that medium.
The researchers collected training data by pressing each of 36 keys on the MacBook Pro's keyboard 25 times, varying the finger and pressure used, and recording the sound of each keystroke twice: once with a smartphone placed near the keyboard, and once over a Zoom call made from the computer itself.
They then generated waveforms and spectrograms from the recordings, which showed distinct differences for each key, and applied processing steps to the data to strengthen the signals usable for identifying the keys.
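The study's pipeline feeds spectrograms of keystroke audio to a deep classifier. As a rough, self-contained sketch of that first step only, the following NumPy code turns an audio signal into a magnitude spectrogram; the damped click here is a synthetic stand-in for a real keystroke recording, and the sample rate and frame sizes are assumptions for illustration:

```python
import numpy as np

def spectrogram(signal, frame_size=256, hop=128):
    """Magnitude short-time Fourier transform: one column per time frame."""
    window = np.hanning(frame_size)
    frames = [signal[i:i + frame_size] * window
              for i in range(0, len(signal) - frame_size, hop)]
    return np.abs(np.fft.rfft(np.stack(frames), axis=1)).T

# Synthetic stand-in for a recorded keystroke: a short damped click
sr = 16_000                       # sample rate (Hz), an assumption
t = np.arange(sr // 10) / sr      # 100 ms of audio
click = np.exp(-60 * t) * np.sin(2 * np.pi * 2_500 * t)

spec = spectrogram(click)
print(spec.shape)                 # (frequency bins, time frames)
```

Each key's click leaves a slightly different pattern across this frequency-by-time grid, which is what the classifier learns to distinguish.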
When the model was tested on this data, it extracted the correct keystrokes from smartphone recordings with 95% accuracy, from Zoom recordings with 93%, and from Skype recordings with 91.7%, lower, but still alarmingly high.
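To make the overall idea concrete, here is a deliberately simplified, hypothetical sketch of sound-based key identification: synthetic clicks with made-up, key-specific frequencies stand in for real recordings, and a nearest-centroid classifier stands in for the study's deep model. None of these numbers come from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
sr = 16_000
# Hypothetical per-key dominant frequencies -- an illustrative assumption,
# not measurements from the study.
keys = {"a": 1_800, "s": 2_400, "d": 3_100}

def keystroke(freq):
    """Synthetic damped click at a key-specific frequency, plus mic noise."""
    t = np.arange(sr // 20) / sr
    return (np.exp(-80 * t) * np.sin(2 * np.pi * freq * t)
            + 0.05 * rng.standard_normal(t.size))

def features(sig):
    return np.abs(np.fft.rfft(sig))      # coarse spectral fingerprint

# 25 recordings per key, mirroring the study's data-collection protocol
centroids = {k: np.mean([features(keystroke(f)) for _ in range(25)], axis=0)
             for k, f in keys.items()}

def classify(sig):
    f = features(sig)
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))

hits = sum(classify(keystroke(f)) == k
           for k, f in keys.items() for _ in range(20))
print(f"accuracy on held-out clicks: {hits / 60:.0%}")
```

Real keyboards differ far less neatly than these synthetic clicks, which is why the study needed a trained deep model rather than simple spectral matching.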
Second: What are the risks of this technology?
The researchers said that with the increasing use of video-conferencing tools such as Zoom, the proliferation of devices with built-in microphones, and the rapid development of artificial-intelligence technology, these attacks could easily harvest large amounts of user data: passwords, chat messages, and other confidential information.
Unlike other side-channel attacks, which require special conditions and are constrained by data rate and distance, acoustic attacks have become far simpler thanks to the abundance of devices with microphones capable of high-quality recording, especially given the rapid progress of machine learning on audio.
This is not, of course, the first study of sound-based cyberattacks: several earlier works have shown how the microphones of smart devices and voice assistants such as Alexa, Siri, and Google Assistant can be exploited. The real danger this time, however, lies in the accuracy of the AI models.
The researchers said the models used in their study relied on the most sophisticated methods applied to this type of attack so far and achieved the highest accuracy recorded to date, adding:
“These attacks and patterns will become more specific over time, and as smart devices with microphones become more common in homes, there is an urgent need to publicly discuss how to regulate AI.”
Third: How can the risks of this technology be minimized?
The researchers recommend that users worried about these attacks change the pattern in which they type passwords, for example using the Shift key to mix uppercase and lowercase letters with numbers and symbols, which makes the full keystroke sequence harder to reconstruct.
They also recommend using biometric authentication or a password-manager app, so that sensitive information never has to be typed manually.
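A password manager typically generates and autofills a random mixed-character secret, so nothing predictable is ever typed at all. A minimal sketch of such generation using Python's standard `secrets` module, illustrative rather than any particular app's implementation:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Random password mixing cases, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Retry until every character class is represented
        if (any(c.islower() for c in pw) and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in string.punctuation for c in pw)):
            return pw

print(generate_password())
```

Using `secrets` rather than `random` matters here: it draws from the operating system's cryptographic randomness source, which is what password-style secrets require.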
Other potential countermeasures include software that plays back fake keystroke sounds, or adding white noise to mask the sound of keyboard keys being pressed.
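The white-noise idea can be sketched numerically: mixing broadband noise over a keystroke recording drives the signal-to-noise ratio below 0 dB, burying the spectral features a classifier would rely on. The click and noise levels below are synthetic assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
sr = 16_000
t = np.arange(sr // 10) / sr

# Synthetic keystroke click (stand-in for a real recording)
click = np.exp(-60 * t) * np.sin(2 * np.pi * 2_500 * t)

# White-noise masking: mix broadband noise over the keystroke audio
noise = 0.5 * rng.standard_normal(t.size)
masked = click + noise

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels."""
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))

print(f"SNR after masking: {snr_db(click, noise):.1f} dB")  # well below 0 dB
```

In practice the noise must cover the keystrokes' full frequency range; narrowband hums leave the distinctive high-frequency click components exposed.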
Beyond the measures proposed by the researchers, a Zoom spokesperson, commenting on the study to BleepingComputer, advised users to manually adjust the app's background-noise suppression feature to reduce the intensity of keystroke sounds, to mute their microphone by default when joining a meeting, and to mute it while typing during meetings, all of which help protect their information from such attacks.