The clicking of your keyboard may be enough for AI to steal your data, even over a video call

AI trained on keyboard audio alone was 95% accurate at predicting what the user typed.

What you need to know

The future of AI is now, and it’s bringing with it some really weird cyberattacks. A team of researchers at Cornell University recently published a study detailing a hypothetical cyberattack that involves training AI to recognize a user’s input from the audio of their keystrokes. Using audio and sonic surveillance to scrape data is known as an “acoustic side channel attack,” and while using audio to steal sensitive information is not new, pairing it with AI is a leap that makes the technique much more efficient.

According to the research team behind the project, the attack could easily use everyday technology like a cell phone microphone or Zoom recordings to acquire the training audio, which is then fed into an AI algorithm that analyzes the sound and translates it into readable text. With an AI properly trained on the keyboard being used, the model was capable of predicting what the user had typed with 95% accuracy, though this dropped to 93% when Zoom recordings were used to train the AI.

To create the hypothetical cyberattack, the research team pressed 36 keys on a MacBook Pro 25 times each with varying amounts of pressure while recording the sound of each keystroke. The 25 audio samples were then combined into one file for each key and fed to an AI algorithm.
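The shape of that pipeline — 36 keys, 25 labeled recordings per key, combined into a per-key training set for a classifier — can be sketched in miniature. The snippet below is purely illustrative and is not the researchers' code: it stands in a short synthetic feature vector for real keystroke audio (an actual attack would use spectrogram features and a deep-learning model) and uses a simple nearest-centroid classifier to show how labeled keystroke samples become a prediction model.

```python
# Toy illustration of the data-collection step described above: each of
# 36 keys is sampled 25 times, the samples are combined per key, and a
# classifier then predicts keys from new "recordings". All names and
# features here are hypothetical stand-ins for real audio processing.
import random
import string

KEYS = list(string.ascii_lowercase + string.digits)[:36]  # 36 keys
SAMPLES_PER_KEY = 25

def fake_keystroke_audio(key, rng):
    """Stand-in for one recorded keystroke: a short feature vector
    whose values depend (noisily) on which key was pressed."""
    base = KEYS.index(key)
    return [base + rng.gauss(0, 0.3) for _ in range(8)]

def centroid(vectors):
    """Average the 25 samples for a key -- the 'combined into one
    file for each key' step from the article, in miniature."""
    return [sum(col) / len(col) for col in zip(*vectors)]

def train(rng):
    return {k: centroid([fake_keystroke_audio(k, rng)
                         for _ in range(SAMPLES_PER_KEY)])
            for k in KEYS}

def predict(model, sample):
    """Guess the key whose centroid is closest to the new sample."""
    def dist(key):
        return sum((a - b) ** 2 for a, b in zip(sample, model[key]))
    return min(model, key=dist)

rng = random.Random(0)
model = train(rng)
hits = sum(predict(model, fake_keystroke_audio(k, rng)) == k for k in KEYS)
print(f"{hits}/{len(KEYS)} keys recovered")
```

The point of the sketch is only the structure: once an attacker has enough labeled per-key recordings, predicting keystrokes becomes an ordinary classification problem — which is exactly why the quality of the training audio (direct microphone vs. Zoom) changes the accuracy.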

You don’t need to throw out your favorite mechanical keyboard just yet. There were ways to thwart the AI, including adding extra keystroke sounds when possible or using third-party software to produce noise that muddies any audio that could be used to train an AI. Varying text case and randomizing your typing style could also help, though changing something like your very manner of typing may be easier said than done.

Swapping to biometric protections for your data is certainly going to be an easier method for most users. Those who are genuinely concerned about an acoustic side channel attack could opt to use a touch screen keyboard, as the study showed that doing so could lower the AI's accuracy to as little as 40%.

Analysis: Not your biggest security concern

It’s always a good idea to stay informed about cybersecurity, but when it comes to something like acoustic keyboard attacks, the average typist is unlikely to need to worry. If you deal with highly sensitive information, you may want to be a little more cautious than the average user, but there are certainly easier methods of data scraping than going through the hassle of training an AI on a person’s specific keyboard and typing behavior in order to steal data.

Cole is the resident Call of Duty know-it-all and indie game enthusiast for Windows Central. She’s a lifelong artist with two decades of experience in digital painting, and she will happily talk your ear off about budget pen displays.