Researchers in the UK claim to have translated the sound of laptop keystrokes into their corresponding letters with 95 percent accuracy in some cases.

That 95 percent figure was achieved with nothing but a nearby iPhone. Remote methods are nearly as effective: over Zoom, the accuracy of recorded keystrokes dropped only to 93 percent, while Skype calls were still 91.7 percent accurate.

In other words, this is a side channel attack with considerable accuracy, minimal technical requirements, and a ubiquitous data exfiltration point: microphones, which are everywhere, from our laptops to our wrists to the very rooms we work in.
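
The researchers reportedly train a deep model on spectrogram images of isolated keystroke recordings. As a rough illustration of the pipeline's shape only (not their model), the sketch below uses a random-forest classifier over simple log-spectral features; the `clips/<key>/<n>.wav` dataset layout is hypothetical.

```python
# Toy sketch of an acoustic keystroke classifier (NOT the researchers' model):
# turn each isolated keystroke clip into spectral features, then train a classifier.
# Assumes labelled clips already exist under the hypothetical layout clips/<key>/<n>.wav.
import glob
import os

import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def features(path, n_bins=64):
    """One keystroke clip -> fixed-length log-spectral feature vector."""
    rate, audio = wavfile.read(path)
    audio = audio.astype(np.float32)
    if audio.ndim > 1:                       # mix stereo down to mono
        audio = audio.mean(axis=1)
    _, _, sxx = spectrogram(audio, fs=rate, nperseg=256)
    profile = np.log1p(sxx).mean(axis=1)     # average over time -> per-frequency profile
    profile = profile[:n_bins]
    return np.pad(profile, (0, n_bins - len(profile)))


X, y = [], []
for path in glob.glob("clips/*/*.wav"):                   # hypothetical dataset layout
    X.append(features(path))
    y.append(os.path.basename(os.path.dirname(path)))     # folder name = key label

X_train, X_test, y_train, y_test = train_test_split(np.array(X), y, test_size=0.2)
clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```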

  • paraphrand · 1 year ago

    Neat, so when my friends are talking about satisfyingly clackety keyboards I can inform them it’s a security hazard.

          • @Wilzax · 1 year ago

            ??? If you can map sound to QWERTY keystroke placement, then it’s a simple matter of monoalphabetic substitution for other layouts to generate candidate texts. Using a dictionary attack to find more candidate layouts would absolutely work.
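
A minimal sketch of that substitution idea, assuming the attack recovers key positions reported as QWERTY legends: re-reading the same positions under a candidate layout is a character-for-character translation, and a word list can score the candidates. The layouts are trimmed to the 26 letter keys; the word list and the recovered string are made up for illustration.

```python
# Toy sketch of the layout-substitution idea above. Assumes the attack recovers
# key POSITIONS, reported as QWERTY legends; reading the same positions under
# another layout is then a simple character-for-character translation.
QWERTY  = "qwertyuiopasdfghjklzxcvbnm"
DVORAK  = "',.pyfgcrlaoeuidhtn;qjkxbm"   # same physical positions, Dvorak legends
COLEMAK = "qwfpgjluy;arstdhneizxcvbkm"   # same physical positions, Colemak legends

WORDS = {"the", "attack", "password", "secret", "keyboard"}   # stand-in dictionary


def remap(observed, layout):
    """Re-read QWERTY-decoded key positions under a candidate layout."""
    return observed.translate(str.maketrans(QWERTY, layout))


def dictionary_score(text):
    """Fraction of whitespace-separated tokens that appear in the word list."""
    tokens = text.split()
    return sum(t in WORDS for t in tokens) / max(len(tokens), 1)


observed = "kjd akkaiv"   # hypothetical output of a QWERTY-trained recognizer
for name, layout in [("qwerty", QWERTY), ("dvorak", DVORAK), ("colemak", COLEMAK)]:
    candidate = remap(observed, layout)
    print(f"{name:8s} {candidate!r}  score={dictionary_score(candidate):.2f}")
# dvorak scores 1.00 here: the made-up observation corresponds to "the attack"
# typed on a Dvorak layout, while the other layouts yield gibberish.
```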

            • @[email protected]
              link
              fedilink
              English
              71 year ago

              No, all the timings change. You can’t just swap out the letters and hope it matches. Additionally, I was responding to the poster claiming a dictionary attack on a password would work; it only works if the password is in the dictionary.

              • @Wilzax · 1 year ago

                The method is not based on timings. It is based on identifying the unique sound profile of each keystroke.

                • @[email protected]
                  link
                  fedilink
                  English
                  51 year ago

                  How can you make that claim? They used deep learning; does anyone know what characteristics the AI is using?

                  • @Wilzax · 1 year ago

                    I can only make that claim with as much confidence as yours that a non-standard keyboard layout would protect you. By your own argument, how can you claim that using a different keyboard layout will protect you? Everyone has a different typing style and every keyboard has a different sound profile, so clearly the AI needs to sample enough data from you, train individually on audio collected of your typing, and compare the output against a dictionary. How do you know that the characteristics of the keyboard layout play an identifying role in this kind of attack when the AI has no concept of which sounds correspond to which keys until it has done its training?

                  • @rhandyrhoads · 1 year ago

                    A pretty simple deep learning approach would be to take a large sample and first identify the individual key sounds. From there it can start associating the most common sounds with the most common letters, swapping assignments around until dictionary words start coming out. Once it can identify individual keys, you could even brute force the rest in a pretty reasonable timeframe. The keyboard layout is the least important part, because the individual key sounds will vary keyboard by keyboard and potentially even user by user. If you used a password without dictionary words, and a different keyboard layout exclusively for entering that password, that would likely defeat this sort of attack.
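
A minimal sketch of the frequency-analysis step described above, assuming the keystroke sounds have already been grouped into anonymous clusters (one ID per physical key): rank the IDs by how often they occur, assign letters by English letter frequency as a first guess, and then refine. The cluster IDs below are made up.

```python
# Toy sketch of the frequency-analysis step described above (not the paper's
# method). Input: a stream of anonymous key IDs, e.g. cluster labels obtained by
# grouping similar-sounding keystrokes. Assumes at most 26 distinct IDs.
from collections import Counter

# English letters ordered roughly from most to least frequent.
ENGLISH_BY_FREQ = "etaoinshrdlcumwfgypbvkjxqz"


def frequency_guess(key_ids):
    """First-pass mapping: match ID frequency rank to letter frequency rank."""
    ranked = [kid for kid, _ in Counter(key_ids).most_common()]
    mapping = {kid: ENGLISH_BY_FREQ[i] for i, kid in enumerate(ranked)}
    return "".join(mapping[k] for k in key_ids), mapping


# Made-up cluster labels standing in for a recorded typing session.
ids = [4, 0, 1, 4, 0, 2, 4, 3, 0, 1, 4]
guess, mapping = frequency_guess(ids)
print(guess, mapping)
# A real attack would refine this first guess, e.g. by swapping letter
# assignments whenever a swap makes more dictionary words appear in the output.
```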

    • @Duamerthrax · 1 year ago

      Good luck making an acoustic map of the tens of thousands of possible case, switch, and keycap combinations.

    • @Evotech · 1 year ago

      Middle management will finally get rid of clacky keyboards with this weird trick