this post was submitted on 02 Dec 2023
399 points (95.2% liked)
Technology
I wonder if you need to train it on a specific keyboard before it will work on it.
Most likely
That would limit the practicality quite a lot, as deskmats and typing style would change the sound of even a common keyboard.
I also notice that I slightly change my typing style between typing normally and entering my password.
Eh... I don't know if it would be enough of a change. Also consider mass produced popular laptops (e.g. targeting the MacBook keyboard).
I don't really think that's normal... But hey, maybe it gives you some protection 🙂
I doubt it. Couldn't Zipf's law be used for this?
The article says they had to record data from the keyboard ahead of time to feed the model. This isn't some magic thing that can listen to any keyboard; it has to be trained for one keyboard at a time. It sounds like a proof that this technique could be used if a malicious actor briefly got physical access to record the training data; from that point on, all you need is a microphone listening.
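The pipeline described above (record labeled keystroke audio from the target keyboard, train a classifier on it, then predict keys from new recordings) can be sketched roughly like this. Everything here is illustrative: synthetic "audio" tones stand in for real keystroke recordings, and a nearest-centroid classifier stands in for the deep model the actual research used.

```python
import numpy as np

def features(clip, n_bins=32):
    # Crude spectral fingerprint of one keystroke clip:
    # magnitude spectrum averaged into coarse frequency bins.
    spec = np.abs(np.fft.rfft(clip))
    usable = spec[: n_bins * (len(spec) // n_bins)]
    return usable.reshape(n_bins, -1).mean(axis=1)

def train(clips_by_key):
    # "Training" here is just storing one mean feature
    # vector (centroid) per key from the labeled recordings.
    return {key: np.mean([features(c) for c in clips], axis=0)
            for key, clips in clips_by_key.items()}

def predict(model, clip):
    # Classify a new clip as the key with the nearest centroid.
    f = features(clip)
    return min(model, key=lambda k: np.linalg.norm(model[k] - f))

# Synthetic stand-in for recorded audio: each key "sounds" like
# a noisy tone at its own characteristic frequency.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2048)

def clip_for(freq):
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(t.size)

keys = {"a": 200.0, "b": 400.0, "c": 800.0}
model = train({k: [clip_for(f) for _ in range(5)] for k, f in keys.items()})
print(predict(model, clip_for(400.0)))
```

The point the sketch makes is the same one the comment makes: the model only knows the keyboards it was trained on. Feed it clips from a keyboard with a different acoustic signature and the centroids no longer match, which is why per-keyboard training data (and hence brief physical or remote recording access) is required.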