[–] cobra89@beehaw.org 6 points 1 year ago (1 children)

Um, how exactly do you think these "rogue devices" would exfiltrate that data? Do you think iOS provides Internet access to the Face ID module or the display? Or do you think these devices somehow contain an entire Wi-Fi chipset to connect to the Internet and exfiltrate your data, without anyone noticing an entire extra SoC soldered onto the part?

Please provide an argument for why you think these parts could exfiltrate data over these interfaces. Or do you think iOS's security is so poor that it gives any attached hardware device full network access? (Which I'm pretty sure isn't even physically possible in most cases, since those connectors can only carry the kind of data that particular sensor produces.)

[–] kitonthenet@kbin.social 0 points 1 year ago (1 children)

To exfiltrate a login password captured by a keylogger on a MacBook, for example, you need software running on the CPU as well as on the keyboard itself. That is very difficult to pull off in practice: you have to compromise both devices, and without physical access your exploit has to travel across the keyboard interface. Swapping in any random, potentially malicious keyboard introduces two problems: the keyboard itself may carry a keylogger, and it may be able to exploit some vulnerability in the CPU from the keyboard side. You therefore open two attack surfaces that were previously closed, which is significant.

[–] Zangoose@lemmy.one 2 points 1 year ago (1 children)

If you think keyloggers require software running on your physical keyboard, you're in for a rude awakening.

Keyloggers are almost always pure software, and they're conceptually simple to make. So simple, in fact, that it's essentially the same thing as running a regular application with background shortcuts. The only difference is that regular apps aren't saving/recording anything; they're just listening for you to press cmd+whatever.

It takes maybe 10-15 minutes to make a keylogger in Python that could run on any computer: macOS, Windows, or Linux. Maybe a little longer if you wanted to use a compiled language and properly hide it.
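To make that comparison concrete, here's a minimal sketch of the harmless half of it: a system-wide "background shortcut" listener. It assumes the third-party pynput library (not named in the comment; `pip install pynput`). A keylogger hooks the keyboard through the same OS facilities; it just records every keystroke instead of waiting for one combination.

```python
# Minimal sketch of a global-shortcut listener, assuming the third-party
# pynput library. A keylogger would use the same OS-level keyboard hook,
# but record every event instead of reacting to a single combination.
from pynput import keyboard

def on_activate():
    # A normal app would run some feature here (open a launcher, take a note, ...)
    print("Global shortcut pressed")

# <cmd> is the Command key on macOS and the Super/Windows key elsewhere.
# The listener runs system-wide and blocks here until the process is stopped.
with keyboard.GlobalHotKeys({"<cmd>+<shift>+k": on_activate}) as listener:
    listener.join()
```

(On macOS the process also needs the user to grant Input Monitoring/Accessibility permission before it can observe global key events.)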

Sorry to burst your bubble.

  • A software developer
[–] macaroni1556@lemmy.ca 2 points 11 months ago (1 children)

And what does that have to do with the risk of a screen repair?

I can also install a keylogger on Linux, and I can freely swap the SSD for anything I buy on the internet.

And yet somehow people still use computers!? Madness.

[–] Zangoose@lemmy.one 2 points 11 months ago (1 children)

I think we're on the same page? If an attacker wanted a keylogger, they wouldn't even need to go as far as a screen; there are plenty of other ways (like a third-party keyboard app) that would work just as well, if not better, on an iPhone.

Hell, while we're at it, using a phishing email to get you to enter a password on a fake site, or using social engineering to reset your passwords, is way more effective than reverse engineering and modding a camera/screen.

There's no reason Apple should get to keep exclusive rights over repairs just to profit more on parts. Third-party screens, cameras, Face ID modules, etc. aren't going to suddenly make your phone less secure.

[–] macaroni1556@lemmy.ca 2 points 11 months ago

Ok, agreed we are on the same page! My misunderstanding.

(I thought you were somehow defending the idea that the keylogger risk isn't worth taking with a screen replacement.)