- cross-posted to:
- [email protected]
the company says that Recall will be opt-in by default, so users will need to decide to turn it on
The AI scans all those screenshots visually and tags them for search later, so, for example, an artist could pull up a file whose location they don't remember, out of thousands of folders, just by typing a description of it. That's actually awesome. I imagine lots of people could come up with really useful ways to use something like that. I mean, if it weren't an Orwellian nightmare.
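The search side is conceptually pretty simple. Here's a rough sketch of the idea in Python; the file names and tags are made up, and the plain keyword match just stands in for whatever model actually does the tagging and ranking:

```python
# Minimal sketch of "find a screenshot by describing it".
# The tags here are hypothetical; a real system would get them from
# OCR / an image model, not a hand-written dict.
from typing import Dict, List

# screenshot path -> tags the model attached to it
index: Dict[str, List[str]] = {
    "shots/2024-05-01_10-31.png": ["krita", "dragon", "sketch", "purple"],
    "shots/2024-05-02_14-02.png": ["email", "invoice", "pdf"],
    "shots/2024-05-03_09-15.png": ["dragon", "painting", "final", "krita"],
}

def search(query: str) -> List[str]:
    """Rank screenshots by how many query words appear in their tags."""
    words = set(query.lower().split())
    scored = [(len(words & set(tags)), path) for path, tags in index.items()]
    return [path for score, path in sorted(scored, reverse=True) if score > 0]

print(search("that purple dragon sketch"))
# -> ['shots/2024-05-01_10-31.png', 'shots/2024-05-03_09-15.png']
```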
Features like this can almost never be privacy-friendly because they're developed expressly to violate your privacy. The value it provides you, as cool as that could be, is just how it's sold.
Yeah, it sounds like it might actually be a useful feature if it weren't impossible to do securely and in a privacy-respecting way.
I don’t know about impossible. I could see this working on a Linux distro with a local model doing all the work and storing everything encrypted locally. Buuuuuut, it still feels risky! That’s a giant tranche of juicy, searchable data that just begs to be stolen.
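For the "encrypted locally" part, even just keeping the index encrypted at rest would be a start. A rough sketch, assuming the `cryptography` package and with key management deliberately hand-waved (which is, of course, the actual hard part):

```python
# Rough sketch: keep the screenshot index encrypted on disk.
# Key handling is naive on purpose - in reality the key would live
# in a TPM / keyring, not in a file sitting next to the data.
import json
from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("index.key")
DB_FILE = Path("index.enc")

def load_key() -> bytes:
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()
    KEY_FILE.write_bytes(key)
    return key

def save_index(index: dict) -> None:
    token = Fernet(load_key()).encrypt(json.dumps(index).encode())
    DB_FILE.write_bytes(token)

def load_index() -> dict:
    if not DB_FILE.exists():
        return {}
    data = Fernet(load_key()).decrypt(DB_FILE.read_bytes())
    return json.loads(data)

save_index({"shots/2024-05-01.png": ["krita", "dragon", "sketch"]})
print(load_index())
```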
To be fair to Microsoft, this was a local model too, and the data was encrypted (through BitLocker). I just feel like the only way you could even try to secure it would be to lock the user out of the data with some kind of separate storage and processing, because anything the user can do can also be done by malware running as that user. Even then, DRM and how it gets cracked has shown us that nothing like that is truly secure against motivated attackers. Since restricting a user’s access like that won’t happen, and might not even be sufficient, it’s just way too risky.