- cross-posted to:
- [email protected]
- [email protected]
E2E encrypted communication can be used for nefarious things, that’s a fact. But it’s something that needs to be standardized, because the accessibility of any and all private communication or information to so few individuals can be used so much more nefariously. Really wish people were more concerned about data privacy. It’s not about how your data will be used against you… It’s about how OUR data is used against US.
Good. Despite all the mistakes they make, at least Apple seems to be willing to learn from some of ‘em and stand up for their users (even if only a little).
I actually don’t think this has anything to do with standing up for their users but is a simple cost/benefit analysis: building compromised E2E-communication that is still reasonably secure against bad actors is much more difficult (if not impossible) than building robust E2E-communication. Apple just doesn’t want to lose business users over headlines like “iOS messaging used by Chinese spies to steal US trade secrets”, while headlines about how difficult it is for government agencies to unlock iPhones probably drive sales. Nothing moral or ethical here, only profits.
I mean, it’s still standing up for their users even though it’s profit-driven.
They just launched a whole ad campaign based around iMessage encryption as well… not supporting it would be a bad look and a waste of ad dollars.
lol agreed, plus the whole CSAM mess that they can help bury with this.
Yep. Honestly, if there’s a good profit motive to do the right thing, I trust companies far more to actually do the right thing. We WANT there to be profit in doing the right thing. When there isn’t, they don’t.
Exactly. Greed is far more predictable and transparent than morality.
Didn’t Apple try to introduce this and got a ton of flak from all sorts of privacy “experts”? They then scrapped their plans, did they not? How is this any better/different? Any sort of “backdoor” into encryption means that the encryption is compromised. They tackled this in 2014 in the US. Feels like deja vu all over again.
@generalpotato Ish. I read the technical write up and they actually came up with a very clever privacy-focused way of scanning for child porn.
First, only photos were scanned and only if they were stored in iCloud.
Then, only cryptographic hashes of the photos were collected.
Those hashes were compared against cryptographic hashes of known child porn images, images which had to be in the databases of multiple non-governmental organizations; so, if an image was only in the database of, say, the National Center for Missing and Exploited Children, or only in the database of China’s equivalent, its cryptographic hash couldn’t be used. This requirement would make it harder for a dictator to slip in a hash to hunt for dissidents, by making it substantially more difficult to get an image into enough databases.
Even then, an Apple employee would have to verify actual child porn was being stored in iCloud, and only after 20 separate images were flagged. (The odds of an innocent person even incorrectly making it to this stage were estimated to be something like one false positive a year, I think, because of all of the safeguards Apple had.)
Only after an Apple employee confirmed the existence of child porn would the iCloud account be frozen and the relevant non-government organizations alerted.
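To make the safeguards concrete, here’s a toy sketch of the flagging pipeline described above. Everything here is illustrative (the database names and hash values are made up); only the 20-image threshold and the multiple-database requirement come from Apple’s write-up.

```python
THRESHOLD = 20  # flagged matches required before any human review (per Apple's summary)

def usable_hashes(*ngo_databases):
    """A hash is only actionable if it appears in every independent NGO
    database supplied (intersection, not union)."""
    result = set(ngo_databases[0])
    for db in ngo_databases[1:]:
        result &= set(db)
    return result

def review_needed(user_photo_hashes, *ngo_databases):
    """Human review only triggers once enough cross-database matches pile up."""
    matches = set(user_photo_hashes) & usable_hashes(*ngo_databases)
    return len(matches) >= THRESHOLD

# Hypothetical databases: a hash present in only one of them is never usable.
ncmec = [f"h{i}" for i in range(30)]
intl_equivalent = [f"h{i}" for i in range(10, 40)]
print("h0 usable?", "h0" in usable_hashes(ncmec, intl_equivalent))  # False
```

The point of the intersection is exactly the dictator scenario: one compromised database isn’t enough to smuggle in a hash.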
Honestly, I have a better chance of getting a handjob from Natalie Portman in the next 24 hours than an innocent person being incorrectly reported to any government authority.
Great writeup! I tried searching but came up short, do you have a link to the technical documentation?
It’s on Apple’s website. Here’s the link: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
Ty <3
From a technical perspective, how much would an image need to be changed before the hash no longer matched? I’ve heard of people including junk .txt files in repacked and zipped pirated games, movies, etc., so that they aren’t automatically flagged for removal from file sharing sites.
I am not a technical expert by any means, and I don’t even use Apple products, so this is just curiosity.
It depends on the type of hash. For the type used by checksums, a single changed byte is enough, because they’re cryptographic hashes, and the intent is to identify whether files are exact matches.
However, the type of hashing used for CSAM detection is a perceptual hash (sometimes called a “semantic” hash; Apple’s implementation is named NeuralHash). The intent of this type of hash is that similar content produces a similar (or identical) output. I can’t walk you through exactly how the hash is computed, but it is designed specifically so that minor alterations don’t prevent identification.
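A toy stdlib sketch of the difference. The “average hash” below is a deliberately crude stand-in for a perceptual hash (real systems like NeuralHash are far more sophisticated), but it shows the contrast with a cryptographic hash:

```python
import hashlib

# Cryptographic hash: flipping one byte changes the digest completely.
a = b"exact same file contents"
b = b"exact same file contentS"  # one byte differs
print(hashlib.sha256(a).hexdigest() == hashlib.sha256(b).hexdigest())  # False

# Toy perceptual hash of a grayscale "image": one bit per pixel, set if
# the pixel is brighter than the image's average brightness.
def average_hash(pixels):
    avg = sum(pixels) / len(pixels)
    return tuple(p > avg for p in pixels)

img  = [10, 200, 30, 220, 40, 210, 20, 230]
img2 = [12, 198, 28, 221, 41, 209, 22, 229]  # slightly altered pixel values
print(average_hash(img) == average_hash(img2))  # True: still reads as "the same" image
```

So a few tweaked pixels defeat the cryptographic hash but not the perceptual one, which is the whole design goal.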
If, for instance, I were pirating a video game, would packing it in an encrypted container along with a GB or two of downloaded YouTube videos be sufficient to defeat semantic hashing? What about taking that encrypted volume and spanning it across multiple files?
Encrypting it should be enough to defeat either hash.
Without encryption I think it would depend on implementation. I’m not aware of the specific limitations of the tools they use, but it’s for photo/video and shouldn’t really meaningfully generalize to other formats.
That’s a good question. First, it’s important to understand that the hash functions used to detect pirated games or other programs are actually different from the hash functions used to detect media like pictures, movies, and sound recordings.
If you alter a piece of code or text from the original version, the hashes will no longer match. For software, hashing is usually about integrity: the hashes are supposed to match the original, and some kind of alarm gets tripped when they don’t.
With media files like music, movies, or pictures, it works the other way around. Detection tools are looking for something that is not necessarily an exact match, but a very close match, and when such a match is found, alarms get tripped (because it’s CSAM, or a copyright violation, or something like that).
As to the technique you mentioned of concealing a pirated game in a ZIP file with a bunch of junk TXT files, that’s not going to reliably work. The junk only changes the checksum of the archive itself; since the ZIP isn’t encrypted, a scanner can simply unpack it and hash each file inside individually, junk and all, so the original game files are still identifiable. Now, if you use your ZIP/compression tool to actually encrypt the archive with a good algorithm and a strong password, that’s different, but then you don’t need to pack it with junk at all. (And distributing the password securely becomes the new problem.)
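A quick stdlib demonstration of why junk padding buys so little (file names and contents are made up): the archive’s checksum changes, but an unencrypted ZIP can be unpacked and each member hashed on its own.

```python
import hashlib
import io
import zipfile

game = b"pretend these are the game's bytes"

def build_zip(with_junk: bool) -> bytes:
    """Build an in-memory ZIP containing the game, optionally padded with junk."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("game.bin", game)
        if with_junk:
            z.writestr("junk.txt", b"random padding " * 100)
    return buf.getvalue()

plain, padded = build_zip(False), build_zip(True)

# Whole-archive hashing is fooled by the junk file...
print(hashlib.sha256(plain).digest() == hashlib.sha256(padded).digest())  # False

# ...but a scanner that looks inside the (unencrypted) archive still finds
# the original game bytes, hash and all.
inner = zipfile.ZipFile(io.BytesIO(padded)).read("game.bin")
print(hashlib.sha256(inner).digest() == hashlib.sha256(game).digest())  # True
```

Encrypting the archive with a strong password is what actually breaks that second check, since the scanner can no longer read the members.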
Please, people who know more about hashing and media detection with hashing, let me know if I got something wrong, I probably did.
Haha! Thanks for the excellent write-up. Yes, I recall Apple handled CSAM this way and went out of its way to try to convince users it was still a good idea, but it still faced a lot of criticism for it.
I doubt this bill will be as thorough, which is why I asked the question I did. Apple could technically comply using some of the work it has already done, but it’s sort of moot if things are end-to-end encrypted.
It would have worked, and it would have protected privacy, but most people don’t understand the difference between having a hash of known CSAM on your phone for comparison purposes and having actual CSAM on your phone, and it freaked people out.
I understand the difference and I’m still uncomfortable with it, not because of the proximity to CSAM but because I don’t like the precedent of anyone scanning my encrypted messages. Give them an inch, etc.
I’m assuming it’s either Apple not wanting to be told to do it, or them having “learned their lesson” and no longer supporting it; they seem to be leaning quite heavily into privacy.
Apple wants money for spying on their users; this bill would compel them to do it without the secret money they’re getting now, so they’re against it.
Nah, Apple is one of the few companies around that is big on privacy and uses privacy as a differentiator for its products. Look at some of the other responses; it’s more complex than them just wanting money. They already make a boatload of it.
To me, this seems like such a transparent attempt to force the tech companies to have a backdoor. If they can scan for CSAM, they can scan (or copy) anything else the government wants.
That’s very likely the actual goal. Stopping child abuse is only an excuse, one governments keep pulling out whenever they want to push anti privacy legislation. And it’s clear that this would do nothing to stop it either, because then abusers just wouldn’t use compromised services from big companies.
Well yeah, even if they aren’t good at it and are sort of hypocritical about it, appearing to believe the “what happens on iPhone stays on iPhone” philosophy is important to them.
I wouldn’t say they’re hypocritical. I was in complete shock that they actually scrapped their iPhone scanning plans and now offer E2E for most of iCloud. They aren’t perfect but they definitely are better than most companies
Yeaaaaah maybe read up on some of the E2E stuff. Someone else at the top of this post posted a link to how it works.
Yeah but his point stands. Here’s the summary:
- If you sync iMessage via iCloud but don’t use iCloud Backup, it’s E2E encrypted
- If you have iCloud Backup enabled without Advanced Data Protection, iMessage effectively isn’t E2E encrypted, because the backup includes a copy of the key
- If you have Advanced Data Protection turned on for iCloud and you’re using iCloud Backup, iMessage is E2E encrypted however you look at it
It’s generally good practice to not use iCloud backups but rather back it up yourself, however, most people don’t care enough.
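The bullets above boil down to one condition. A tiny sketch (my simplification of Apple’s documented behavior, not any official logic):

```python
def imessage_e2e(icloud_backup_enabled: bool, adp_enabled: bool) -> bool:
    """iMessage sync stays effectively E2E unless a non-ADP iCloud backup
    holds a recoverable copy of the key."""
    return adp_enabled or not icloud_backup_enabled

print(imessage_e2e(icloud_backup_enabled=False, adp_enabled=False))  # True
print(imessage_e2e(icloud_backup_enabled=True,  adp_enabled=False))  # False
print(imessage_e2e(icloud_backup_enabled=True,  adp_enabled=True))   # True
```

In other words, Advanced Data Protection is the only setting that makes the backup path safe.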
Cool. Now explain that to the average user.
It’s generally a question of what’s best for the user. Your average user would likely be more upset about losing all their messages because they forgot their password than they would be comforted by the fact that no one else can read the data. Same for photos and files. For sensitive categories such as health and passwords, though, the data is always end-to-end encrypted, because it’s judged worse for anyone else to get that data than for the user to lose it.
For anyone who truly wants complete encryption, there is Advanced Data Protection, but for general users the defaults are a good balance between security and ease of use.
Yeah, all of that is true. But people who buy iPhones and assume, because the marketing said so, that they’re perfectly secure are worryingly ignorant.
Man, I was hoping that by moving away from Reddit I could move away from the pure hate for Apple, whatever the reason. Show me how the other mobile OS is making things any better?
I use an iPhone 12. I’m not going to defend Android because I don’t use it. I’m just not under the illusion of whatever Apple marketing distills complex problems down to, for better or worse, and being disillusioned isn’t “hate”, it’s awareness. Hate is something I reserve for my mother and father. This is just a goddamn phone.
Moreover, being less bad than the other guy doesn’t make you not bad. Your whataboutism is weak tea.
Except “less bad” is better than bad when you have 2 choices. I’d love to know where Apple has blown it on their privacy record. And don’t try to bring up the CSAM shit, because they walked that back when they realized their userbase didn’t want it.
Top of this post. Someone posted a link to Apple’s website explaining the stipulation regarding iMessage and iCloud E2E.
Stop being so defensive.
Good
Thanks Apple
deleted by creator
Isn’t E2EE messaging intended to encrypt data in transit between devices, not provide global security? The default iCloud backups are still encrypted, but the key is stored/recoverable by Apple. This is the ideal sort of encryption for 99%* of the population. For the 1%, there’s an option to forgo the Apple-stored key.
* yes, I made that number up. I will stand by it.
If you did a random survey asking users whether they want a secure, end-to-end encrypted backup, they will say yes. Overwhelmingly. But if you frame the question around the outcome of a forgotten password: “If you lose or damage your phone and have forgotten your Apple account password, would you like all of your iCloud photos, messages, emails, contacts, and documents to be instantly destroyed and unrecoverable, or should Apple be able to restore everything if you can prove it’s your account?”, they will almost certainly choose the latter.
If a user wants more protection, there is “Advanced Data Protection”, which fully encrypts all iCloud data. However, Apple knows you might lose your password, so they require a recovery method before turning it on and make sure you know Apple won’t be able to help you if you lose both your password and your recovery method.
Also, for certain sensitive data such as health data or passwords, full end-to-end encryption is enabled even in standard mode, as it’s judged worse for someone else to get access to that data than for you to lose it, whereas losing your photos is generally worse than someone else getting access to them.
deleted by creator
Are the icloud messages stored in plain text?
No, they’re encrypted. But Apple stores a copy of your key because most people forget their Apple password at some point (usually after they’ve wiped their phone and are setting up a new one) and need Apple to reset their password/re-enable their encryption key on the new device.
Huh the more you know…
Also know that if you still want iCloud backups but want everything stored encrypted, you can enable “Advanced Data Protection”, which means Apple doesn’t store the encryption key. You do need to set up a recovery method, such as a recovery key or recovery contact; if you lose both your device and your recovery method, your data is forever lost, and Apple can’t help you the way it can in standard data protection mode.
Also note that certain sensitive categories such as health and passwords are always end-to-end encrypted, as it’s judged worse for someone else to get access to that data than for the user to lose it, whereas a user losing their photos and messages because they forgot their password is generally worse than a hacker resetting the password and getting access.
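To put the two modes side by side, here’s a toy model of the trade-off (the class and key strings are entirely made up, not Apple’s design): standard protection escrows a copy of your key so a forgotten password is recoverable; Advanced Data Protection doesn’t.

```python
class ICloudAccount:
    """Toy sketch: escrow a key copy in standard mode, none under ADP."""

    def __init__(self, adp: bool = False):
        self.adp = adp
        self.user_key = "user-device-key"
        # Standard data protection: Apple holds a recoverable copy.
        self.escrowed_key = None if adp else self.user_key

    def recover_after_forgotten_password(self) -> str:
        if self.escrowed_key is None:
            raise RuntimeError("ADP enabled: data unrecoverable without "
                               "the user's own recovery key/contact")
        return self.escrowed_key

print(ICloudAccount(adp=False).recover_after_forgotten_password())  # recovery works
try:
    ICloudAccount(adp=True).recover_after_forgotten_password()
except RuntimeError as err:
    print("recovery failed:", err)
```

Same data, same encryption; the only question is who else holds a key.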
deleted by creator
At this point, trying to kill encrypted messaging is equivalent to banning meetups to stop criminals from meeting up.
I think law enforcement should be able to intercept messages on services like WhatsApp, if someone is suspected of criminal activity.
Is it right for criminals to be able to share child abuse material, or plans for terrorism, over something like WhatsApp, without law enforcement being able to intercept those messages?
I think law enforcement can break into your home if they have a court warrant, right? So why not allow the same thing with electronic communications?
It’s simple.
If it’s possible for WhatsApp to intercept the communications of “bad people” for law enforcement, it’s fundamentally impossible for any communication to be private. The existence of a back door is automatically a gaping security flaw.
There’s no such thing as “securely intercepting” messages. Either they’re secure against all actors or they’re not secure.
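A toy illustration of why (XOR with a shared key here is a stand-in, not real cryptography): if the service holds a single “lawful access” key that can open any conversation, then whoever steals that one key opens every conversation at once.

```python
import secrets

# Hypothetical escrow key held by the service "for law enforcement".
ESCROW_KEY = secrets.token_bytes(32)

def encrypt(msg: bytes) -> bytes:
    # Toy XOR "encryption" keyed by the escrow key (for illustration only).
    return bytes(m ^ k for m, k in zip(msg, ESCROW_KEY))

alice = encrypt(b"private chat")
bob   = encrypt(b"also private")

# One leaked key exposes every user simultaneously:
print(bytes(c ^ k for c, k in zip(alice, ESCROW_KEY)))  # b'private chat'
print(bytes(c ^ k for c, k in zip(bob, ESCROW_KEY)))    # b'also private'
```

With genuine E2E, there is no such master key to leak in the first place; the endpoints alone hold the secrets.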
Maybe it’s worth having that security hole then. I think it’s a bit crazy that terrorists or child abusers can plan their crimes using WhatsApp without the police being able to intercept their messages.
Also, if we’re able to contact our banks over the internet securely (and obviously the bank can still see everything about our accounts if they want, while criminals hopefully won’t be able to), then surely an equivalent should be possible for things like WhatsApp.
Maybe it’s worth having that security hole then. I think it’s a bit crazy that terrorists or child abusers can plan their crimes using WhatsApp without the police being able to intercept their messages.
Encryption exists. Terrorists and child abusers will use it whether WhatsApp or Apple or whoever implement it or not. Stopping those implementations is just denying privacy to regular users.
Also, if we’re able to contact our banks over the internet securely (and obviously the bank can still see everything about our accounts if they want, while criminals hopefully won’t be able to), then surely an equivalent should be possible for things like WhatsApp.
Law enforcement can’t eavesdrop on your encrypted connection to your bank. If they need to know about your banking activity, they rely on the bank reporting it to them.
Ok, so a basic question you should be able to answer: how do you stop a foreign government from spying on another country’s citizens? WhatsApp is not just a western-world app; it’s used in Russia as well as the US and the UK. If Putin went to Meta and said “I want everything you have on ex-Prime Minister of the United Kingdom Boris Johnson, and you can’t tell him,” what reason would Meta have to deny the request, if the precedent set by the UK is that this data needs a back door? If you say the user should be notified, then anyone under investigation simply won’t say anything incriminating. And if it covers old messages, you risk political espionage over anything shared under the assumption it was end-to-end encrypted. What about trade secrets? A corrupt government official could grab a company’s trade secrets for a business friend from anywhere in the world.
There is a great video by Tom Scott that talks about this exact situation when the UK tried to break encryption 5 years ago but that failed because it wasn’t feasible from a security standpoint. There is also a great episode from Last Week Tonight talking about encryption and government attempts to get around it. We’ve seen from things like the Pegasus malware that repressive governments will use this little break in encryption to jail protestors and journalists and spy on their political rivals, having an official way will just make it easier.
I think law enforcement can break into your home if they have a court warrant, right? So why not allow the same thing with electronic communications?
For me, the reason to disallow it is the potential for abuse. There were 864 search warrant applications across all federal agencies in 2022. In 2020, the FBI, specifically, issued 11504 warrants to Google, specifically, for geofencing data, specifically. Across all agencies there are probably millions of such “warrants” for data.
It’s far easier to access your data than your house, so comparing physical and cybersecurity doesn’t really make sense.
In general, criminals can easily just move to an uncompromised platform to do illegal stuff. But giving the govt easy access to messaging data allows for all kinds of dystopic suppression for regular people.
Generally, tech companies now have agreements with law enforcement so they don’t have to deal with all the legal mumbo jumbo. Some data does still require a warrant, such as data covered by protection laws (HIPAA-protected data, for example) or data the company considers highly sensitive, but for a lot of data it’s easier to just hand it over than to get legal involved.
Here are 10 reasons why privacy is important https://www.humanrightscareers.com/issues/reasons-why-privacy-rights-are-important/