• @ilickfrogs · 44 points · 1 year ago

    E2E encrypted communication can be used for nefarious things, that’s a fact. But it’s something that needs to be standardized, because the accessibility of any and all private communication or information to so few individuals can be used so much more nefariously. Really wish people were more concerned about data privacy. It’s not about how your data will be used against you… It’s about how OUR data is used against US.

  • @Dark_Blade · 25 points · 1 year ago

    Good. Despite all the mistakes they make, at least Apple seems to be willing to learn from some of ‘em and stand up for their users (even if only a little).

    • @shinjiikarus · 16 points · 1 year ago

      I actually don’t think this has anything to do with standing up for their users but is a simple cost/benefit analysis: building compromised E2E communication that is still reasonably secure against bad actors is much more difficult (if not impossible) than building robust E2E communication. Apple just doesn’t want to lose business users over headlines like “iOS messaging used by Chinese spies to steal US trade secrets”, while headlines about how difficult it is for government agencies to unlock iPhones probably drive sales. Nothing moral or ethical here, only profits.

      • @Dark_Blade · 12 points · 1 year ago

        I mean, it’s still standing up for their users even though it’s profit-driven.

        • @heirloomvegtattoo · 4 points · 1 year ago

          They just launched a whole ad campaign based around iMessage’s encryption as well… not supporting it would be a bad look and a waste of ad dollars.

          • @Dark_Blade · 2 points · 1 year ago

            lol agreed, plus the whole CSAM mess that they can help bury with this.

        • @3rdBlueWizard · 2 points · 1 year ago

          Yep. Honestly, if there’s a good profit motive to do the right thing, I trust companies far more to actually do the right thing. We WANT there to be profit in doing the right thing. When there isn’t, they don’t.

          • @Dark_Blade · 1 point · 1 year ago

            Exactly. Greed is far more predictable and transparent than morality.

  • @generalpotato · 11 points · 1 year ago

    Didn’t Apple try to introduce this and get a ton of flak from all sorts of privacy “experts”? They then scrapped their plans, did they not? How is this any better/different? Any sort of “backdoor” into encryption means that the encryption is compromised. They tackled this in 2014 in the US. Feels like deja vu all over again.

    • AlexKingstonsGigolo · 12 points · 1 year ago

      @generalpotato Ish. I read the technical write-up and they actually came up with a very clever privacy-focused way of scanning for child porn.

      First, only photos were scanned and only if they were stored in iCloud.

      Then, only cryptographic hashes of the photos were collected.

      Those hashes were checked against cryptographic hashes of known child porn images, images which had to be in the databases of multiple non-governmental organizations; so, if an image was only in the database of, say, the National Center For Missing And Exploited Children, or only in the database of China’s equivalent, its cryptographic hash couldn’t be used. This requirement would make it harder for a dictator to slip in a hash to look for dissidents, since it’s substantially more difficult to get an image into enough independent databases.
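      The multiple-database requirement described above is essentially a set intersection: a hash is only usable if every independent organization’s list contains it. A minimal sketch, where the organization names and hash values are made up purely for illustration:

```python
# Toy sketch of the multiple-database rule: a hash may only be matched
# against if EVERY independent organization's list contains it.
# All names and hash values here are invented for illustration.
ncmec_hashes = {"a1f3", "b2e4", "c3d5"}       # hypothetical US NGO database
second_org_hashes = {"b2e4", "c3d5", "d4c6"}  # hypothetical second NGO database

# Only hashes vouched for by all organizations are usable for matching.
usable_hashes = ncmec_hashes & second_org_hashes

def is_flaggable(photo_hash: str) -> bool:
    """A photo can only be flagged if its hash is in every database."""
    return photo_hash in usable_hashes

print(is_flaggable("b2e4"))  # True: present in both lists
print(is_flaggable("a1f3"))  # False: present in only one list, so unusable
```

      A hash present in only one organization’s list, however it got there, simply never enters the matching set.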

      Even then, an Apple employee would have to verify actual child porn was being stored in iCloud, and only after 20 separate images were flagged. (The odds of any innocent person even making it to this stage incorrectly were estimated to be something like one false positive a year, I think, because of all of the safeguards Apple had.)

      Only after an Apple employee confirmed the existence of child porn would the iCloud account be frozen and the relevant non-government organizations alerted.

      Honestly, I have a better chance of getting a handjob from Natalie Portman in the next 24 hours than an innocent person being incorrectly reported to any government authority.

      • ansik · 3 points · 1 year ago

        Great writeup! I tried searching but came up short, do you have a link to the technical documentation?

      • HelixDab · 3 points · 1 year ago

        From a technical perspective, how much would an image need to be changed before the hash no longer matched? I’ve heard of people including junk .txt files in repacked and zipped pirated games, movies, etc., so that they aren’t automatically flagged for removal from file sharing sites.

        I am not a technical expert by any means, and I don’t even use Apple products, so this is just curiosity.

        • conciselyverbose · 3 points · 1 year ago

          It depends on the type of hash. For the type of hashing used by checksums, a single changed byte is enough, because they’re cryptographic hashes, and the intent is to identify whether files are exact matches.

          However, the type of hashing used for CSAM is called a semantic hash. The intent of this type of hash is that similar content results in a similar (or identical) output. I can’t walk you through exactly how the hash is done, but it is designed specifically so that minor alterations do not prevent identification.
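          The contrast is easy to demonstrate with a toy example. A cryptographic hash changes completely when a single byte changes, while a crude “average hash” over pixel values, a deliberately simplified stand-in for a real perceptual/semantic hash (not Apple’s actual NeuralHash), barely notices small edits:

```python
import hashlib

# Cryptographic hash: changing a single byte produces a completely
# different digest, so it only detects exact matches.
data = b"some photo bytes"
tweaked = b"some photo byteZ"
print(hashlib.sha256(data).hexdigest() == hashlib.sha256(tweaked).hexdigest())  # False

# Toy "average hash": one bit per pixel, set when the pixel is brighter
# than the image's mean brightness. Small edits leave the bits intact.
def average_hash(pixels):
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

image = [10, 200, 30, 220, 15, 240, 25, 210]    # fake 8-pixel "image"
altered = [12, 198, 30, 220, 15, 240, 25, 210]  # slightly edited copy
print(average_hash(image) == average_hash(altered))  # True: hashes still match
```

          Real perceptual hashes are far more sophisticated, but the design goal is the same: similar inputs should land on similar (or identical) outputs.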

          • HelixDab · 1 point · 1 year ago

            If, for instance, I was pirating a video game, would packing it in an encrypted container along with a GB or two of downloaded YouTube videos be sufficient to defeat semantic hashing? What about taking that encrypted volume and spanning it across multiple files?

            • conciselyverbose · 1 point · 1 year ago

              Encrypting it should be enough to defeat either hash.

              Without encryption I think it would depend on implementation. I’m not aware of the specific limitations of the tools they use, but it’s for photo/video and shouldn’t really meaningfully generalize to other formats.

        • MisuseCase · 1 point · 1 year ago

          That’s a good question. First it’s important to understand that hash functions for pirated games or other programs are actually different from hash functions used to detect media like pictures, movies, and sound recordings.

          If you alter a piece of code or text, its hash will no longer match the original version’s. In that context the hashes are expected to match, and some kind of alarm gets tripped if they don’t.

          With media files like music, movies, or pictures, it works the other way around. Detection tools are looking for something that is not necessarily an exact match, but a very close match, and when such a match is found, alarms get tripped (because it’s CSAM, or a copyright violation, or something like that).

          As to the techniques you mentioned for concealing a pirated game in a ZIP file with a bunch of junk TXT files, that’s not going to work. The reason it doesn’t work is that ZIP compression transforms the contents in predictable, reversible patterns, which is easy to detect and compensate for. Now, if you use your ZIP/compression tool to actually encrypt the file with a good algorithm and a strong password, that’s different, but then you don’t need to pack it with junk. (And distributing the password securely will be a problem.)
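          That claim is easy to check with the standard library: adding junk files changes the archive’s hash, but the payload extracted from the archive still hashes to its original value, so a scanner that looks inside archives is unaffected. The file names and payload below are invented for the sketch:

```python
import hashlib
import io
import zipfile

payload = b"pretend this is a pirated game binary"
original_hash = hashlib.sha256(payload).hexdigest()

# Pack the payload together with a junk file into an in-memory ZIP.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("game.bin", payload)
    zf.writestr("junk.txt", b"random filler " * 100)

# The archive's own hash differs from the payload's hash...
archive_hash = hashlib.sha256(buf.getvalue()).hexdigest()
print(archive_hash != original_hash)  # True

# ...but extraction recovers the exact original bytes, so a detector
# that unzips before hashing still finds a match.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    extracted = zf.read("game.bin")
print(hashlib.sha256(extracted).hexdigest() == original_hash)  # True
```

          Encryption is different precisely because there is no predictable, reversible transform a scanner can undo without the key.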

          Please, people who know more about hashing and media detection with hashing, let me know if I got something wrong, I probably did.

      • @generalpotato · 3 points · 1 year ago

        Haha! Thanks for the excellent write-up. Yes, I recall Apple handled CSAM this way and went out of its way to try and convince users it was still a good idea, but still faced a lot of criticism for it.

        I doubt this bill will be as thorough, which is why I was posing the question I asked. Apple could technically comply using some of the work it did, but it’s sort of moot if things are end-to-end encrypted.

      • MisuseCase · 2 points · 1 year ago

        It would have worked and it would have protected privacy, but most people don’t understand the difference between having a hash of known CSAM on your phone and having actual CSAM on your phone for comparison purposes, and it freaked people out.

        I understand the difference and I’m still uncomfortable with it, not because of the proximity to CSAM but because I don’t like the precedent of anyone scanning my encrypted messages. Give them an inch, etc.

    • Mitchacho74 · 7 points · 1 year ago

      I’m assuming it’s either Apple not wanting to be told to do it, or them “learning their lesson” and no longer supporting it. They seem to be leaning quite heavily into privacy.

    • thann · -2 points · 1 year ago

      Apple wants money for spying on their users; this bill would compel them to do that without the secret money they’re getting now, so they’re against it.

      • @generalpotato · 3 points · 1 year ago

        Nah, Apple is one of the few companies around that is big on privacy and uses privacy as a differentiator for its products. Look at some of the other responses; it’s more complex than them just wanting money. They already make a boatload of it.

  • RandomBit · 6 points · 1 year ago

    To me, this seems like such a transparent attempt to force the tech companies to have a backdoor. If they can scan for CSAM, they can scan (or copy) anything else the government wants.

    • 2xsaiko · 2 points · 1 year ago

      That’s very likely the actual goal. Stopping child abuse is only an excuse, one governments keep pulling out whenever they want to push anti privacy legislation. And it’s clear that this would do nothing to stop it either, because then abusers just wouldn’t use compromised services from big companies.

  • sebinspace · 5 points · 1 year ago

    Well yeah, even if they aren’t good at it and are sort of hypocritical about it, appearing to believe in the “what happens on iPhone stays on iPhone” philosophy is important to them.

    • @Dick · 15 points · 1 year ago

      I wouldn’t say they’re hypocritical. I was in complete shock that they actually scrapped their iPhone scanning plans and now offer E2E for most of iCloud. They aren’t perfect, but they’re definitely better than most companies.

      • sebinspace · 2 points · 1 year ago

        Yeaaaaah maybe read up on some of the E2E stuff. Someone else at the top of this post posted a link to how it works.

        • @asbestos · 6 points · 1 year ago

          Yeah but his point stands. Here’s the summary:

          • If you’re syncing iMessages via iCloud and don’t use iCloud Backup, iMessage is E2E encrypted.
          • If you have iCloud Backup enabled without Advanced Data Protection, iMessage isn’t effectively E2E encrypted, because the backup includes the key.
          • If you have Advanced Data Protection turned on for iCloud and you’re using iCloud Backup, iMessage is E2E encrypted however you look at it.

          It’s generally good practice not to use iCloud backups but rather to back things up yourself; however, most people don’t care enough.

          • sebinspace · 0 points · 1 year ago

            Cool. Now explain that to the average user.

        • @Marcy_Stella · 2 points · 1 year ago

          It’s generally a question of what’s best for the user. Your average user would likely be angrier about losing all their messages because they forgot their password than they’d be comforted by the fact that no one else can read the data. The same goes for photos and files; however, sensitive categories such as health and passwords are always end-to-end encrypted, because it’s been determined it’s worse for anyone else to get that data than it is for the user to lose it.

          For anyone who truly cares about complete encryption there is Advanced Data Protection, but for general users the defaults are a good balance between security and ease of use.

          • sebinspace · 1 point · 1 year ago

            Yeah, all of that is true. But people who buy iPhones and assume, because the marketing said so, that they’re perfectly secure are worryingly ignorant.

    • @ScoobyDoo27 · 2 points · 1 year ago

      Man, I was hoping that by moving away from Reddit I could move away from the pure “hate Apple for whatever reason” crowd. Show me how the other mobile OS is making things any better?

      • sebinspace · 1 point · 1 year ago (edited)

        I use an iPhone 12. I’m not going to defend Android because I don’t use it. I’m just not under the illusion of whatever Apple marketing distills complex problems down to, for better or worse, and being disillusioned isn’t “hate”, it’s awareness. Hate is something I reserve for my mother and father. This is just a goddamn phone.

        Moreover, being less bad than the other guy doesn’t make you not bad. Your whataboutism is weak tea.

        • @ScoobyDoo27 · 2 points · 1 year ago

          Except “less bad” is better than bad when you have two choices. I’d love to know where Apple has blown it on their privacy record. And don’t try to bring up the CSAM shit, because they walked that back when they realized their userbase didn’t want it.

          • sebinspace · 1 point · 1 year ago

            Top of this post. Someone posted a link to Apple’s website explaining the stipulation regarding iMessage and iCloud E2E.

            Stop being so defensive.

    • Overzeetop · 1 point · 1 year ago (edited)

      Isn’t E2EE messaging intended to encrypt data in transit between devices, not provide global security? The default iCloud backups are still encrypted, but the key is stored by, and recoverable from, Apple. This is the ideal sort of encryption for 99%* of the population. For the 1%, there’s the option to forgo the Apple-stored key.

      * yes, I made that number up. I will stand by it.

      If you did a random survey asking users whether they want a secure, E2EE-encrypted backup, they would say yes. Overwhelmingly. But if you ask the question based on the outcome of a forgotten password: “If you lose or damage your phone and have forgotten your Apple account password, would you like all of your iCloud photos, messages, emails, contacts, and documents to be instantly destroyed and unrecoverable, or should Apple be able to restore everything if you can prove it’s your account?”, they will almost certainly choose the latter.

      • @Marcy_Stella · 2 points · 1 year ago

        If a user wants more protection there is “Advanced Data Protection”, which fully encrypts all iCloud data. However, Apple knows you might lose your password, so they require you to set up a recovery method before turning it on, and they make sure you know Apple won’t be able to help you if you lose both your password and your recovery method.

        Also, certain sensitive data such as health data and passwords is fully end-to-end encrypted even in standard mode, because it’s been determined that it’s worse for someone else to get access to that data than it is for you to lose it, whereas losing your photos is generally worse than someone else getting access to them.

    • Phil · 1 point · 1 year ago

      Are the iCloud messages stored in plain text?

      • Overzeetop · 1 point · 1 year ago

        No, they’re encrypted. But Apple stores a copy of your key because most people forget their Apple password at some point (usually after they’ve wiped their phone and are setting up a new one) and need Apple to reset their password/re-enable their encryption key on the new device.
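        The difference between the standard and Advanced Data Protection models boils down to who holds a copy of the data key. A toy sketch, using a deliberately fake XOR “cipher” purely to illustrate key escrow (this is not real cryptography and not Apple’s actual design):

```python
import hashlib
import os

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher: illustration only, NOT real cryptography."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

plaintext = b"photos, messages, documents"
data_key = os.urandom(32)               # random key protecting the backup
backup = toy_cipher(data_key, plaintext)

# Standard mode: the provider escrows a copy of the data key, so the
# backup survives a forgotten password.
escrowed_key = data_key
assert toy_cipher(escrowed_key, backup) == plaintext

# "Advanced Data Protection"-style mode: the only copy of the data key
# is wrapped with a key derived from the user's password; the provider
# never holds it, so forgetting the password means losing the data.
password_key = hashlib.sha256(b"user password").digest()
wrapped_key = toy_cipher(password_key, data_key)   # stored instead of data_key
recovered_key = toy_cipher(password_key, wrapped_key)
print(toy_cipher(recovered_key, backup) == plaintext)  # True: password unwraps it
```

        In the second mode, if the password (and any recovery method) is gone, nothing on the provider’s side can reconstruct `data_key`, which is exactly the trade-off described above.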

        • Phil · 1 point · 1 year ago

          Huh the more you know…

          • @Marcy_Stella · 1 point · 1 year ago

            Also know that if you still want iCloud backups but want everything stored encrypted, you can enable “Advanced Data Protection”, which means Apple doesn’t store the encryption key. You do need to set up a recovery method, such as a recovery key or recovery contact; if you lose both your device and your recovery method, your data is lost forever and Apple can’t help you the way it can in standard data protection mode.

            Also note that certain sensitive categories such as health and passwords are always end-to-end encrypted, because it’s been determined that it’s worse for someone else to get access to that data than it is for the user to lose it, whereas a user losing their photos and messages because they forgot their password is generally worse than a hacker resetting the password and getting access.

  • @Willer · 1 point · 1 year ago

    At this point, trying to kill encrypted messaging is equivalent to banning meetups to stop criminals from meeting up.

  • Untitled9999 · -7 points · 1 year ago

    I think law enforcement should be able to intercept messages on services like WhatsApp, if someone is suspected of criminal activity.

    Is it right for criminals to be able to share child abuse material, or plans for terrorism, over something like WhatsApp, without law enforcement being able to intercept these messages?

    I think law enforcement can break into your home if they have a court warrant, right? So why not allow the same thing with electronic communications?

    • conciselyverbose · 6 points · 1 year ago

      It’s simple.

      If it’s possible for WhatsApp to intercept the communications of “bad people” for law enforcement, it’s fundamentally impossible for any communication to be private. The existence of a back door is automatically a gaping security flaw.

      There’s no such thing as “securely intercepting” messages. Either they’re secure against all actors or they’re not secure.

      • Untitled9999 · -2 points · 1 year ago

        Maybe it’s worth having that security hole then. I think it’s a bit crazy that terrorists or child abusers can plan their crimes using WhatsApp without the police being able to intercept their messages.

        Also, if we’re able to contact our banks over the internet securely (and obviously the bank can still see everything about our accounts if they want, while criminals hopefully won’t be able to), then surely an equivalent should be possible for things like WhatsApp.

        • cacheson · 4 points · 1 year ago

          Maybe it’s worth having that security hole then. I think it’s a bit crazy that terrorists or child abusers can plan their crimes using WhatsApp without the police being able to intercept their messages.

          Encryption exists. Terrorists and child abusers will use it whether WhatsApp or Apple or whoever implements it or not. Stopping those implementations just denies privacy to regular users.

          Also, if we’re able to contact our banks over the internet securely (and obviously the bank can still see everything about our accounts if they want, while criminals hopefully won’t be able to), then surely an equivalent should be possible for things like WhatsApp.

          Law enforcement can’t eavesdrop on your encrypted connection to your bank. If they need to know about your banking activity, they rely on the bank reporting it to them.

        • @Marcy_Stella · 2 points · 1 year ago

          OK, so here’s a basic question you should be able to answer: how do you stop a foreign government from spying on another country’s citizens? WhatsApp is not just a Western-world app; it’s used in Russia as well as the US and the UK. So if Putin went to Meta and said, “I want everything you have on ex-Prime Minister of the United Kingdom Boris Johnson, and you can’t tell him,” what reason would Meta have to deny his request, if the precedent set by the UK is that this data needs a back door? If you say the user should be notified, then anyone under investigation simply won’t say anything incriminating. And if it includes old messages, you risk political espionage over anything that was shared under the assumption it was end-to-end encrypted. What about trade secrets? A corrupt government official could grab a company’s trade secrets for a business friend from anywhere in the world.

          There is a great video by Tom Scott about this exact situation, from when the UK tried to break encryption five years ago; that attempt failed because it wasn’t feasible from a security standpoint. There is also a great episode of Last Week Tonight about encryption and government attempts to get around it. We’ve seen from things like the Pegasus malware that repressive governments will use any little break in encryption to jail protesters and journalists and to spy on their political rivals; having an official way in will just make it easier.

    • @iceonfire1 · 1 point · 1 year ago

      I think law enforcement can break into your home if they have a court warrant, right? So why not allow the same thing with electronic communications?

      For me, the reason to disallow it is the potential for abuse. There were 864 search warrant applications across all federal agencies in 2022. In 2020, the FBI, specifically, issued 11504 warrants to Google, specifically, for geofencing data, specifically. Across all agencies there are probably millions of such “warrants” for data.

      It’s far easier to access your data than your house, so comparing physical and cybersecurity doesn’t really make sense.

      In general, criminals can easily just move to an uncompromised platform to do illegal stuff. But giving the govt easy access to messaging data allows for all kinds of dystopic suppression for regular people.

      • @Marcy_Stella · 1 point · 1 year ago

        Generally, tech companies now have agreements with law enforcement so they don’t have to deal with all the legal mumbo jumbo. Some data does still require a warrant, such as data covered by protection laws (like HIPAA-protected data) or data the company considers highly sensitive, but for a lot of data it’s easier to just hand it over than to get legal involved.