A controversial European Union legislative proposal to scan citizens’ private messages in a bid to detect child sexual abuse material (CSAM) is a risk to the future of web security, Meredith Whittaker warned in a public blog post Monday. Whittaker is the president of the not-for-profit foundation behind the end-to-end encrypted (E2EE) messaging app Signal.

“There is no way to implement such proposals in the context of end-to-end encrypted communications without fundamentally undermining encryption and creating a dangerous vulnerability in core infrastructure that would have global implications well beyond Europe,” she wrote.

The most recent European Council proposal, which was put forward in May under the Belgian presidency, includes a requirement that “providers of interpersonal communications services” (aka messaging apps) install and operate what the draft text describes as “technologies for upload moderation”, per a text published by Netzpolitik.

Last month, Euractiv reported that the revised proposal would require users of E2EE messaging apps to consent to scanning to detect CSAM. Users who did not consent would be blocked from sending visual content or URLs, it also reported, essentially downgrading their messaging experience to basic text and audio.

The EU’s own data protection supervisor has also voiced concern. Last year, it warned that the plan poses a direct threat to democratic values in a free and open society.

Pressure on governments to force E2EE apps to scan private messages, meanwhile, is likely coming from law enforcement.

Back in April, European police chiefs put out a joint statement calling for platforms to design security systems in such a way that they can still identify illegal activity and send reports on message content to law enforcement. Their call for “technical solutions” to ensure “lawful access” to encrypted data did not specify how platforms should achieve this sleight of hand.

  • sunzu
    6 months ago

    They know this… they don’t want normies catching on because that’s when they lose control.

    • @Serinus
      6 months ago

Are you saying the point isn’t to catch CSAM, because 95% of the people trading CSAM will find a workaround within a week?

      Then what would be the point?

      • sunzu
        6 months ago

If they won’t do jack shit about children abused by the Catholic church… by other religions’ degenerates… by police and teachers…

If they won’t bother cleaning child abuse content off Twitter, Insta, and Discord…

I am sure they will finally fix the issue by scanning the dick pics I send my partner. Pinky promise, bro.