• southsamurai
    link
    fedilink
    English
    131
    4 months ago

    Frankly, good.

    None of these purported “child protection” scams would do a damn thing for kids; they only invade the privacy of people who have zero reason to be investigated in the first place.

    • @PumaStoleMyBluff
      link
      English
      2
      4 months ago

      They could at least do on-device hash lookups and prevent sending. Has zero effect on privacy and does reduce CSAM.
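      As a sketch of what that could look like (the hash list and helper name below are hypothetical; real deployments such as PhotoDNA use perceptual hashes so re-encoded or slightly altered images still match, whereas plain SHA-256 only catches byte-identical files):

```python
import hashlib

# Hypothetical on-device blocklist of known-bad SHA-256 digests.
# The entry below is the digest of the empty byte string, used here
# purely as a verifiable stand-in for a real hash list.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def may_send(attachment: bytes) -> bool:
    """Return True if the attachment passes the local hash check."""
    return hashlib.sha256(attachment).hexdigest() not in KNOWN_BAD_HASHES

# Ordinary content passes; a blocklisted file is refused before upload.
assert may_send(b"some ordinary photo bytes")
assert not may_send(b"")
```

      The point being made: the check runs entirely on-device, nothing is uploaded or reported anywhere, the send is simply refused.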

      • southsamurai
        link
        fedilink
        English
        10
        4 months ago

        Yah, that would be a great solution in comparison, but it’s still privacy invasive. Not as bad, but it’s still not giving people due process.

        I’m aware that not everywhere in the world recognizes that principle as a right. But I do consider due process a right, and scanning anything on anyone’s devices without a legally justifiable reason is a violation of that.

        I’m not willing to kowtow to a moral panic and just ignore the erosion of privacy “because the children”. And it is a moral panic. As bad as it is, as much as I personally would enjoy five minutes alone with someone that’s making or using kiddie porn of any stripe, it simply isn’t such a common thing that stripping everyone of their privacy in any way is acceptable.

        If they wanna figure out a way to target individuals suspected of that kind of crime, awesome. Untargeted, sweeping invasions simply are not acceptable, and I do not care what the purported reason of the week is: kiddie porn, terrorism, security, stopping drugs, I do not care. I have committed no crime, and I refuse to give away the presumption of innocence for myself or anyone else.

    • katy ✨
      link
      fedilink
      English
      -11
      4 months ago

      yeah cracking down on the child trafficking networks operating on telegram would totally not do a thing /s

      • southsamurai
        link
        fedilink
        English
        13
        4 months ago

        It wouldn’t. Anyone into that shit will just go somewhere else, and the price of that is yet another erosion of privacy.

        • katy ✨
          link
          fedilink
          English
          -18
          4 months ago

          yes, if you go after the people sharing it on the network, you make it harder for them to access it. stop defending csam, it’s creepy.

          • southsamurai
            link
            fedilink
            English
            17
            edit-2
            4 months ago

            Look, just because people don’t agree that a specific method will be effective, that doesn’t mean they support it.

            That’s shitty thinking, and even shittier behavior. You should be ashamed of yourself for going there in what was previously a civil, friendly discussion.

            • @Agrivar
              link
              English
              -15
              4 months ago

              Removed by mod

              • southsamurai
                link
                fedilink
                English
                7
                4 months ago

                Gods, people are just dumb.

                I am so glad lemmy has a block function

                • @Cryophilia
                  link
                  English
                  2
                  4 months ago

                  If you block them, it amplifies them. Now you can no longer call them out.

  • @[email protected]
    link
    fedilink
    English
    114
    4 months ago

    If the programs were anything like this, I don’t blame them. There’s a fine line between child protection and surveillance.

    • Lucy :3
      link
      fedilink
      English
      3
      4 months ago

      Hmm. I think many services just don’t, and can’t, participate because they’d need to break E2EE. Telegram wouldn’t have to with most chats, since those aren’t end-to-end encrypted in the first place.
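      For context on why E2EE rules out server-side scanning: the server only ever relays ciphertext, so hashing what it sees can never match a hash list built from plaintext. A toy illustration (the stream cipher below is NOT real cryptography, just a stand-in for whatever the clients negotiate):

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher for illustration only: the keystream is
    # derived from the key via counter-mode SHA-256 blocks. Applying
    # it twice with the same key decrypts.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, stream))

key = secrets.token_bytes(32)        # known only to the two endpoints
message = b"the actual content"
ciphertext = toy_encrypt(key, message)

# The server can hash what it relays, but under E2EE that is only
# ciphertext, so it never matches a hash list built from plaintext.
assert hashlib.sha256(ciphertext).digest() != hashlib.sha256(message).digest()

# Only an endpoint holding the key can recover the plaintext.
assert toy_encrypt(key, ciphertext) == message
```

      That is why the proposals aimed at E2EE services all end up being client-side scanning, as in the hash-lookup idea discussed above.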

  • @Mrkawfee
    link
    English
    110
    edit-2
    4 months ago

    When the West wants to censor the internet, it’s always either child protection or national security that’s brought up as the reason.

    • @[email protected]
      link
      fedilink
      English
      10
      4 months ago

      The west

      Are authoritarian regimes somehow supposed to be more opposed to using children to promote heightened surveillance?

      • @[email protected]
        link
        fedilink
        English
        12
        4 months ago

        I mean… Yes?

        They don’t need to lie to sell their oppression. They just do it because they’re authoritarian.

        • @asdfasdfasdf
          link
          English
          0
          4 months ago

          LOL, this is a joke right? Authoritarian countries don’t lie about reasons for doing things? LMAO

          • @pressanykeynow
            link
            English
            3
            4 months ago

            Well, when Russia blocked Telegram, it was just because Telegram refused to send them their users’ data. Simple. Now France seems to be sentencing a person to life in prison because Telegram still refuses to send them their users’ data. But they claim it’s for the children. Same shit, different excuse.

            • @asdfasdfasdf
              link
              English
              1
              4 months ago

              Sure, but what the person I replied to is claiming is that e.g. North Korea doesn’t lie to its people about reasons it does things, which is, of course, bullshit.

              • @pressanykeynow
                link
                English
                -1
                4 months ago

                They didn’t claim it though.

                They said that all governments do some terrible things, but governments that claim they are not authoritarian dress those things up with a pretext so the public won’t think they are doing a terrible thing.

                In the case of restricting internet freedom or invading other countries, it’s usually “but think of the children”.

                • @asdfasdfasdf
                  link
                  English
                  2
                  4 months ago

                  They claimed that authoritarian governments do not do this since they have no reason.

      • @ZILtoid1991
        link
        English
        9
        4 months ago

        Authoritarian regimes also do the same, although often with consensual adult porn instead of CSAM.

        • @yamanii
          link
          English
          2
          4 months ago

          The authoritarian South Korea.

          • @[email protected]
            link
            fedilink
            English
            11
            4 months ago

            Ayo. The country that has

            • a stifling work culture
            • zero tolerance porn laws
            • full blown internet censorship
            • chaebols
            • harsh punishment on even the softest of “drugs”
            • minuscule support for new families

            is definitely not authoritarian.

            It’s okay. With the childbirth rate they have currently, they don’t have to worry for much longer. Let’s just squeeze the last out of the current generations.

      • @Doorbook
        link
        English
        3
        4 months ago

        Authoritarian regimes don’t need to pretend. If they decide you are a risk, they don’t need to gather evidence to put you in prison, so they don’t need to pretend they have a good reason for censoring the internet.

        The issue here is that the West wants to do the same but needs a valid justification. Instead of working to stop the actual abuse in the first place, they want access to the only way many people have to share information safely.

        You could be technically literate and find your way around all the restrictions, but many people are not, and they need access to secure communication channels to arrange their activism.

        The fact that we don’t see a backlash against Twitter, Facebook, Google, and Apple tells a lot about what this is about.

        The fact that we are seeing more support for “consent” for kids, and that major cases such as Epstein and Maxwell have been obscured or even hidden when it comes to high-profile people, says a lot about their intent.

        • @rottingleaf
          link
          English
          -2
          4 months ago

          They do need to pretend, because they need assistance from supposedly civilized states in their actions covered by that pretense.

  • @[email protected]
    link
    fedilink
    English
    76
    edit-2
    4 months ago

    Thank you for choosing “Tyranny as a Service!”

    How would you like this wrapped? [ ] Terrorism [X] Child porn

  • @[email protected]
    link
    fedilink
    English
    58
    4 months ago

    If they refused to hand over data that they had about individuals on a warrant, I can see how the arrest was kind of justified.

    If the arrest was for refusing to install a backdoor for law enforcement to spy on anyone they want, then France needs to be kicked out of EU and sanctioned for human rights violations.

  • @ikidd
    link
    English
    34
    4 months ago

    The manufacturing consent system seems to be in full swing on this one.

    • @rottingleaf
      link
      English
      3
      4 months ago

      If there were a good alternative, in the sense of public channels that don’t usually get banned, they would have gotten my consent even earlier.

      But the issue is - I don’t even know where to go to discuss shit. Despite TG being full of government trolls.

      • @pressanykeynow
        link
        English
        0
        4 months ago

        Despite TG being full of government trolls.

        That’s the world we live in now. If it’s popular, it will be full of trolls: government, corporate, all kinds. It only has to be somewhat popular; they are here on Lemmy too.

  • @[email protected]
    link
    fedilink
    English
    26
    4 months ago

    Did religions join child protection schemes? Because they are some of the biggest child indoctrination and abuse schemes in the world.

  • @[email protected]
    link
    fedilink
    English
    14
    4 months ago

    The ruling class has been waging war on any social media they don’t have the ability to backdoor. My guess is they’d come for Signal too if they didn’t use it themselves.

    • @pressanykeynow
      link
      English
      4
      4 months ago

      The government doesn’t usually need the text of your conversations, just the metadata: who a person talks to, their location, etc. Signal is a US company; they surely provide all that data. It seems Telegram didn’t.

      • @Lowpast
        link
        English
        2
        4 months ago

        Signal does not. https://signal.org/bigbrother/santa-clara-county/

        Tl;dr: Signal gave the court timestamps for three out of nine phone numbers that the court demanded data on. The timestamps were the dates three phone numbers last registered their accounts with Signal. That’s it. That is all the data there was to give.

        This is why I use Signal. This is why I donate monthly to Signal.

  • @sumguyonline
    link
    English
    7
    4 months ago

    “Schemes.” It’s as if they know they aren’t actually protecting anyone… Like they would just let anyone torment their children if they claimed religious protections and offered a big enough bribe (I know for a fact that is how it actually works). But sure, Telegram is the problem, not fed bastards hunting innocent people because the bad people bribed them to leave them alone. Be a 1%er, or be investigated when you don’t kowtow to the 1%. Your choice, apparently.

  • @[email protected]
    link
    fedilink
    English
    6
    4 months ago

    Did the child protection schemes involve compromising the security of communications on the app?

  • @[email protected]
    link
    fedilink
    English
    2
    4 months ago

    The BBC contacted Telegram for comment about its refusal to join the child protection schemes and received a response after publication which has been included.

    Where is it? I didn’t find it anywhere in the article.

    • @atrielienz
      link
      English
      2
      4 months ago

      No no, that’s the point. You’re not supposed to fuck kids.

  • katy ✨
    link
    fedilink
    English
    -12
    4 months ago

    imagine going to jail just because you refused to address the child abuse and csam on your own network.

    • @Doorbook
      link
      English
      5
      4 months ago

      Imagine only targeting Telegram and not Meta and Twitter.

      • beefbot
        link
        fedilink
        English
        4
        4 months ago

        Putin loves Elon (& got him to ruin Xhitter apparently) so no, Elon won’t get arrested

      • katy ✨
        link
        fedilink
        English
        3
        4 months ago

        meta does get pointed out but not as much since meta actually does things to combat csam.

        twitter gets called out ALL the time, mostly because elon himself is intervening to reinstate people who share csam because he fired all the trust and safety teams.

        • @Cryophilia
          link
          English
          1
          4 months ago

          Call me when Elon is arrested lol

      • @HauntedCupcake
        link
        English
        2
        4 months ago

        Huh, it’s maybe as if, nooooo… it couldn’t be, the Zuck and Elon are my trusted friends, my confidants, they wouldn’t. They couldn’t. No way in hell they’d sell or otherwise compromise my personal data

    • @JigglySackles
      link
      English
      18
      4 months ago

      Those programs are about mass surveillance, wrapping themselves in the sheep’s clothing of “protecting kids”.

      • @atrielienz
        link
        English
        -6
        4 months ago

        Doesn’t mean they shouldn’t moderate.

        • @pressanykeynow
          link
          English
          6
          4 months ago

          Why should they? Should every piece of mail (physical or not) you receive be opened and read? Should the government have access to everything you do on your phone or PC? Should the government moderate your house? That’s full-on 1984.

          • @atrielienz
            link
            English
            -1
            edit-2
            4 months ago

            Even Facebook doesn’t allow CSAM in public profiles. You can’t just pull up Facebook and see that on your regular feed. Closed groups are a different story. Why should this be different?

            Mind you I’m not saying that the CEO should be criminally responsible for what users on the platform post. I’m pointing out that moderation is a thing even on some of the worst offenders in the space.

            • @pressanykeynow
              link
              English
            0
            4 months ago

              You didn’t answer my questions.

              What moderation do you want? And how would you prevent “moderation” from becoming censorship?

              Aren’t there people whose job it is to prevent crimes? Why does some IT person, who has no idea about crime, need to do their job for them?

              • @atrielienz
                link
                English
                1
                edit-2
                4 months ago

                Because your questions aren’t germane to the point I was making. In fact, the first question, “how would you prevent ‘moderation’ from becoming censorship”, is literally answered by my second comment. Facebook already does this with Facebook Messenger. But even if they didn’t, Signal has functions to allow encryption.

                So what you’re saying is, criminals who aren’t using encryption (on a platform where encryption features are readily available) don’t deserve to be moderated on a platform where their messages are using a company’s cloud bandwidth. Does the company not have rights? And if we agree that the company has rights then they also have to follow the law.

                Yes, there are people whose jobs are to investigate crimes and try criminals in a court of law (not prevent them, because police and policing are reactionary, not preventative). This was a poor question to ask. You’re literally acting like we don’t employ thousands of people across various social media and messaging platforms to review and moderate things like CSAM.

                The gist for me is: criminals are gonna do criminal things, but at the end of the day these are our public spaces. Just because I don’t want to be surveilled in public or live in a police state doesn’t mean I want criminals to go unprosecuted just because someone cares more about their bottom line than about moderating a messaging platform they provide to the public.

                We aren’t talking about end-to-end encrypted messages here. We’re talking about messages with no such encryption that can be viewed by anyone. There are literally public groups being used by terrorist organizations on Signal. And while Signal has repeatedly refused to give up encryption keys for the ones that are using encryption (as they should), any criminal that isn’t using it is not protected by it and should be moderated.