• @muntedcrocodile · 3 points · 9 months ago

    What an excellent precedent to set. Can't possibly see how this is going to become authoritarian. Ohh, u didn't report someone? Ur also guilty. Can't see any problems with this.

    • @[email protected]
      link
      fedilink
      English
      379 months ago

      Ohh, u didn't report someone? Ur also guilty. Can't see any problems with this.

      That’s… not what this is about, though?

      “However, plaintiffs contend the defendants’ platforms are more than just message boards,” the court document says. “They allege they are sophisticated products designed to be addictive to young users and they specifically directed Gendron to further platforms or postings that indoctrinated him with ‘white replacement theory’,” the decision read.

      This isn't about mandated reporting; it's about funneling impressionable people towards extremist content.

      • @[email protected]
        link
        fedilink
        English
        239 months ago

        And they profit from it. That’s mentioned there too, and it makes it that much more infuriating. They know exactly what they’re doing, and they do it on purpose, for money.

        And at the end of the day, they’ll settle (who are the plaintiffs? Article doesn’t say) or pay some relatively inconsequential amount, and they’ll still have gained a net benefit from it. Another case of cost-of-doing-business.

        Would’ve been free without the lawsuit even. Lives lost certainly aren’t factored in otherwise.

      • Kraiden · 16 points · 9 months ago

        YouTube Shorts is the absolute worst for this. Just recently it's been massively pushing transphobic BS at me, and I cannot figure out why. I dislike, report, and "do not recommend this channel" every time, and it just keeps shoving more at me. I got a fucking racist church sermon this morning. It's broken!

        • @[email protected]
          link
          fedilink
          English
          59 months ago

          Don't dislike it, just hit "do not recommend", and don't open the comments. Honestly, the best way is just to skip past as fast as you can when you see one: the less time it spends on your screen, the less the algo thinks you want it (a rough sketch of that idea is below).

          I never really see that on YouTube unless I've been on related topics recently, and it goes away pretty quickly when you don't interact. Yes, it's shifty, but they're working on a much better system using natural language with an LLM. It's a complex problem, though.
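
          A minimal, hypothetical sketch of the idea above: an interest score weighted by dwell time, where an explicit dislike only partially offsets having watched most of a video. This is not YouTube's actual system; the event fields, weights, and the dislike penalty are invented purely for illustration.

          ```python
          # Toy illustration (not YouTube's real algorithm): dwell time dominates,
          # and a dislike is a much weaker signal than a near-complete watch.
          from dataclasses import dataclass

          @dataclass
          class WatchEvent:
              topic: str
              seconds_watched: float
              video_length: float
              disliked: bool = False

          def interest_scores(events: list[WatchEvent]) -> dict[str, float]:
              """Sum a per-topic score: completion ratio minus a small dislike penalty."""
              scores: dict[str, float] = {}
              for e in events:
                  completion = e.seconds_watched / max(e.video_length, 1.0)
                  score = completion            # dwell time is the dominant signal
                  if e.disliked:
                      score -= 0.3              # hypothetical penalty, smaller than a full watch
                  scores[e.topic] = scores.get(e.topic, 0.0) + score
              return scores

          # Watching 55s of a 60s Short and then disliking it still nets about +0.62;
          # skipping away after 2s contributes almost nothing.
          events = [
              WatchEvent("ragebait", seconds_watched=55, video_length=60, disliked=True),
              WatchEvent("ragebait", seconds_watched=2, video_length=60),
          ]
          print(interest_scores(events))  # {'ragebait': 0.65} (approximately)
          ```

          Under weights like these, the advice above follows directly: skipping fast starves the score, while lingering on a video, even to dislike it, still feeds it.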

        • @shalafi · 1 point · 9 months ago

          I am not discounting anyone’s experience. I am not saying this isn’t happening. But I don’t see it.

          LiberalGunNut™ here! You would think watching gun-related videos would lead me down a far-right rabbit hole. Here's my feed ATM.

          Meh. History, gun comparisons, chemistry, movies, whatever. Nothing crazy. (Don't watch Brandon any longer; he got leaning too far right, too political. The video's about his bid for a Congressional seat in Texas. Not an election conspiracy thing. Don't care.)

          If anyone can help me understand, I’m listening. Maybe I shy away from the nutcase shit so hard that YouTube “gets” me? Honestly don’t get it.

          • Kraiden · 2 points · 9 months ago

            So that looks like mainly long-form content. I'm specifically talking about YouTube Shorts, which is Google's version of TikTok.

        • @muntedcrocodile · 0 points · 9 months ago

          Imagine watching, let alone even having the option for, Shorts. Get NewPipe; there's a SponsorBlock version on F-Droid. No Shorts, no Google tracking, no nonsense. U don't get comments tho, but whatever. It also supports PeerTube, which is nice.

          Report for what? Sure, disagree with them about their bullshit, but I don't see why u need to report someone just cos u disagree with their opinions.

          • Kraiden · 6 points · 9 months ago

            Imagine watching, let alone even having the option for, Shorts.

            I like Shorts, for the most part.

            Report for what?

            Misinformation and hate speech, mostly. They have some crazy, false pseudoscience to back their "opinions", and they express them violently. Like it or not, these videos "promote hatred against a protected group" and are expressly against YouTube's TOS. Reporting them is 100% appropriate.

            • @muntedcrocodile · 0 points · 9 months ago

              I can strongly recommend you stop watching short-form content; it has been proven to cause all sorts of mental issues.

              Fair. Also, what is a "protected group"? What makes it any different from any other grouping?

      • @muntedcrocodile · -6 points · 9 months ago

        U can make any common practice and pillar of capitalism sound bad by using the words "impressionable" and "extremist".

        If we remove those, it becomes: funnelling a market towards further consumption of your product, i.e. marketing.

        And yes, of course the platforms are designed to be addictive and are effective at indoctrination, but why is that only a problem for certain ideologies? Shouldn't we be stopping all ideologies from practicing indoctrination of impressionable people? Should we not be guiding people to as many viewpoints as possible, to teach them to think, not to swallow someone else's ideas and spew them back out?

        I blame Henry Ford for this whole clusterfuck; he lobbied the education system to manufacture an obedient consumer market and a working class that doesn't think for itself but simply swallows what it's told. The education system is the problem; anything else is treating the symptoms, not the disease.

        • @[email protected]
          link
          fedilink
          English
          19 months ago

          If we remove those, it becomes: funnelling a market towards further consumption of your product, i.e. marketing.

          And if a company's marketing campaign is found to be indirectly responsible for a kid shooting up a grocery store, I'm sure we'll see a repeat of this, with that company being the one facing a court case. What even is this argument?

          • @muntedcrocodile · -1 point · 9 months ago

            Isn't the entire gun market indirectly responsible? What about the food the shooters ate? Can't we use the same logic to prosecute anyone of any religion, cos most religious texts support the killing of some group of people?

            It's convenient to ask what the argument is when u ignore 60% of it.

            • @[email protected]
              link
              fedilink
              English
              39 months ago

              Did you even read the article we’re discussing, or are you just reading the comments and getting mad?

              1. No decision has been made. This is simply a judge denying the companies’ motion to have this thrown out before going to trial.
              2. This is very different from "the gun market" being indirectly responsible. This is the equivalent of "the gun market" constantly sending a person pamphlets, calling them, emailing them, whatever else, with propaganda until they ultimately decided to act on it. If that were happening, I think we'd be having the same conversation about that, and whether they should be held accountable.
              3. Whether they’re actually responsible or not (or whether any group is) can be determined in court following all the usual methods. A company getting to say “That’s ridiculous, we’re above scrutiny” is dangerous, and that’s effectively what they were trying to do (which was denied by this judge.)

      • wagesj45 · -8 points · 9 months ago

        That means the government is injecting itself into deciding what "extremist" is. I do not trust them to do that wisely. And even if I did trust them, it is immoral for the state to start categorizing and policing ideologies.

        • @givesomefucks · 18 points · 9 months ago

          Do you understand you’re arguing for violent groups instigating a race war?

          Like, even if you’re ok with white people doing it, you’re also saying ISIS, MS13, any fucking group can’t be labeled violent extremists…

          Some “ideologies” need to be fucking policed

          • HACKthePRISONS · 0 points · 9 months ago

            anarchists have had to deal with this for over a century. the state can go fuck itself.

          • @muntedcrocodile · -1 point · 9 months ago

            Ur missing the point: violence should absolutely be policed. Words, ideas, ideology? Hell no. Let ISIS, MS13, the communists, the nazis, the vegans, etc. etc. say what they want. They are all extremists by some definition; let them discuss, let them argue, and the second someone does something violent, lock 'em up for the rest of their lives. Simple.

            What you are suggesting is the policing of ideology to prevent future crime. There is an entire book about where that leads; said book simply calls this concept thoughtcrime.

          • wagesj45 · -2 points · 9 months ago

            Some “ideologies” need to be fucking policed

            Someone wants to start with yours, and they have more support than you know. Be careful what you wish for.

              • wagesj45 · -3 points · 9 months ago

                Big difference between policing actions and policing thoughts. Declaring some thoughts as verboten and subject to punishment or liability is bad.

                • @[email protected]
                  link
                  fedilink
                  English
                  -49 months ago

                  It’s insane you’re being downvoted by people who would be the first ones silenced.

                  You really think they're going to use this for homophobes and racists instead of anyone calling for positive social change?

                  Have you not seen any of history?

        • @[email protected]
          link
          fedilink
          English
          39 months ago

          That is generally what governments do. They write laws that say you can do this but not that; if you do this, that's illegal and you will be convicted. Otherwise you wouldn't be able to police things like the Mafia and drug cartels. Even in the US, the freedom of speech to conspire to commit crimes is criminalised. There is no difference between that and politically motivated 'extremists' who conspire to commit crimes. The ideology is not criminalised; the acts that groups plan or conduct are. You are totally fine saying "I don't like group X."

          What's not OK to say is "Let's go out and kill people from group X."

          The problem is that social media sites use automated processes to decide which messages to put in front of users, in fundamentally the same way that a newspaper publisher decides which letters to the editor to print.

          Somehow, though, tech companies have argued that because there is no limit on how many posts they can carry, and hence theoretically they aren't deciding what goes in and what stays out, their act of putting some posts at the top of people's lists so they are seen is somehow different from a newspaper publisher's act of including a particular letter or not. But the outcome is the same: the letter or post is either seen by people or it isn't.

          Tech companies argue they are just a communication network, but I have never seen a telephone, postal, or other network that decided which order you got your phone calls, letters, or SMS messages in. They just deliver what is sent, in the order it was sent.

          Commercial social media networks are publishers with editorial control, and editorial control is not only inclusion/exclusion but also prominence.

          There is a fundamental difference with Lemmy or Mastodon, in that (aside from any moderation by individual server admins) they don't promote or demote any post, and therefore don't have any role in whether a user sees a post or not. The toy comparison below boils the distinction down to a single sort key.
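
          A rough, hypothetical sketch of that distinction (illustrative only; the post fields and scores are made up, and this is not any platform's real code). The only difference between the two feeds below is the sort key, and choosing that key is exactly the prominence decision described above.

          ```python
          # Toy comparison: chronological delivery vs. platform-ranked prominence.
          from datetime import datetime

          posts = [
              {"id": 1, "sent": datetime(2024, 1, 1, 9, 0), "predicted_engagement": 0.2},
              {"id": 2, "sent": datetime(2024, 1, 1, 9, 5), "predicted_engagement": 0.9},
              {"id": 3, "sent": datetime(2024, 1, 1, 9, 10), "predicted_engagement": 0.5},
          ]

          # Phone/postal/SMS model: items are delivered in the order they were sent.
          chronological = sorted(posts, key=lambda p: p["sent"])

          # Commercial-feed model: the platform's own scoring decides what is seen first.
          ranked = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

          print([p["id"] for p in chronological])  # [1, 2, 3]
          print([p["id"] for p in ranked])         # [2, 3, 1]
          ```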

        • 520 · 1 point · 9 months ago

          The government is already the one who makes that decision. The only thing new here is a line being drawn with regard to social media's push towards addiction and echo-chamberism.

        • zeluko · 0 points · 9 months ago

          Umm… isn't the government, or rather the judiciary, already deciding what "extremist" is?
          How specifically would this be different?

          I can understand the problems this causes for the platforms, but the government injecting decisions is what you focus on?
          Not to forget the many other places they inject themselves… one could say your daily life, because… careful now… you live in a country with a government, whaaat?

    • Drusas · 3 points · edited · 9 months ago

      You could make a good point with better spelling, grammar, and word choice.

      • @muntedcrocodile · -8 points · 9 months ago

        Yes, I could. I could spend the extra 30 seconds fixing it, or I could not bother and still have my point be comprehensible.