• @[email protected]
    link
    fedilink
    368 months ago

    Not to mention their content recommendation algorithms will specifically push more of this “content” to these people, heightening the problem immensely

    • @j4k3 · 10 points · 8 months ago

      Wholly See

  • @dustyData · 27 points · 8 months ago

    Someone at Meta is frantically deleting server logs and emails right now.

    But on a more serious note: while it is true that child trafficking and exploitation happen on Meta’s platforms, and I agree that they do very little, next to nothing, to stop it, to the point of borderline accidentally facilitating it, this is also just another “won’t you think of the children” political attempt to destroy net neutrality. They intentionally argue from a Title that is part of the Telecommunications Act. Which, granted, is outdated and did throw us into this hell dystopia of corporate domination that is the Internet today. But this is just an attempt to legislate via the courts. They probably don’t give a fuck about protecting any children; they’re attempting to exert political control over the internet. Another Democrat fulfilling the Republican agenda for them.

    • @BonesOfTheMoonOP · 10 points · 8 months ago

      A simple answer would be to legislate Facebook into actually moderating their site for this content.

      • @[email protected]
        link
        fedilink
        -48 months ago

        But they literally cannot moderate their platform. The amount of data that Facebook sees every minute would bankrupt any company if they had to actually hire enough people to go through all that content and determine which is fine and which isn’t. And that isn’t even taking into consideration the mental and emotional damage that a person will go through just seeing all the vile and despicable shit that gets posted. AI moderation isn’t advanced enough and the human moderation cost is so great that the giant social media companies will pretty much never be able to self moderate. Reddit was only able to moderate itself (to an extent) because they had an endless supply of free mods. Facebook doesn’t have that same luxury.

        • @breadsmasher · 18 points · 8 months ago

          Sounds like they should be bankrupt then

          • @[email protected]
            link
            fedilink
            88 months ago

            Why pay $5 million a year for 100 mods at $50k/year when you can just pay a few hundred thousand in fines while letting the government move the walls of your garden for you?

        • @BonesOfTheMoonOP · 12 points · 8 months ago

          I guess. But they could do a better job with user-reported content, which they very much don’t.

          • @[email protected]
            link
            fedilink
            68 months ago

            IIRC Meta on its own spends billions on content moderation, much more than other companies generally do. The problem with content moderation is that you only see the stuff they miss, not the stuff they already filtered out.

            On the topic of weeding out CSAM, one example of a company that gave up on it is, surprisingly, Nintendo. Flipnote (a 3DS application that let you send post-it-style notes to others) was used by predators in Japan to lure children. Nintendo deemed it unmoderatable and removed it, and no chat replacement has functionally replaced it since.

            Moderation is super tough, and you can hear some really fucked up stories from these people about how it affected their lives, even from ones who have to go through more content (e.g. people who have to filter out content in China due to government surveillance).

            • @BonesOfTheMoonOP · 4 points · 8 months ago

              I’ve reported probably a thousand pictures of swastika tattoos and shit they don’t remove, and people calling people homophobic slurs. I don’t think anyone reviews those reports.

              • @[email protected]
                link
                fedilink
                28 months ago

                Because on the list of stuff they’re filtering out, that’s probably low priority compared to content like CSAM or actual murder, which gets them into legal trouble if that kind of content runs wild.

        • @PrinceWith999Enemies · 4 points · 8 months ago

          That’s what externalization looks like. In the fossil fuel industry, it’s creating polluting products without having to bear the costs. In chemical companies, it’s physically polluting the environment. Same with mining companies, etc.

          In social media, it is a refusal to manage content in a responsible manner, whether it’s CSAM or disinformation campaigns or hate speech. That externalization is what allows them to pay the salaries that they do, invest in R&D, and inflate their stock values to ridiculous levels. Meta is a trillion-dollar company and it needs to rebalance its priorities.

        • I Cast Fist · 3 points · 8 months ago

          But they literally cannot moderate their platform

          They can, but doing so would affect profits. They used to outsource moderation to Kenyans, who got paid pennies. Sama, the company doing said “moderation”, has apparently stopped offering that kind of work.

          Worth noting: FB and fuckzuck knew that moderation would be a problem for a big platform with millions of daily users. They didn’t care back in 2012, and they don’t care now. “Not our problem”, for all intents and purposes, just like their nonexistent customer support.

          In the corporate world, profits are always more important than safety, health, and other civilian nonsense. Last I checked Instagram via the app, I saw 3 ads for obvious pyramid schemes, not too different from my previous check in 2023. Hey, scammers are paying for ad space, so why should Zuck care?

  • @[email protected]
    link
    fedilink
    168 months ago

    The lawsuit claims that Meta allows and fails to detect the trafficking of children and “enabled adults to find, message and groom minors, soliciting them to sell pictures or participate in pornographic videos”, concluding that “Meta’s conduct is not only unacceptable; it is unlawful”.

    Yeah, I’m gonna go out on a limb and say there’s prolly not a lot of minors on FB. The kids are on other platforms. FB is for olds and has been for some time.

    • I Cast Fist · 5 points · 8 months ago

      There’s a lot of minors on Instagram, which I suspect ends up being the main target.

    • @BonesOfTheMoonOP · 3 points · 8 months ago

      I’ve seen several accounts, but how active they are I’m not sure. I definitely think TikTok is more popular.

    • @lepinkainen · 3 points · 8 months ago

      Have you seen some of the accounts on Instagram? They’re not even trying to hide it.

  • @breadsmasher · 7 points · 8 months ago

    I’m sure Zucker the Fucker is preparing his wrists for the gentlest of slaps.

    • @j4k3 · 1 point · 8 months ago

      The golden wrists of political grandstanding

  • tygerprints · 2 points · 8 months ago

    That and the republican party. Tim Ballard approves this message.