• BeautifulMind ♾️
    1 year ago

    The issue with this is holding tech companies liable for every possible infraction

    That concern was the basis for Section 230 of the 1996 Communications Decency Act, which is in effect in the USA but is not the law in places like, say, the EU. It made sense at the time, but today it is desperately out of date.

    Today we understand what absolving platforms like Meta of their duty of care to take reasonable steps not to harm their customers actually does: their profit motive guides them to look the other way when their platform is used to disseminate disinformation about vaccines that gets people killed, the money has them protecting Nazis, and algorithms intended to promote engagement become a tool not just for advertisers but for propagandists and information-warfare operations.

    I’m not particularly persuaded that reform to Section 230 of the Communications Decency Act in the US would doom nonprofit social media like most of the fediverse. If you look around at all, most of it already follows a well-considered duty-of-care standard that provides its operators substantial legal protection from liability for what third parties post to their platforms. Also, consider even briefly that this is the standard in effect in much of Europe, and social media still exists there; it’s just less profitable and has fewer Nazis.

    • @[email protected]
      1 year ago

      I feel like you can’t really change 230; you need to instead legislate differently. There is room for more criminal liability when things go wrong, I think. But civil suits in the US can be really bogus. Like, without Section 230, someone could probably sue a Mastodon instance for turning their kid trans and win.

      • BeautifulMind ♾️
        1 year ago

        I’m with you on the legislate differently part.

        The background of Section 230(c)(2) is an unfortunate 1995 court ruling (Stratton Oakmont v. Prodigy) that held that if you moderate any content whatsoever, you should be regarded as its publisher (and therefore ought to be legally liable for whatever awful nonsense your users put on your platform). This perversely created an incentive for web forum operators (and a then-fledgling social media industry) not to moderate content at all in order to gain immunity from liability, and that in turn transformed broad swathes of the social internet into an unmoderated cesspool full of Nazis and conspiracy theories and vaccine disinformation, all targeting people with inadequate critical thinking faculties to really process it responsibly.

        The intent of 230(c)(2) was to encourage platform operators to feel safe to moderate harmful content, but it also protects them if they don’t. The result is a wild west, if you will, in which it’s perfectly legal for social media operators in the USA to look the other way when known-unlawful use of their platforms (like advertising stolen goods, sex trafficking, coordinating a coup attempt, or making porn labeled ‘underage’ searchable) goes on.

        It was probably done in good faith, but in hindsight it was naïve and carved out the American internet as a magical zone of no-responsibility.

        • @[email protected]
          1 year ago

          This is not really what 230 does. Sites still face criminal liability where needed; like, if I made a site that hosted illegal content, I could still be arrested and have my server seized. Repealing 230 would legit just let Ken Paxton launch a multi-state lawsuit suing a long list of queer Mastodon instances for “transing minors.” Without 230 it would be lawsuit land, and sites would censor anything that wasn’t cat photos in an effort to avoid getting sued. Lawsuits are expensive even when you win. If you wanna make social media companies deal with something, you gotta set up criminal liability, not repeal 230. 230 just protects sites from civil suits, not criminal ones.

    • @[email protected]
      1 year ago

      I think generally we need to regulate how algorithms work, if that’s the concern. We need actual legislation and not just lawsuit buttons. Also, Meta can slither its way out of any lawsuit; this would really only affect small Mastodon instances.