• @[email protected]
    7 months ago

    It’s not a monster. It doesn’t vaguely resemble a monster.

    It’s a ridiculously simple tool that does not in any way resemble intelligence and has no agency. LLMs do not have the capacity for harm. They do not have the capability to invent or discover (though if they did, that would be a massive boon for humanity, and it would also be insane to hold it back). They’re just a mediocre search tool combined with advanced parsing of requests and the ability to format output in the structure of sentences.

    AI cannot do anything on its own. If your concern is allowing AI to release proteins into the wild, obviously that is a terrible idea, but it’s already more than covered by the existing regulation of research into dangerous diseases and bioweapons. AI does not change anything about that scenario.

    • @Carrolade
      7 months ago

      I largely agree; current LLMs add no capabilities that humanity did not already possess. The point of the regulation, though, is to encourage a certain degree of caution in future development.

      Personally, I do think it’s a little overly broad; a Google search can aid in a cyberattack too. The kill-switch idea is also a little silly, and largely a waste of time dreamed up by people who have watched too many Terminator and Matrix movies. While we might eventually reach a point where that becomes a prudent idea, we’re still quite far away.

      • @[email protected]
        7 months ago

        We’re nowhere near anything that has anything in common with human-level intelligence, or that poses any threat.

        The only possible basis for supporting legislation like this is either a complete absence of understanding of what the technology is, combined with treating Hollywood as reality (the layperson, and probably most of the legislators involved), or an aggressive attempt at market control through regulatory capture by big tech. If you understand where we are and what paths forward we have, it’s very clear that this can only do harm.