• @A_Very_Big_Fan
    link
    English
    1
    8 months ago

    You need that amount of power to provide that service for hundreds of millions of people simultaneously, like ChatGPT. Do you seriously think it takes that amount of equipment and power to output to a single device?

    I linked you to one that runs locally on a phone, dude. Here’s a whole list of pre-trained LLMs you can run on an average computer. 🤷‍♀️
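
    For scale, a back-of-envelope memory estimate (my own assumed numbers, not taken from the linked list): a 7B-parameter model quantized to 4 bits needs under 4 GB of RAM, which an average computer has to spare.

    ```python
    # Rough RAM needed to run a quantized LLM locally.
    # Assumptions (mine, for illustration): 4-bit weights (0.5 bytes per
    # parameter) plus ~10% overhead for activations and the KV cache.
    def quantized_model_size_gb(n_params_billion, bits_per_weight=4, overhead=0.10):
        bytes_total = n_params_billion * 1e9 * (bits_per_weight / 8) * (1 + overhead)
        return bytes_total / 1e9

    print(round(quantized_model_size_gb(7), 2))  # prints 3.85 (GB, for a 7B model)
    ```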

    • @mojofrododojo
      link
      English
      0
      8 months ago

      your tiny PC- and phone-based LLMs are going to be fuck-all useful after the apoc. Oh yeah “ClimateBert’s Hugging Face” sounds like just the thing to help you survive.

      the only significant advantage an LLM is going to offer is the illusion of company, and the only way you’ll get it is a giant data center.

      you’d be better off having wikipedia summarized by a chatbot, but again, it’s gonna require grunt and storage.
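
      (For scale, a rough storage budget, using my own assumed round numbers for the dump and model sizes:)

      ```python
      # Back-of-envelope: does an offline Wikipedia text dump plus a small
      # quantized model fit on cheap consumer storage? All sizes below are
      # my assumptions, order-of-magnitude only.
      wikipedia_text_dump_gb = 25   # compressed English Wikipedia, text only (assumed)
      quantized_llm_gb = 4          # e.g. a 7B model at 4-bit quantization (assumed)
      ssd_gb = 128                  # a cheap consumer SSD / SD card

      print(wikipedia_text_dump_gb + quantized_llm_gb <= ssd_gb)  # prints True
      ```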

      just because something can be stripped down to run on any device doesn’t make it useful.

      • @A_Very_Big_Fan
        link
        English
        1
        8 months ago

        So you do unironically think it takes that amount of equipment and power to output to a single device lmao

        • @mojofrododojo
          link
          English
          0
          8 months ago

          I can’t tell if you’re fucking dense or can’t read.

          A LLM RUN ON A PHONE WILL DO YOU FUCKALL GOOD.

          you’re uninformed enough to think you can run an AI worth a damn on your phone - and where’s the corpus to train it?

          fuck off, you stupid git. good luck with your HAL 9. You’re gonna walk through the apocalypse with a moron. Which fits, you’ll be equals.

          • @A_Very_Big_Fan
            link
            English
            1
            edit-2
            8 months ago

            Here’s a Raspberry Pi doing a variety of tasks with various LLMs, like programming and accurately describing a picture.

            There’s a literal mountain of evidence of what these models can do. It’s been fun making you rage :3