• @mojofrododojo
    -1 · 6 months ago

    oh man this is hilarious. so you think if the AI is local the corpus it’s been taught upon doesn’t need to be stored?

it’s worse - you need the processing power (many, many GPUs requiring enormous amounts of power) coupled with enormous amounts of material to train the language model on.

    sorry man there aren’t any shortcuts.

  • @A_Very_Big_Fan
      0 · 6 months ago

      so you think if the AI is local the corpus it’s been taught upon doesn’t need to be stored?

      You don’t need to store the training data. What’s “hilarious” is how confidently incorrect you are.

      This, for example, is a model small enough to run on your phone that was trained on ~895GB of data.

      Even if I did need to keep all of that data, and even if I also needed to train it myself, what’s stopping me from just stealing all the equipment I need if I’m the last person on earth???
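A quick back-of-envelope shows why the corpus doesn’t have to ride along with the weights — training compresses the data into parameters, and the parameters are all you keep. (Numbers below are illustrative assumptions, not the specs of the linked model.)

```python
# Back-of-envelope: size of a small LLM's weights vs. its training corpus.
# All values are illustrative assumptions, not specs of any particular model.

params = 1.1e9          # assume a ~1.1B-parameter "phone-sized" model
bytes_per_param = 0.5   # 4-bit quantization ~= 0.5 bytes per parameter
corpus_gb = 895         # training-data size cited above

# Storage needed to *run* the model is just the weights (plus small overhead),
# not the corpus it was trained on.
weights_gb = params * bytes_per_param / 1e9
print(f"weights: ~{weights_gb:.2f} GB vs corpus: {corpus_gb} GB")
print(f"the weights are ~{weights_gb / corpus_gb * 100:.2f}% of the corpus size")
```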

    • @mojofrododojo
        0 · edited · 6 months ago

        and what use would this be to someone after the world ends?

        AGAIN THE POWER REQUIREMENTS BELLEND.

        Do you have a fusion reactor in your pocket?

        And a freshwater source the size of a lake? because you’ll need both to run the data center required to run anything USEFUL.

        phew… steal all the equipment you want, you wouldn’t be able to do shit all with it. just keeping a single data center UP would be a herculean task for a single person - without robots or trained monkeys or alien buds to help you I’m exceptionally dubious.

        I can just imagine you pushing a shopping cart of 4080ti’s through the hellscape towards a data center thinking “shit yeah I GOT this apocalypse solved” l-o-fucking-l

      • @A_Very_Big_Fan
          1 · 6 months ago

          You need that amount of power to provide that service for hundreds of millions of people simultaneously, like ChatGPT. Do you seriously think it takes that amount of equipment and power to output to a single device?

          I linked you to one that runs locally on a phone, dude. Here’s a whole list of pre-trained LLMs you can run on an average computer. 🤷‍♀️
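Rough numbers on why single-device inference is nothing like data-center scale — a quantized model only has to fit in one machine’s RAM, not serve millions of requests. (All values below are illustrative assumptions.)

```python
# Back-of-envelope: can an "average computer" hold a quantized 7B model?
# All values are illustrative assumptions.

params = 7e9            # a 7B-parameter model, common for local inference
bytes_per_param = 0.5   # 4-bit quantization ~= 0.5 bytes per parameter
overhead_gb = 2.0       # assume ~2 GB for KV cache, activations, runtime

model_gb = params * bytes_per_param / 1e9   # weights alone
needed_gb = model_gb + overhead_gb          # total working memory
laptop_ram_gb = 16                          # a typical laptop

print(f"needs ~{needed_gb:.1f} GB; fits in {laptop_ram_gb} GB RAM: "
      f"{needed_gb < laptop_ram_gb}")
```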

        • @mojofrododojo
            0 · 6 months ago

            your tiny PC and phone-based LLMs are going to be fuck-all useful after the apoc. Oh yeah “ClimateBert’s Hugging Face” sounds like just the thing to help you survive.

            the only significant advantage an LLM is going to offer is the illusion of company, and the only way you’ll get it is a giant data center.

            you’d be better off having wikipedia summarized by a chatbot, but again, it’s gonna require grunt and storage.

            just because something can be stripped down to run on any device doesn’t make it useful.

          • @A_Very_Big_Fan
              1 · 6 months ago

              So you do unironically think it takes that amount of equipment and power to output to a single device lmao

            • @mojofrododojo
                0 · 6 months ago

                I can’t tell if you’re fucking dense or can’t read.

                AN LLM RUN ON A PHONE WILL DO YOU FUCKALL GOOD.

                you uninformedly think you can run an AI worth a damn on your phone - and the corpus to teach it?

                fuck off, you stupid git. good luck with your HAL 9. You’re gonna walk through the apocalypse with a moron. Which fits, you’ll be equals.

              • @A_Very_Big_Fan
                  1 · edited · 6 months ago

                  Here’s a Raspberry Pi doing a variety of tasks with various LLMs, like programming and accurately describing a picture.

                  There’s a literal mountain of evidence of what these models can do. It’s been fun making you rage :3