• @stoicmaverick
    English
    22
    3 days ago

    Can someone explain the Pale Horse reference?

    • @[email protected]
      English
      38
      edit-2
      3 days ago

      In the Bible’s Book of Revelation, John (the author) is witnessing the end of the world and sees four horsemen being unleashed upon the world to spread a curse/trial/whatever wherever they ride. Each horseman brings with them something different: famine, disease, war (or strife), and death. Death is the last, IIRC, and rides upon a pale horse. I think that’s what they’re referencing. This person is saying that OpenAI is going to die soon.

      • @[email protected]
        English
        7
        3 days ago

        this is correct as to the background of the term itself; the reason ed uses it here is that it’s the term he selected some months ago when he listed “some likely pale horses that signal the bubble popping”

      • @[email protected]
        English
        -4
        edit-2
        3 days ago

        it won’t. it’s backed by Microsoft. they can literally afford to burn the cash on this while it becomes profitable, and it will. AI has so many low-hanging fruits to optimize, it’s insane.

        • @[email protected]
          English
          39
          edit-2
          3 days ago

          So many low-hanging fruits. Unbelievable fruits. You wouldn’t believe how low they’re hanging.

            • @mojofrododojo
              English
              15
              3 days ago

              and we have concepts for even lower hanging fruits. beautiful concepts.

          • @[email protected]
            English
            -18
            edit-2
            3 days ago

            quants are pretty basic. switching from floats to ints (faster instruction sets) is the other well-known one. both of those are related to information theory, but there are other things I legally can’t mention. shrug. suffice it to say the model sizes are going to be decreasing dramatically.

            edit: the first two points require reworking the base infrastructure to support them, which is why they haven’t hit widespread adoption. but the research showing that 3 bits is as good as 64 is intuitive once you tie it to the original inspiration for some of the AI designs. that reduction alone means you can get a 21x reduction in model size, which is pretty solid.
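            For readers who want a concrete picture of what weight quantization means here, a minimal NumPy sketch of symmetric linear quantization (all function names and the toy tensor are invented for illustration; this is not whatever unnamed techniques the parent alludes to, and the “21x” figure is nothing deeper than the 64/3 bit-width ratio):

```python
import numpy as np

# Hypothetical sketch: map float weights onto a small integer grid
# plus one float scale factor. Names are made up for illustration.
def quantize(weights, bits):
    qmax = 2 ** (bits - 1) - 1              # 3 for 3-bit, 127 for int8
    scale = float(np.max(np.abs(weights))) / qmax
    q = np.clip(np.round(weights / scale), -qmax, qmax)
    return q.astype(np.int8), scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1_000).astype(np.float32)

q, scale = quantize(w, bits=3)
w_hat = dequantize(q, scale)

# Storage shrinks by the bit-width ratio; rounding error is the price.
print("64-bit -> 3-bit ratio:", 64 / 3)            # about 21.3
print("mean squared error:", np.mean((w - w_hat) ** 2))
```

            The quantized copy holds the same number of weights in far fewer bits, at the cost of the rounding error the replies below are pointing at.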

            • @[email protected]
              English
              24
              3 days ago

              both those are related to information theory, but there are other things I legally can’t mention. shrug.

              hahahaha fuck off with this. no, the horseshit you’re fetishizing doesn’t fix LLMs. here’s what quantization gets you:

              • the LLM runs on shittier hardware
              • the LLM works worse too
              • that last one’s kinda bad when the technology already works like shit

              anyway speaking of basic information theory:

              but the research showing that 3 bits is as good as 64 is intuitive once you tie the original inspiration for some of the AI designs.

              lol

              • @[email protected]
                English
                7
                3 days ago

                It’s actually super easy to increase the accuracy of LLMs.

                import torch  # or ollama or however you fucking dorks use this nonsense
                from decimal import Decimal
                

                I left out all the other details because it’s pretty intuitive why it works if you understand why floats have precision issues.
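                For anyone who hasn’t seen it: the float artifact the joke leans on is real, even though the fix-an-LLM-with-Decimal part obviously isn’t.

```python
from decimal import Decimal

# Binary floats can't represent 0.1 or 0.2 exactly, so:
print(0.1 + 0.2)                          # 0.30000000000000004
print(0.1 + 0.2 == 0.3)                   # False

# Decimal does exact base-10 arithmetic on string inputs:
print(Decimal("0.1") + Decimal("0.2"))    # 0.3
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```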

              • @[email protected]
                English
                12
                edit-2
                3 days ago

                Honestly, the research showing that a schlong that’s 3mm wide is just as satisfying as one that’s 64 is intuitive once you tie the original inspiration for some of the sex positions.

              • @[email protected]
                English
                10
                3 days ago

                I have seen these 3-bit AI papers on Hacker News a few times. And the takeaway apparently is: the current models are pretty shitty at what we want them to do, and we can reach a similar (but slightly worse) level of shittiness with 3 bits.

                But that doesn’t say anything about how both technologies could progress in the future. I guess you can compensate for having only three bits to pass between nodes by just having more nodes. But that doesn’t really seem helpful, neither for storage nor compute.

                Anyways, yeah, it always strikes me as the kind of trend that maybe has an application in a very specific niche but is likely bullshit if applied to the general case
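                The storage side of that “more nodes” trade-off is just arithmetic, for what it’s worth (the 7B figure below is a made-up round number for illustration):

```python
# Back-of-envelope: storage is params * bits / 8 bytes, so at a fixed
# memory budget a 3-bit model can hold ~21x the parameters of a
# 64-bit one. Whether those extra low-precision parameters help at
# all is the open question.
def model_bytes(n_params, bits):
    return n_params * bits // 8

budget = model_bytes(7_000_000_000, 64)   # hypothetical 7B-param 64-bit model
equiv_3bit_params = budget * 8 // 3       # params that fit at 3 bits
print(equiv_3bit_params / 7_000_000_000)  # about 21.3x
```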

                • @[email protected]
                  English
                  3
                  2 days ago

                  Far as I can tell, the only real benefit here is significant energy savings, which would take LLMs from “useless waste of a shitload of power” to “useless waste of power”.

                • @[email protected]
                  English
                  12
                  3 days ago

                  If anything that sounds like an indictment? Like, the current models are so incredibly fucking bad that we could achieve the same with three bits and a ham sandwich

          • @[email protected]
            English
            4
            2 days ago

            “look, Mme Karen, this is definitely not a rotten tomato. it can’t be a rotten tomato, we don’t sell rotten tomatoes. you can see here on the menu that we don’t have rotten tomatoes on offer. and see here, on your receipt, where it says quinoa salad? absolutely not rotten tomatoes!” explains the manager fervently, avoiding a tableward glance at the pungent red blob with as much will as they can muster

        • @stoly
          English
          6
          3 days ago

          They will only pay as long as it benefits them.