• Maxnmy's · 76 points · 7 months ago

    Stop plugging LLMs into everything! They are designed to make up plausible-sounding nonsense.

    • @gap_betweenus · 37 points · 7 months ago

      Seems like Facebook is the right place for them then.

    • @slaacaa · 21 points · 7 months ago

      LLMs are very useful for synthesizing information, e.g. summarizing long texts. Yet every company is actually pushing to use them to create more text, which, as you say, is at least partly nonsense.

      It shows the difference between what users need (quick access to accurate information) and what these companies want for us (eyeballs glued to the screen for as long as possible, e.g. by overwhelming you with information, regardless of its quality).
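
      For instance, a minimal sketch of that “synthesize, don’t generate” use case; call_llm() is a hypothetical stand-in for whatever model or API you actually have access to:

      ```python
      # Hypothetical helper: stands in for whatever LLM client you actually use.
      def call_llm(prompt: str) -> str:
          raise NotImplementedError("swap in a real model/API call here")

      def summarize(text: str, max_words: int = 100) -> str:
          """Ask the model to condense existing text rather than invent new text."""
          prompt = (
              f"Summarize the following text in at most {max_words} words. "
              "Use only information that appears in the text.\n\n" + text
          )
          return call_llm(prompt)
      ```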

      • @[email protected] · 4 points · 7 months ago

        Well, it can be great at making text too, but the use case has to be very good. Right now lots of companies in the B2B space are using LLMs as a middle layer for chatbots and navigation systems to enhance how they function. They are also being used to create unique lists and inputs for certain systems. However, on the consumer side the use case is pretty mixed, with a lot of big companies just muddying their offerings instead of bringing any real value.
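
        A rough sketch of that middle-layer pattern, with made-up intent names (nothing here reflects any particular product’s API): the LLM never answers the user directly, it only maps free-form text onto actions the existing system already supports.

        ```python
        import json

        # Hypothetical helper: stands in for whatever LLM client you actually use.
        def call_llm(prompt: str) -> str:
            raise NotImplementedError("swap in a real model/API call here")

        ALLOWED_INTENTS = ["open_invoices", "create_ticket", "search_docs", "talk_to_human"]

        def route(user_message: str) -> dict:
            """Map free-form user text onto one of the intents the backend understands."""
            prompt = (
                "Classify the user message into exactly one intent from this list: "
                f"{ALLOWED_INTENTS}. Reply with JSON like "
                '{"intent": "...", "query": "..."}.\n\n'
                f"User message: {user_message}"
            )
            result = json.loads(call_llm(prompt))
            if result.get("intent") not in ALLOWED_INTENTS:
                # Fall back safely if the model produces something unexpected.
                result = {"intent": "talk_to_human", "query": user_message}
            return result
        ```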

    • @stellargmite · 12 points · 7 months ago

      There is a time and place for nonsense, and this isn’t it. I guess it being plausible-sounding is the issue.

    • @Sorgan71 · -17 points · 7 months ago

      no different from what human brains do

      • @ShittyBeatlesFCPres · 22 points · 7 months ago

        Aside from knowledge, context, ability to reason, and spatial awareness.

        • @Sorgan71 · -19 points · 7 months ago

          All of those are just products of the same learning algorithm

          • @ShittyBeatlesFCPres · 7 points · 7 months ago

            Consciousness is not a computer program. Neurons don’t use binary. I’d love it if we had computers that could do squirrel things perfectly but we don’t even have that.

            • @QuaternionsRock · 0 points · 7 months ago

              I can appreciate that contemporary neural networks are very different from organic intelligence, but consciousness is most definitely equivalent to a computer program. There are two things preventing us from reproducing it:

              1. We don’t know nearly enough about how the human mind (or any mind, really) actually works, and
              2. Our computers do not have the capacity to approximate consciousness with any meaningful degree of accuracy. Floating point representations of real numbers are not an issue (after all, you can always add more bits), but the sheer scale and complexity of the brain is a big one.

              Also, for what it’s worth, most organic neurons actually do use binary (“one bit”) activation, while artificial “neurons” use a real-valued activation function for a variety of reasons, the biggest two being that (a) training algorithms require differentiable models, and (b) binary activation functions do not yield a lot of information per neuron while requiring effectively the same amount of memory.
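
              To make point (b) and the differentiability issue concrete, here is a tiny numeric sketch (assuming nothing beyond NumPy): a step (“binary”) activation outputs at most one bit and has zero gradient almost everywhere, so gradient-based training gets no signal from it, while a sigmoid stays smooth and informative.

              ```python
              import numpy as np

              def step(x):
                  # Binary activation: 0 or 1; derivative is 0 everywhere except at x = 0.
                  return (x > 0).astype(float)

              def sigmoid(x):
                  # Real-valued activation: smooth, so gradients carry useful training signal.
                  return 1.0 / (1.0 + np.exp(-x))

              def sigmoid_grad(x):
                  s = sigmoid(x)
                  return s * (1.0 - s)

              x = np.linspace(-3.0, 3.0, 7)
              print(step(x))          # only 0s and 1s: one bit of information per unit
              print(sigmoid(x))       # graded values in (0, 1)
              print(sigmoid_grad(x))  # nonzero almost everywhere, so usable for backprop
              # Note: the step outputs are still stored as 64-bit floats here, which is
              # the "same memory for less information" point above.
              ```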

            • spielhoelle · -8 points · 7 months ago

              @ShittyBeatlesFCPres @Sorgan71 well, actually those things are not so far apart. Neural networks don’t have their name by accident; the name-giving neurons work similarly to brain cells. Also, on a non-AI level, you could easily compare RAM to short-term memory, etc.

              • my_hat_stinks · 13 points · edited · 7 months ago

                The name is an analogy, neural networks do not work in the same way as biological neurons. They were designed by computer scientists, not biologists.

                RAM is so far removed from biological short term memory both in how it works and how it’s used that the comparison doesn’t even make sense. The only similarity is that they’re short term information/data stores, so it’s equally valid to compare them to a drawing in the sand of a beach.

              • @ShittyBeatlesFCPres · 3 points · 7 months ago

                A piece of friendly advice is to not say “Well, actually…” on the Internet, because that’s a meme about know-it-alls. I (and probably everyone on Lemmy) have a tendency to “Well, actually” people, and it’s one of those things where people will discount your argument before it begins.

                That aside, I do think we’re trying to model the brain using the best tools we have. I suspect the next 100 years will see a revolution in biology comparable to the huge leaps previous centuries made in understanding physics, electromagnetism, and the immune system. No one in 1900 could have foreseen us mapping the human genome.

                So: I wouldn’t be shocked if neural networks caught up with humans in our lifetimes. But we’re basically trying to reverse engineer the brain using a lot of electricity, and I doubt we’ll get to squirrel-level intelligence in my lifetime, much less human level. But who knows? “There are decades where nothing happens; and there are weeks where decades happen.” (A quote from V.I. Lenin. I don’t want to be political here, but hopefully we can all agree he made some history happen.)

            • @Sorgan71 · -10 points · 7 months ago

              Binary neurons are still neurons