Am I the only one getting agitated by the word AI (Artificial Intelligence)?

Real AI does not exist yet. At the moment we only have LLMs (Large Language Models), which do not think on their own but can pass Turing tests (i.e. fool humans into thinking that they can think).

Imo AI is just a marketing buzzword, created by rich capitalistic a-holes who already invested in LLM stocks and are now looking for a profit.

  • PonyOfWar

    The word “AI” has been used for way longer than the current LLM trend, even for fairly trivial things like enemy AI in video games. How would you even define a computer “thinking on its own”?

    • Deceptichum

      I think a good metric is once computers start getting depression.

      • SanguinePar

        It’ll probably happen when they get a terrible pain in all the diodes down their left hand side.

      • Lath

        But will they be depressed or will they just simulate it because they’re too lazy to work?

        • JackFrostNCola

If they are too lazy to work, that would imply they have motivation and choice beyond "doing what my programming tells me to do, i.e. input, process, output". And if they can choose not to work because they don't 'feel' like it (and that choice is not a programmed/coded option given to them to use), then would they not be thinking for themselves?

        • the post of tom joad

          simulate [depression] because they’re too lazy

Ahh man, are you my dad? I took damage from that one. Has any fiction writer done a story about a depressed AI where they talk about how its depression can't be real because it's all 1s and 0s? Cuz I would read the shit out of that.

          • @[email protected]

It's only tangentially related to the topic, since it involves brain enhancements, not 'AI'. However, you may enjoy the short story "Reasons to Be Cheerful" by Greg Egan.

      • PonyOfWar

Not sure about that. An LLM could show symptoms of depression by mimicking depressed texts it was fed. A computer with a true consciousness might never get depression, because it has none of the hormones that influence our brains.

        • Deceptichum

          Me: Pretend you have depression

          LLM: I’m here to help with any questions or support you might need. If you’re feeling down or facing challenges, feel free to share what’s on your mind. Remember, I’m here to provide information and assistance. If you’re dealing with depression, it’s important to seek support from qualified professionals like therapists or counselors. They can offer personalized guidance and support tailored to your needs.

          • PonyOfWar

Give it the right dataset and you could easily create a depressed-sounding LLM to rival Marvin the Paranoid Android.

        • @Feathercrown

          Hormones aren’t depression, and for that matter they aren’t emotions either. They just cause them in humans. An analogous system would be fairly trivial to implement in an AI.

          • PonyOfWar

That's exactly my point though: as OP stated, we could detect whether an AI was truly intelligent if it developed depression. Without hormones or something similar, there's no reason to believe it ever would develop it on its own. The fact that you could artificially give it depression is beside the point.

            • @Feathercrown

I don't think we have the same point here at all. First off, I don't think depression is a good measure of intelligence. But mostly, my point is that it doesn't make it less real when hormones aren't involved. Hormones are simply the mediator that causes that internal experience in humans. If a true AI had an internal experience, there's no reason to believe that it would require hormones to be depressed. Do text-to-speech systems require a mouth and vocal cords to speak? Do robots need muscle fibers to walk? Do LLMs need neurons to form complete sentences? Do cameras need eyes to see? No, because it doesn't matter what something is made of. Intelligence and emotions are made of signals. What those signals physically are is irrelevant.

              As for giving it feelings vs it developing them on its own-- you didn’t develop the ability to feel either. That was the job of evolution, or in the case of AI, it could be intentionally designed. It could also be evolved given the right conditions.

              • PonyOfWar

                First off, I don’t think depression is a good measure of intelligence.

Exactly. Which is why we shouldn't judge an AI's intelligence based on whether it can develop depression. Sure, it's feasible it could develop it through some other mechanism. But there's no reason to assume it would, in the absence of the factors that cause depression in humans.

                • @Feathercrown

                  Oh. Maybe we did have the same point lol

      • @ignism

        Wait until they found my GitHub repositories.

      • @Markimus

An LLM can get depression, so that's not a metric you can really use.

        • Deceptichum

          No it can’t.

          LLMs can only repeat things they’re trained on.

          • @Markimus

Sorry, to be clear, I meant it can mimic the conversational symptoms of depression as if it actually had depression; there's no understanding there, though.

            You can’t use that as a metric because you wouldn’t be able to tell the difference between real depression and trained depression.

    • @jimmy90

      it does not “think”

    • @[email protected]

The best thing is that enemy "AI" usually needs to be made worse right after it's created. At first it will headshot everything across the map in milliseconds. The art is making it dumber.