I asked someone this question before and they said it was a really stupid question, and I’m not sure why, so I thought I would ask it here…

What’s going to happen when AI becomes really advanced? Is there a plan for what all of the displaced people are going to do? Like, for example, administrative assistants, receptionists, cashiers, office workers, white-collar people in general. Is there going to be some sort of retraining program to cross-train people into careers like nursing or other fields that haven’t yet been automated? Or are people just going to lose their homes and be evicted? Is there going to be some sort of mass eviction and homelessness downstream effect because people can’t find any work?

  • @[email protected]
    10 · 4 months ago

    My very limited understanding is simply that LLMs are not an early iteration of AGI.

    In the same way, automobiles are not an early iteration of aeroplanes. They use some of the same tech, but before there were aeroplanes no one really knew what was possible.

    It’s true that computers get faster and more amazing, but that’s not an indication that AGI is possible.

    • @Valmond
      1 · 4 months ago

      True, but that doesn’t mean AGI isn’t possible. We already have wetware that’s very intelligent, for example.

      • @[email protected]
        3 · 4 months ago

        I don’t think it’s possible to demonstrate whether or not AGI will be possible until it exists in some form.

        • @Valmond
          -1 · 4 months ago

          Why?

          Computers are getting smarter and smarter, so why not?

            • @Valmond
              0 · 4 months ago

              No one said there was evidence for it; you’re just moving the goalposts.

              • @[email protected]
                2 · 4 months ago

                My comment is a reference to the well-known axiom that you cannot provide evidence for a negative. It’s not possible to provide evidence that AGI is not possible; we must content ourselves with the lack of evidence that it is possible.