• @[email protected]
    17 • 1 month ago

    Imagine sitting down with an AI model for a spoken two-hour interview. A friendly voice guides you through a conversation that ranges from your childhood, your formative memories, and your career to your thoughts on immigration policy. Not long after, a virtual replica of you is able to embody your values and preferences with stunning accuracy.

    Okay, but can it embody my traumas?

    • @moistclump
      3 • 1 month ago

      Maybe some of the symptoms of the traumas that you exhibited during the interview.

    • @[email protected]
      3 • 1 month ago

      lol because people always behave in ways consistent with how they tell an interviewer they will.

      • Echo Dot
        1 • 1 month ago

        If I can make a version of me that likes its job, then that’s a deviation from the template worth having. Assuming this technology actually worked, an exact digital replica of me isn’t particularly useful. It would just automate the things I was going to do anyway, and if I was going to do them anyway, they aren’t really a hassle worth automating.

        What I want is a version that has all of my knowledge but infinitely more patience, and preferably one that actually understands tax law. I need an AI to do the things I hate doing, though I can see the advantage of customizing it with my values to a certain extent.