• @[email protected]
    link
    fedilink
    English
    43 months ago

    I’m curious to know what happens if you ask ChatGPT to make you a text adventure based on that prompt.

    Not curious enough to try it and play it myself, though.

    • @kat_angstrom
      14 · 3 months ago

      It lacks cohesion the longer it goes on: not so much “hallucinating” as losing the thread, losing the plot. Internal consistency goes out the window, previously-made declarations are ignored, and established canon gets trampled on.

      But that’s cuz it’s not AI, it’s just LLM all the way down.

      • @HeyThisIsntTheYMCA
        2 · 3 months ago

        just for my ego, how long does it take to lose the plot?

        • @kat_angstrom
          4 · 3 months ago

          Depends on complexity and the number of elements to keep track of, and varies between models and people. Try it out for yourself to see! :)

        • @CheeseNoodle
          3 · 3 months ago

          It’s kind of an exponential falloff: for a few lines it can follow concrete mathematical rules, for a few paragraphs it can remember basic story beats, and for a few pages it can just about remember your name.
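          That falloff is consistent with a finite context window: once the transcript outgrows what the model can attend to, the earliest facts simply drop out. A toy Python sketch (pure illustration with a made-up 8-token window, not any real model's internals):

          ```python
          # Toy illustration: a model only "sees" the most recent N tokens,
          # so facts established early in a long game eventually fall
          # outside the window and can get contradicted.

          CONTEXT_WINDOW = 8  # hypothetical size; real models use thousands of tokens

          def visible_context(transcript, window=CONTEXT_WINDOW):
              """Return the tail slice of the transcript the model can still see."""
              tokens = transcript.split()
              return tokens[-window:]

          transcript = ("the hero is named Ada and she carries "
                        "a silver key into the dark crypt")

          # "Ada" was established early, so it is no longer in the visible window:
          print(visible_context(transcript))
          ```

          Real systems tokenize more finely and often summarize or truncate older turns, but the effect is the same: what scrolls out of the window is gone.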

    • @[email protected]
      link
      fedilink
      English
      73 months ago

      It works okay for a while, but eventually it loses the plot. The storylines are usually pretty generic and washed out.