• @kat_angstrom
    14 points · 3 months ago

    It lacks cohesion the longer it goes on; it's not so much “hallucinating” as losing the thread, losing the plot. Internal consistency goes out the window, previously made declarations are ignored, and established canon gets trampled.

    But that’s cuz it’s not AI, it’s just LLMs all the way down.

    • @HeyThisIsntTheYMCA
      2 points · 3 months ago

      just for my ego, how long does it take to lose the plot?

      • @kat_angstrom
        4 points · 3 months ago

        Depends on complexity and the number of elements to keep track of, and varies between models and people. Try it out for yourself to see! :)

      • @CheeseNoodle
        3 points · 3 months ago

        It’s kind of an exponential falloff: for a few lines it can follow concrete mathematical rules, for a few paragraphs it can remember basic story beats, and for a few pages it can just about remember your name.
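
A minimal sketch of the kind of falloff described above, purely for illustration: it models the chance that a detail from `n` tokens back is still honored as an exponential decay. The formula and the decay constant `TAU` are assumptions chosen to mirror the “lines → paragraphs → pages” progression, not measurements of any real model.

```python
import math

# Hypothetical decay scale, in tokens; not measured from any real model.
TAU = 2000.0

def recall_probability(tokens_ago: float, tau: float = TAU) -> float:
    """Assumed model: p(n) = exp(-n / tau), the chance a fact stated
    `tokens_ago` tokens back is still respected."""
    return math.exp(-tokens_ago / tau)

for label, distance in [("a few lines (~100 tokens)", 100),
                        ("a few paragraphs (~1,000 tokens)", 1_000),
                        ("a few pages (~10,000 tokens)", 10_000)]:
    print(f"{label:32s} -> {recall_probability(distance):.2f}")
```

Under these made-up numbers the output drops from roughly 0.95 at a few lines to about 0.61 at a few paragraphs and essentially 0 at a few pages, which is the shape of the curve being described.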