• @4AV
    21 points · 9 months ago

    It doesn’t have “memory” of what it has generated previously beyond the current conversation, so the answer you get from it won’t be much better than random guessing.
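A minimal sketch of why a chat model seems to “remember” within a conversation: the client resends the entire message history on every turn, while the model itself stays stateless. `call_model` here is a hypothetical stand-in for a real API call, not any particular provider’s interface.

```python
def call_model(messages):
    # Placeholder: a real client would POST `messages` to an LLM API here.
    return f"(reply to {len(messages)} messages)"

history = []

def chat(user_text):
    history.append({"role": "user", "content": user_text})
    reply = call_model(history)  # the FULL history is sent on every turn
    history.append({"role": "assistant", "content": reply})
    return reply

chat("hello")
chat("what did I just say?")  # only works because the history was resent
```

Nothing persists between conversations: once `history` is discarded, the model has no record that the exchange ever happened.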

      • @sep
        14 points · 9 months ago

        Ignoring the huge privacy/liability issue… there are other LLMs than ChatGPT.

      • @BetaDoggo_
        1 point · 9 months ago

        The model is only trained to handle 4k tokens, roughly 2,000 words depending on complexity. Even if it had a log of everything you’d asked, it wouldn’t be able to use any of it.
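The 4k-token limit can be sketched as a sliding window over the conversation log: whatever doesn’t fit is simply dropped before the request is sent. The token count below is a rough heuristic (~4 characters per token for English text), not a real tokenizer, and `trim_to_budget` is an illustrative helper, not any library’s API.

```python
def approx_tokens(text):
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_to_budget(messages, budget=4096):
    """Keep the most recent messages whose combined size fits the budget."""
    kept, used = [], 0
    for msg in reversed(messages):  # walk newest-first
        cost = approx_tokens(msg)
        if used + cost > budget:
            break  # everything older than this point is dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order

log = ["x" * 9000, "old question", "recent question", "latest question"]
window = trim_to_budget(log, budget=2048)
# The ~2,250-token oldest entry no longer fits, so only the three
# recent messages survive into the context window.
```

This is why a log of past conversations wouldn’t help by itself: the model only ever sees what fits inside the window for the current request.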