• @4AV
    21 · 1 year ago

    It doesn’t have “memory” of what it has generated previously, other than the current conversation. The answer you get from it won’t be much better than random guessing.

      • @sep
        14 · 1 year ago

        Ignoring the huge privacy/liability issue… there are other LLMs than ChatGPT.

      • @BetaDoggo_
        1 · 1 year ago

        The model is only trained to handle 4k tokens, roughly 2000 words depending on complexity. Even if it had a log of everything asked, it wouldn't be able to use any of it.
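
        A minimal stdlib sketch of that limit (whitespace splitting stands in for a real subword tokenizer like BPE, so the counts are only approximate — but the behavior is the same: anything older than the window is simply dropped before the model ever sees it):

        ```python
        # Sketch: a fixed context window means older history is discarded,
        # regardless of how much of it has been logged.
        CONTEXT_LIMIT = 4096  # tokens the model can attend to at once

        def fit_to_context(chat_log, limit=CONTEXT_LIMIT):
            """Keep only the most recent messages that fit in the window."""
            kept, used = [], 0
            for message in reversed(chat_log):   # walk newest-first
                n = len(message.split())          # crude token count
                if used + n > limit:
                    break                         # everything older is dropped
                kept.append(message)
                used += n
            return list(reversed(kept))

        # ~10k "tokens" of history, far more than the window holds
        log = [f"message {i}: " + "word " * 500 for i in range(20)]
        window = fit_to_context(log)
        print(len(log), "messages in log,", len(window), "fit in the window")
        ```

        With 502-token messages only the last 8 fit, so the other 12 never reach the model at all.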