
I see Google’s deal with Reddit is going just great…

  • @[email protected]
    link
    fedilink
    English
    6
    edit-2
    8 months ago

    Couldn’t that describe 95% of what LLMs do?

    It is a really good autocomplete at the end of the day; just sometimes the autocomplete gets it wrong.
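
    The “autocomplete” framing can be sketched with a toy next-word model. This is nothing like a real transformer internally; it’s just a hypothetical bigram counter over a made-up corpus, shown to illustrate how predicting the statistically likely next word can produce fluent text that was never actually in the training data:

    ```python
    # Toy next-word "autocomplete": count which word follows each word
    # in a tiny training text, then always emit the most common follower.
    from collections import Counter, defaultdict

    corpus = ("the cat sat on the mat . the dog sat on the rug . "
              "the cat chased the dog .").split()

    # Tally the follower counts for every adjacent word pair.
    follows = defaultdict(Counter)
    for a, b in zip(corpus, corpus[1:]):
        follows[a][b] += 1

    def autocomplete(word, n=5):
        out = [word]
        for _ in range(n):
            nxt = follows[out[-1]]
            if not nxt:
                break
            # Greedily pick the most frequent follower. The result is
            # locally plausible, but the assembled sentence need not
            # have appeared anywhere in the training data.
            out.append(nxt.most_common(1)[0][0])
        return " ".join(out)

    print(autocomplete("the"))  # → "the cat sat on the cat"
    ```

    The output is fluent word-by-word, yet “the cat sat on the cat” appears nowhere in the corpus — a miniature version of confidently completing text into a statement that isn’t grounded in what the model saw.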

    • @[email protected]
      link
      fedilink
      English
      38 months ago

      Yes, nicely put! I suppose ‘hallucinating’ describes the case where, to the reader, the model appears to state a fact, but that fact doesn’t correspond to anything in the training data.