• @[email protected]
    105 days ago

    Yeah, and the way it will confidently give you a wrong answer instead of either asking for more information or saying it just doesn’t know is equally annoying.

    • @Jesus_666
      55 days ago

      Because giving answers is not an LLM’s job. An LLM’s job is to generate text that looks like an answer. And then we try to coax that framework into generating correct answers as often as possible, with mixed results.
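
      To make that concrete, here’s a minimal sketch (assuming the Hugging Face `transformers` package and its small `gpt2` checkpoint, purely for illustration): the decoding loop just continues the prompt with whatever tokens are most plausible, and nothing in it checks whether the resulting answer is true.

      ```python
      # Minimal sketch: plain next-token decoding with a small pretrained model.
      # Assumes the Hugging Face `transformers` package and the "gpt2" checkpoint
      # are available; the prompt is a made-up example.
      from transformers import pipeline

      generator = pipeline("text-generation", model="gpt2")

      prompt = "Q: What year did the first human land on Mars?\nA:"
      result = generator(prompt, max_new_tokens=30, do_sample=False)

      # The model will happily produce a confident-sounding year, because a year
      # is what text shaped like this usually contains -- not because one exists.
      print(result[0]["generated_text"])
      ```

      Everything that makes the output more often correct (instruction tuning, RLHF, retrieval, guardrails) is layered on top of that same plausibility-maximizing loop, which is why the failures look so confident.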