• @[email protected]
    4 months ago

    It’s an LLM. It doesn’t “know” what it’s talking about. Gemini is designed to write long, nuanced answers to every question unless prompted otherwise.

    • @[email protected]
      4 months ago

      Not knowing what it’s talking about is irrelevant if the answer is correct. Humans who know what they’re talking about are just as prone to mistakes as an LLM is; some could argue they err in even more numerous ways. I don’t see the way the two work as being as different from each other as most other people here seem to.