• @pavnilschanda
      39 months ago

      They become less useful when users rely on LLMs while trying to avoid hallucinations as much as possible