• @pavnilschanda
      37 months ago

      They become less useful when users rely on the LLMs to avoid hallucinations as much as possible