• @pavnilschanda
    36 months ago

    They become less useful when users rely on the LLMs to avoid hallucinations as much as possible