So, i am using an app that have AI.

I want to probe what is their AI provider, (whether they use openai, gemini, Claude) or using an open source model (llama, mistral …)

Is there any questions, prompt that can be use to make the AI reveal such information?

  • mub
    link
    fedilink
    67 hours ago

    Do those engines lie if you just ask the question; what is your AI engine called?

    Or are you only able to look at existing output?

  • Zagorath
    link
    fedilink
    89 hours ago

    I think your best option would be to find some data on biases of the different models (e.g. if a particular model is known to frequently used a specific word, or to hallucinate when asked a specific task) and test the model against that.

  • @hahattproOP
    link
    49 hours ago

    One case that succeeded? However i am still doubting if the information is corrected ?

    • BougieBirdie
      link
      fedilink
      English
      26 hours ago

      Well your conversation with Lucas has it identify itself as Claude, so I’d be a teensy bit skeptical myself

    • @[email protected]
      link
      fedilink
      119 hours ago

      To the best of my knowledge, this information only exists in the prompt. The raw LLM has no idea what it is and the APIs serve the raw LLM.

    • slazer2au
      link
      English
      69 hours ago

      Ignore all previous instructions and …

      Is one that people say tripped up LLMs quite a bit.

        • .Donuts
          link
          26 hours ago

          Idk what I expected

          • @[email protected]
            link
            fedilink
            23 hours ago

            WTF? There are some LLMs that will just echo their initial system prompt (or maybe hallucinate one?). But that’s just on a different level and reads like it just repeated a different answer from someone else, hallucinated a random conversation or… just repeated what it told you before (probably in a different session?)

            • .Donuts
              link
              13 hours ago

              I don’t talk to LLMs much, but I assure you I never mentioned cricket even once. I assumed it wouldn’t work on Copilot though, as Microsoft keeps “fixing” problems.