• lurch (he/him)
      link
      fedilink
      English
      15
      9 months ago

      No, that’s an AI generated summary that bing (and google) show for a lot of queries.

      For example, if I search “can i launch a cow in a rocket”, it suggests it’s possible to shoot cows with rocket launchers and machine guns and names a shooting range that offers it. Thanks bing … I guess…

      • swope
        link
        fedilink
        7
        9 months ago

        You think the culture wars over pronouns have been bad, wait until the machines start a war over prepositions!

      • HonkyTonkWoman
        link
        fedilink
        English
        2
        9 months ago

        Purely out of curiosity… what happens if you ask it about launching a rocket in a cow?

      • @kromem
        link
        English
        2
        9 months ago

        You’re incorrect. This is being done with search matching, not by an LLM.

        The LLM answers Bing added appear in the chat box.

        These are Bing’s version of Google’s OneBox, which predated their partnership with OpenAI.

        • lurch (he/him)
          link
          fedilink
          English
          1
          9 months ago

          The box has a small i-icon that literally says it’s an AI-generated summary.

          • @kromem
            link
            English
            1
            edit-2
            9 months ago

            They’ve updated what’s powering that box, see my other response to your similar comment with the image.

          • @kromem
            link
            English
            2
            9 months ago

            Yes, they’ve now replaced the legacy system with one using GPT-4, hence the incorporation of citations in the summary, same as in the chat format.

            Try the same examples as in OP’s images.

    • wander1236
      link
      fedilink
      English
      11
      9 months ago

      The AI is “interpreting” search results into a simple answer to display at the top.

      • @lunarul
        link
        English
        3
        9 months ago

        And you can abuse that by asking two questions in one. The summarized yes/no answer will only address the first one, and you can put whatever you like in the second one, like drinking battery acid or driving drunk.

    • @kromem
      link
      English
      4
      9 months ago

      Yes. You are correct. This was a feature Bing added to match Google’s OneBox answers, and it isn’t using an LLM, but likely search matching.

      Bing shows the LLM response in the chat window.