I was looking for info about the upcoming event. All I got was a guess about what might happen in the future. I can't describe how useless this search result was.

Just changed my default search engine to DuckDuckGo. Can’t be worse than this.

  • @Rhoeri · 45 · 1 month ago

    Hold up… are there actually people that expect it to be a reliable source of anything?

    • @[email protected]
      link
      fedilink
      English
      47
      edit-2
      1 month ago

      Of course they do. AI is aggressively marketed as such. Most people simply don't know that an LLM doesn't have a concept of "truth", and the misleading marketing is to blame for that.

      • AFK BRB Chocolate · 7 · 1 month ago

        I always try to explain to people that the key is the last two letters: language model. An LLM is a model of what a conversation should look like. Ask it a question and it's intended to give you a response that looks like the right kind of thing. So if you ask it for a mathematical proof, it will give you one. But unless the thing you're asking about has the same proof written the same way in lots of places online, what it gives you won't be correct, and it probably won't make sense mathematically; it will just look like the right kind of thing.

        So likewise, if you ask it for relationship advice, it’s going to give you something that looks legit, but you’re an idiot if you get your relationship advice from an LLM.

      • VindictiveJudge · 6 · 1 month ago

        Seriously, it’s just a fancy auto-complete. It knows nothing.

    • @RedditWanderer · 6 · 1 month ago

      It's supposed to be a reliable indicator of the most common chain of words that follows your chain of words.

      There are enough chains of words on the internet to do impressive stuff. The problem is when you give it dumb chains of words or chains it made itself.
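      The "chain of words" idea can be sketched with a toy bigram model: count which word most often follows each word in some text, then extend a prompt by repeatedly picking the most common continuation. (This is just an illustration of the "fancy auto-complete" intuition; real LLMs use neural networks over subword tokens, not raw word counts, and the tiny corpus here is made up.)

      ```python
      from collections import Counter, defaultdict

      # Made-up mini corpus; real models train on vastly more text.
      corpus = (
          "the cat sat on the mat . "
          "the dog sat on the rug . "
          "the cat chased the dog ."
      ).split()

      # For each word, count which words follow it.
      follows = defaultdict(Counter)
      for prev, nxt in zip(corpus, corpus[1:]):
          follows[prev][nxt] += 1

      def complete(word, length=5):
          """Greedily extend `word` with the most common next word."""
          out = [word]
          for _ in range(length):
              candidates = follows[out[-1]]
              if not candidates:
                  break  # never saw this word mid-sentence
              out.append(candidates.most_common(1)[0][0])
          return " ".join(out)

      print(complete("the"))
      ```

      Note how the model happily wanders into loops and nonsense once it starts consuming its own output, which is exactly the "chains it made itself" failure mode above.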