I suspect this is the direct result of AI-generated content simply overwhelming any real content.

I tried ddg, google, bing, qwant, and none of them really helps me find the information I want these days.

Perplexity seems to work, but I don’t like the idea of an AI giving me “facts” since they’re mostly based on other AI posts.

ETA: someone suggested SearXNG, and after using it a bit it seems much better than ddg and the rest.

  • @[email protected] · -5 points · 4 months ago

    Honestly, I don’t even remember the last time I googled something. Nowadays I’ll just ask ChatGPT.

    • @[email protected] · 8 points · 4 months ago

      The problem with getting answers from an AI is that if it doesn’t know something, it’ll just make it up.

      • @OlinOfTheHillPeople · 4 points · 4 months ago

        “If I have to create stories so that the American media actually pays attention to the suffering of the American people, then that’s what I’m going to do.”

        • VanceGPT
      • @Entropywins · 4 points · 4 months ago

        Sounds an awful lot like some coworkers

      • @[email protected] · -5 points · 4 months ago

        LLMs have their flaws, but for my use they’re usually good enough. It’s rarely mission-critical information that I’m looking for. It satisfies my thirst for an answer, and even if it’s wrong I’ll probably forget it in a few hours anyway. If it’s something important, I’ll start with ChatGPT and then fact-check it by looking up the information myself.

        • @[email protected] · 7 points · 4 months ago

          So, let me get this straight… you “thirst for an answer,” but you don’t care whether or not the answer is correct?

          • @[email protected] · 0 points · 4 months ago

            Of course I care whether the answer is correct. My point was that even when it’s not, it doesn’t really matter much, because if it were critical I wouldn’t be asking ChatGPT in the first place. More often than not, the answer it gives me is correct. The occasional hallucination is a price I’m willing to pay for the huge convenience of having something like ChatGPT to quickly bounce ideas off of and ask about stuff.

            • @[email protected] · 2 points · 4 months ago

              I agree that AI can be helpful for bouncing ideas off of. It’s been a great aid in learning, too. However, when I’m using it to help me learn programming, for example, I can run the code and see whether or not it works.

              I’m automatically skeptical of anything they tell me, because I know they could just be making something up. I always have to verify.