• @UnderpantsWeevil
    English
    -1 points · 1 month ago

    In another thread, I was curious about the probability of reaching the age of 60 while living in the US.

    Google gave me an assortment of links: people asking similar questions on Quora, some generic actuarial data, and some totally unrelated bullshit.

    ChatGPT gave me a multi-paragraph response referencing its data sources and providing both a general life expectancy and a specific answer broken out by gender. I asked ChatGPT how it reached this answer, and it proceeded to show its work. If I wanted to verify the work myself, ChatGPT gave me source material to cross-check and the calculations it used to find the answer. Google didn’t even come close to answering the question, much less surfacing the data needed to reach an answer.
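
    For anyone who wants to do the cross-check described above, here’s a minimal sketch of the arithmetic, assuming you pull l(x) values from a period life table (for example, the SSA actuarial tables publish l(x), the expected number of survivors to age x out of 100,000 births). The l(x) numbers in the snippet are placeholders for illustration, not real data.

    ```python
    # Cross-check a "probability of reaching age 60" claim against a period
    # life table. Such tables list l(x), the number of people out of a
    # 100,000-birth cohort expected to survive to age x, so
    # P(reach age 60) = l(60) / l(0).

    RADIX = 100_000  # l(0): the synthetic birth-cohort size used by the table

    # PLACEHOLDER values for illustration only -- substitute the actual
    # l(x) column from the table you are verifying against.
    survivors = {
        0: RADIX,
        60: 88_000,  # placeholder, not a real l(60)
    }

    def prob_survive_to(age: int, table: dict[int, int], radix: int = RADIX) -> float:
        """Probability that a newborn reaches `age`, i.e. l(age) / l(0)."""
        return table[age] / radix

    if __name__ == "__main__":
        p60 = prob_survive_to(60, survivors)
        print(f"P(reach age 60) = {p60:.1%}")  # with the placeholder values: 88.0%
    ```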

    I’m as big an AI skeptic as anyone, but it can’t be denied that generic search engines have degraded significantly. I feel like I’m using Alta Vista in the 90s whenever I query Google in the modern day. The AI systems do a marginally better job than old search engines were doing five years ago, before enshittification hit with full force.

    It sucks that AI is better, but it IS better.

    • @[email protected]
      5 points · 1 month ago

      referencing its data sources

      Have you actually checked whether those sources exist yourself? It’s been quite a while since I’ve used GPT, and I would be positively surprised if they’ve managed to prevent its generation of nonexistent citations.

      • @UnderpantsWeevil
        English
        0 points · 1 month ago

        Have you actually checked whether those sources exist yourself

        When I’m curious enough, yes. While you can find plenty of “AI lied to me” examples online, they’re much harder to fish for in the application itself.

        99 times out of 100, the references are good. But those cases aren’t fun to dunk on.