• @RGB3x3
      29
      7 months ago

      I just tried to have Gemini navigate to the nearest Starbucks and the POS found one 8hrs and 38mins away.

      Absolute trash.

      • @RGB3x3
        18
        7 months ago

        Just tried it with Target and again, it’s sending me to Raleigh, North Carolina.

        • @captainlezbian
          12
          7 months ago

          It seems to think you need to leave Alabama but aren’t ready for a state as tolerable as Georgia

          • @RGB3x3
            7
            7 months ago

            I would totally leave if the “salary to cost of living” ratio wasn’t so damn good.

            I’d move to Germany or the Netherlands or Sweden or Norway so fast if I could afford it.

        • @[email protected]
          5
          7 months ago

          That leads me to believe it thinks you are in North Carolina. Have you given Gemini access to your location? Are you on a VPN?

          • @RGB3x3
            5
            7 months ago

            No VPN, it all has proper location access. I even tried it with a local restaurant that I didn’t think was a chain, and it found one in Tennessee. I’m like 10 minutes away from where I told it to go.

    • IndiBrony
      20
      7 months ago

      Despite that, it delivers its results with much applum!

  • @[email protected]
    63
    7 months ago

    Some “AI” LLMs resort to light hallucinations. And then ones like this straight-up gaslight you!

    • @eatCasserole
      50
      7 months ago

      Factual accuracy in LLMs is “an area of active research”, i.e. they haven’t the foggiest how to make them stop spouting nonsense.

      • @[email protected]
        28
        7 months ago

        DuckDuckGo figured this out quite a while ago: just fucking summarize Wikipedia articles and link to the precise section the text was lifted from.

      • @[email protected]
        12
        7 months ago

        Because accuracy requires that you make a reasonable distinction between truth and fiction, and that requires context, meaning, understanding. Hell, full humans aren’t that great at this task. This isn’t a small problem; I don’t think you solve it without creating AGI.

    • @Jimmyeatsausage
      7
      7 months ago

      MFer accidentally got “plum” right and didn’t even know it…

  • Margot Robbie
    39
    7 months ago

    Ok, let me try listing words that end in “um” that could be (even tangentially) considered food.

    • Plum
    • Gum
    • Chum
    • Rum
    • Alum
    • Rum, again
    • Sea People

    I think that’s all of them.

    • @[email protected]
      6
      7 months ago

      The Sea Peoples who were consumed by the Late Bronze Age collapse (or were a catalyst thereof)?

      Or just people at sea eaten by krakens? Cause they definitely count.

    • M137
      2
      7 months ago

      Bum
      Cryptosporidium
      Cum
      Opium
      Possum
      Scrotum
      Scum

      All very Yum for the Tum!

  • @paddirn
    30
    7 months ago

    And yet it doesn’t even list ‘Plum’, or did it think ‘Applum’ was just a variation of a plum?

    • @TexasDrunk
      9
      7 months ago

      Well, plum originally comes from applum which morphed into a plum so yeah.

      And that’s absolutely not true.

        • @Couldbealeotard
          3
          7 months ago

          Did you know that “factoid” means a small piece of trivia? /S

          • @TexasDrunk
            1
            7 months ago

            A lot of folks on the internet don’t get even the most obvious jokes without some sort of sarcasm indicator, because some things are really hard to read in text versus in person. LLMs have no idea what the hell sarcasm is, and there’s definitely some in their training data, especially if they were trained on any of my old Reddit comments.

  • @chuckleslord
    29
    7 months ago

    Totally reproducible, just with slightly different prompts.

    • @[email protected]
      27
      7 months ago

      There’s going to be an entire generation of people growing up with this and “learning” this way. It’s like every tech company got together and agreed to kill any chance of smart kids.

      • @[email protected]
        11
        7 months ago

        Isn’t it the opposite? Kids see so many examples of obviously wrong answers that they learn to check everything.

        • @Maalus
          6
          7 months ago

          How do they know something is obviously wrong when they’re trying to learn it? For “bananum”, sure, but for anything at school or college?

          • @[email protected]
            1
            7 months ago

            The bananum was my point. Maybe as AI improves there won’t be as many of these obviously wrong things, but as it stands virtually any Google search gets a shitty wrong answer from AI, so they see tons of this bad info well before college.

  • shininghero
    17
    7 months ago

    Strawberrum sounds like it’ll be at least 20% ABV. I’d like a nice cold glass of that.

  • @moistclump
    12
    7 months ago

    Applum bananum jeans, boots with the fur.

  • Sips'
    11
    7 months ago

    It’s crazy how bad AI gets if you make it list names ending with a certain pattern. I wonder why that is.

    • @bisby
      12
      7 months ago

      I’m not an expert, but it has something to do with full words vs. partial words. It also can’t play Wordle because it doesn’t have a proper concept of individual letters in that way; it’s trained to only handle full words.

      • @[email protected]
        3
        7 months ago

        They don’t even handle full words; it’s just arbitrary groups of characters (including spaces and other stuff like apostrophes, afaik) that are represented to the software as indexes in a list. It literally has no clue what language even is; it’s a glorified calculator that happens to work on words.
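
        To make that concrete, here’s a toy sketch (made-up vocabulary and greedy matching, not any real model’s tokenizer): text gets carved into vocabulary pieces and handed to the model as integer IDs, so a question about the letters inside a word is about something the model never directly sees.

            # Toy tokenizer: made-up vocabulary, greedy longest-match splitting.
            # Real LLMs use BPE-style subword vocabularies with tens of thousands of
            # entries, but the principle is the same: the model only ever sees the IDs.
            VOCAB = ["straw", "berry", "ban", "ana", "app", "le", "um", " "]

            def tokenize(text: str) -> list[int]:
                ids, i = [], 0
                while i < len(text):
                    for piece in sorted(VOCAB, key=len, reverse=True):
                        if text.startswith(piece, i):
                            ids.append(VOCAB.index(piece))
                            i += len(piece)
                            break
                    else:
                        raise ValueError(f"no token covers {text[i]!r}")
                return ids

            def detokenize(ids: list[int]) -> str:
                return "".join(VOCAB[i] for i in ids)

            print(tokenize("strawberry"))              # [0, 1]: two IDs, no letter "r" in sight
            print(detokenize(tokenize("strawberry")))  # strawberry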

        • @SpacetimeMachine
          1
          7 months ago

          I mean, isn’t any program essentially a glorified calculator?

          • @[email protected]
            1
            7 months ago

            Not really, a basic calculator doesn’t tend to have variables and stuff like that.

            I say it’s a glorified calculator because it’s just getting input in the form of numbers (again, it has no clue what a language or word is) and spitting back out some numbers that are then reconstructed into words, which is precisely how we use calculators.

    • @[email protected]
      4
      7 months ago

      It can’t see what tokens it puts out; you would need additional passes over the output for it to get it right. That’s computationally expensive, so I’m pretty sure that didn’t happen here.

      • @Jesusaurus
        1
        7 months ago

        With the amount of processing it takes to generate the output, a simple pass over the to-be-final output would make sense…
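
        For the “ends in -um” requirement, that pass could be as cheap as a few string checks. A minimal sketch, with a made-up candidate list; note that it only enforces the format and can’t tell you “applum” isn’t a real fruit:

            # Hypothetical post-generation check: keep only candidates that satisfy the
            # stated constraint (ends in "um"). Cheap compared to generating the text,
            # but it validates form only, not whether the word actually exists.
            candidates = ["Applum", "Bananum", "Strawberrum", "Plum", "Chum", "Mango"]

            def ends_in_um(word: str) -> bool:
                return word.strip().lower().endswith("um")

            print([w for w in candidates if ends_in_um(w)])
            # ['Applum', 'Bananum', 'Strawberrum', 'Plum', 'Chum']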

      • @[email protected]
        1
        7 months ago

        Doesn’t it work literally by passing in everything it has said so far to determine what the next word is?
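
        Roughly, yes: generation loops one token at a time, feeding everything produced so far back in as context. But that context is a list of token IDs, not letters, which is the point above. A toy sketch of the loop (the stand-in “model” is just a seeded random pick, not a real LLM API):

            import random

            VOCAB = ["app", "lum", "ban", "anum", " ", "."]
            END_ID = VOCAB.index(".")

            def toy_next_token(context_ids: list[int]) -> int:
                # Stand-in for the neural net: given every ID generated so far, pick the
                # next ID. A real model does this with learned probabilities instead.
                random.seed(sum(context_ids) + len(context_ids))
                return random.randrange(len(VOCAB))

            ids = [VOCAB.index("app")]                  # the "prompt"
            while ids[-1] != END_ID and len(ids) < 12:
                ids.append(toy_next_token(ids))         # context grows every step

            print("".join(VOCAB[i] for i in ids))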

  • @riodoro1
    8
    7 months ago

    AI is truly going to change the world.

  • @RizzRustbolt
    6
    7 months ago

    Looks like someone set Google to “Herakles Mode”.

  • @[email protected]
    4
    7 months ago

    Ok, I feel like there have been more than enough articles explaining that these things don’t understand logic. Seriously. Misunderstanding their capabilities at this point is getting old. It’s time to start making stupid painful.