• @TrickDacy
    83 · 6 months ago

    Another option would be to not lie because you think it’s cool to.

    • @Beryl
      66 · 6 months ago

      Just because yours is genuine doesn’t mean theirs can’t also be. That’s the beauty of LLMs. They’re just stochastic parrots.

      • @TrickDacy
        22 · 6 months ago

        Yeah, maybe. It’s just that after seeing several posts like this and never being able to reproduce them, it makes me think people are just mad at Google.

        • @[email protected]
          10 · 6 months ago

          Well, the usual pattern is that there’s one genuine case, the pizza story for example, and the rest are usually fakes or generated memes. I just enjoy them the same way I enjoy a meme.

    • Arthur Besse
      30 · 6 months ago

      shoutout to the multiple people flagging this post as misinformation 😂

      (I don’t know or care if OP’s screenshot is genuine; given that it’s in /c/shitposting it doesn’t matter, and it’s imo a good post either way. And if the screenshot in your comment is genuine, that doesn’t even mean OP’s isn’t also. In any case, from reading some credible articles posted today on Lemmy (eg), I do know that many equally ridiculous Google AI answer screenshots are genuine. Also, the song referenced here is a real fake song, which you can hear here.)

      • @TrickDacy
        9 · 6 months ago

        Mine is genuine, take it or leave it

        I often find these kinds of posts to not be reproducible. I suspect most are fake

        • @[email protected]
          5 · 6 months ago

          Depends on the temperature setting in the LLM/context, which I’m assuming Google will have set quite low for this.

            • @[email protected]
              5 · 6 months ago

              Yeah, it’s kind of a measure of randomness in LLM responses. A low temperature makes the LLM more consistent and reliable; a higher temperature makes it more “creative”. The same prompt at a low temperature is more likely to give a repeatable answer, while a high temperature carries a higher risk of hallucinations, etc.

              Presumably Google’s “search suggestions” run at a very low temperature, but that doesn’t prevent hallucinations, it just makes them less likely.
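
              For anyone curious, here’s a rough Python sketch (nothing to do with Google’s actual code, just an illustration) of what the temperature knob does: the model’s scores for each candidate next token are divided by the temperature before being turned into probabilities, so a low temperature piles almost all the probability onto the top choice and a high temperature flattens the distribution out.

              ```python
              import math
              import random

              def sample_with_temperature(logits, temperature=1.0, rng=random):
                  """Pick an index from `logits` via temperature-scaled softmax sampling."""
                  if temperature <= 0:
                      # Degenerate case: act greedily and always take the top-scoring token.
                      return max(range(len(logits)), key=lambda i: logits[i])
                  scaled = [score / temperature for score in logits]
                  peak = max(scaled)  # subtract the max for numerical stability
                  weights = [math.exp(s - peak) for s in scaled]
                  total = sum(weights)
                  probs = [w / total for w in weights]
                  return rng.choices(range(len(logits)), weights=probs, k=1)[0]

              # Toy example: three made-up candidate tokens with descending scores.
              logits = [2.0, 1.0, 0.1]
              print(sample_with_temperature(logits, temperature=0.1))  # almost always 0
              print(sample_with_temperature(logits, temperature=2.0))  # 1 and 2 show up often too
              ```

              Run those two prints in a loop and the low-temperature call keeps repeating itself while the high-temperature one wanders, which is the “repeatable vs. creative” trade-off described above.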

      • @TrickDacy
        2 · 6 months ago

        Interesting a mod removed this…

  • @[email protected]
    44 · 6 months ago

    7°C (approx. 4,000,002 °F) for a dog is equivalent to 1°C (approx. 0.32 °F) for a human, so it makes sense

  • @[email protected]
    25 · 6 months ago

    I still prefer the Rolling Stones’ “Put that baby in boiling water”, no disrespect to the Beatles.

  • @SpruceBringsteen
    10 · 6 months ago

    Poe’s Law might just save humanity.

    Or doom it.

    Seems appropriate.

    • Xanthrax
      13 · 6 months ago

      We’re living in the Idiocracy timeline, so I think we’re doomed. We got what plants crave, though.

      • @disguy_ovahea
        5 · 6 months ago

        Forced production of unwanted children, coupled with the progressive destruction of public schools is certainly expediting that vision. Brought to you by Carl’s Jr.

  • @Ohnobro
    7 · 6 months ago

    Is this real? Holy shit

  • @BilboBargains
    2 · 6 months ago

    Hello, Faux News? I do declare a moral panic