• @[email protected]
    37 points • 3 months ago

    I think this is a very good and funny example of how AI makes stuff look right (at first glance, at least) but lacks any sense of what that stuff means.

    • P03 Locke
      2 points • 3 months ago

      Honestly, if it could get this sort of thing right, it would already have enough cognitive function for us to be scared of self-awareness.

      Following or designing step-by-step instructions requires a lot of intelligence.

  • @breadsmasher
    20 points • 3 months ago

    I'm a bit stuck on step 3, can anyone help?

  • @[email protected]
    13 points • 3 months ago

    I don’t know why, but I chuckled at this more than anything else I’ve seen on the internet today. Maybe I’m just laughing at my dumb brain trying to work it out before seeing the text and realising it was AI.

  • AsakuraMao
    8 points • 3 months ago

    Instructions unclear, penis now trapped in peanut butter jar

  • @cmhe
    6 points • 3 months ago

    A lot of 3s and no 7. Does AI have a bias on which numbers it creates?

    Like, if I generate 1000 pictures with a number between 0 and 9, would those numbers be distributed equally, or what would the distribution look like?

    Humans, when asked to say random numbers, also have biases in some circumstances, so I guess AI does too.
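    A sketch of how you could actually check that distribution question: assuming you can collect the digit each generated picture shows into a list, tallying the shares is easy. Here `random.choice` is just a uniform stand-in for the hypothetical model's output, not a real model.

```python
import random
from collections import Counter

def digit_distribution(samples):
    """Fraction of the samples that landed on each digit 0-9."""
    counts = Counter(samples)
    total = len(samples)
    return {d: counts.get(d, 0) / total for d in range(10)}

# Stand-in for the hypothetical image generator: a uniform pseudo-random pick.
# A real model's outputs would replace this list.
samples = [random.choice(range(10)) for _ in range(1000)]

for digit, share in sorted(digit_distribution(samples).items()):
    print(f"{digit}: {share:.1%}")  # a truly uniform source hovers near 10% each
```

    A biased model would show some digits well above 10% and others near zero, which is exactly the "lots of 3s, no 7" pattern in the image.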

    • @hinterlufer
      5 points • 3 months ago

      When I asked Gemini to randomly arrange the numbers between 4 and 27, it spat out a seemingly correct list of numbers, with the issue that 23 was randomly missing.

    • Iapar
      5 points • 3 months ago

      Yup, as AI is trained on humans, it will inherit our biases. That’s one of the biggest problems to solve: “If we train our AI on 4chan posts, how do we make it not racist/sexist/etc.?”

    • @jacksilver
      2 points • 3 months ago

      LLM-based technology has been shown to have biases in randomness; there was an article a while back experimenting with coin flips that showed a lack of true randomness.

      It’s because it’s about token prediction, so there’s a forced priority behind the scenes, whether or not that’s visible to the user.

      It’s the same reason why, when you ask an image generator to create “a person from India”, you get a man in a turban the majority of the time.
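      The coin-flip experiments mentioned above can be reproduced in miniature. This is just a sketch: the `answers` list is made-up data standing in for real model transcripts, and the 70/30 skew is illustrative, not a measured result.

```python
from collections import Counter

def flip_bias(flips):
    """Share of 'heads' in a sequence of coin-flip answers."""
    counts = Counter(flips)
    return counts.get("heads", 0) / sum(counts.values())

# Made-up transcript standing in for a model's answers to "flip a coin".
# The skew is illustrative; a fair coin would sit near 50%.
answers = ["heads"] * 70 + ["tails"] * 30
print(f"heads share: {flip_bias(answers):.0%}")  # prints: heads share: 70%
```

      Run the same tally over a real model's answers and any forced priority in the token prediction shows up as a share that drifts away from 50%.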

    • @[email protected]
      1 point • 3 months ago

      4 also appears 3 times, but that number 3 isn’t always a number 3 - especially the bottom right kinda looks like a negative 3, and the one left of it…

  • @[email protected]
    0 points • 3 months ago

    How could an AI know you shouldn’t combine PB with J, unless the J is homemade preserves? Did anyone tell the AI about homemade fucking preserves? Garbage in, garbage out, y’all!