cross-posted from: https://lemmy.ml/post/3109500

Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items

  • @giacomo · 21 points · 1 year ago

    Sure, it can make a recipe for chlorine gas, but can it recommend a wine pairing to go with the gas?

    • TheSpookiestUserOP · 4 points · 1 year ago

      Agree. “Chatbot outputs ridiculous response when given ridiculous inputs” gets old.

      This was at least funny.

      • @Buddahriffic · 3 points · 1 year ago

        Though I would say that its spitting out recipes for things that aren’t even ingredients indicates it’s not a useful tool. It’s not basing recipe recommendations on any knowledge of food, cooking, flavours, textures, or chemistry. It seems to be arbitrarily fitting a list of ingredients into familiar recipe patterns.

        If it doesn’t understand “this isn’t a safe ingredient,” I doubt it understands much about which non-poisonous ingredients go well together, beyond ones it has seen paired in its training set.

  • @TheDoctorDonna · 6 points · 1 year ago

    The headline makes it sound as if it was just randomly suggesting this, but of course it would do that when people input non-food ingredients.

  • @[email protected] · 4 points · 1 year ago

    “…and noted that the bot has terms and conditions stating that users should be over 18.”

    We should definitely prosecute kids who poison themselves or others via use of this app.