Air Canada appears to have quietly killed its costly chatbot support.

  • @[email protected]
    92 · 10 months ago

    What I find most stupid about all of this is that Air Canada could just have admitted the mistake and payed the refund of ~450 USD, which is basically nothing to them. It would have waisted no one’s time and made for good customer service and positive feedback. Then quietly fix the AI in the background and move on. Instead they now spend waaayy more money on legal fees, expensive lawyers, and employee salaries, have a disabled AI, customer backlash, and bad press, all costing them many hundreds of thousands of dollars. So stupid.

    • threelonmusketeers
      52 · 10 months ago

      payed

      Paid. Something something “payed” is only for nautical rope or something.

      waisted

      Wasted. Something something “waisted” is only for dressmaking or something.

      I can’t remember the details of what that bot says, but it is something along these lines. I am not a bot, and this action was performed manually. Cheers!

      • @[email protected]
        26 · 10 months ago

        Thanks. I do know, but I’m slightly dyslexic and English is not my first language, so it’s hard for me to catch my own mistakes, while I can easily spot them when others make them. Also, autocorrect is a blessing and a curse for me sometimes.

        • arefx
          17 · edit-2 · 10 months ago

          Even best-selling authors make these mistakes, and most people don’t have an editor proofreading their off-the-cuff Reddit/Lemmy comments.

          • @[email protected]
            1 · 10 months ago

            I think it’s crazy that your comment is true right now, but we are also just on the cusp of it being 100% possible to have every one of your Lemmy comments proofread and edited by an LLM “editor”.
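
            Something like this would already be enough for a first pass, as a rough sketch assuming an OpenAI-style chat completions client (the model name and prompt below are just placeholders, not anything Lemmy actually ships):

                # Hypothetical sketch: run a comment through an LLM "editor" before posting.
                from openai import OpenAI

                client = OpenAI()  # reads OPENAI_API_KEY from the environment

                def proofread(comment: str) -> str:
                    """Return the comment with spelling and grammar fixed, meaning unchanged."""
                    response = client.chat.completions.create(
                        model="gpt-4o-mini",  # placeholder model name
                        messages=[
                            {"role": "system",
                             "content": "Fix spelling and grammar only; keep the author's tone and meaning."},
                            {"role": "user", "content": comment},
                        ],
                    )
                    return response.choices[0].message.content

                print(proofread("It would have waisted no one's time and payed off."))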

    • @[email protected]
      10 · 10 months ago

      Test case.

      Like whoever wrote the underlying bot (ChatGPT?) doesn’t want a precedent saying the bot is liable, so they will invest huge resources into this one case.

      They probably settled thousands of cases waiting for this one to come up, thinking this one had the right characteristics.

      • @[email protected]
        7 · 10 months ago

        You’d think they’d have picked a better case, then. They lost in the court of public opinion as soon as it was about bereavement, and their argument that the chatbot on their own site is a separate legal entity they aren’t responsible for is pants-on-head stupid.

        In a way, we should be grateful they bungled it and were held liable; other companies may be held to the same standard in the future.