Jake Moffatt was booking a flight to Toronto and asked the bot about the airline’s bereavement rates – reduced fares provided in the event someone needs to travel due to the death of an immediate family member.

Moffatt said he was told that these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and submitted a screenshot of his conversation with the bot as evidence supporting this claim.

The airline refused the refund because, it said, its policy was that bereavement fares could not, in fact, be claimed retroactively.

Air Canada argued that it could not be held liable for information provided by the bot.

  • @[email protected]
    link
    fedilink
    10510 months ago

    It’s your fucking system. You’re liable for what it says.

I got a bereavement fare when my father died the night before my flight to see him. The phone agent did this without my asking for it. Humans good, chatbots bad.

    • Kevin · 19 points · 10 months ago

      Oh man, moments like this restore my faith in humanity. I am sorry for your loss.

    • @Nouveau_Burnswick · 5 points · 10 months ago

      I got offered a bereavement fare that was higher than the Google flights posted fare, and limited to Mon-Thurs flights.

  • @[email protected]
    link
    fedilink
    7010 months ago

    Air Canada probably spent more fighting this claim than it would have just issuing payment when the chatbot logs were sent in.

    • @[email protected]OP
      link
      fedilink
      5810 months ago

      I wonder how anyone in their right mind would propose the defense “we can’t be held liable for what the chatbot we purposefully put on our website said”. Did Air Canada’s lawyers truly think this would fly?

      If you don’t want to be held to AI hallucinations, don’t put an AI chatbot on your website, seems easy enough.

      • @[email protected]
        link
        fedilink
        English
        1810 months ago

        My organization won’t even allow auto-translation widgets on our site. Instead, we refer people to web translation services on their own, with clear language saying that we’re not liable for third-party mistranslations. (In multiple languages, written by a company that has signed an indemnity agreement with us in case their translation becomes an issue.)

        It’s a bit heavy-handed, but the lawyers hold more sway than the communications folks, and I don’t disagree with the approach – you don’t want users misunderstanding what your site says, and being able to blame you for it.

      • Drusas · 14 points · 10 months ago

        Probably not, but they’re paid to try their best.

    • TheHarpyEagle · 10 points · 10 months ago

      Surely they’re scared of more people realizing that saving these chats is important. How else will they get away with scummy practices?

    • Drusas · 9 points · 10 months ago

      I am completely certain that’s the case. For them, this is more about precedent.

  • Zellith · 58 points · 10 months ago (edited)

    Air Canada argued that it could not be held liable for information provided by the bot

    Lol. Of course they’d say that. Perhaps hire people? Or would they also argue they couldn’t be held liable for their mistakes and misinformation?

    • @[email protected]
      link
      fedilink
      1610 months ago

      They’re trying to cheap out on real human support personnel. Chatbots are clearly not a suitable replacement.

      Fuck’em.

    • @SlopppyEngineer · 16 points · 10 months ago

      If they get the precedent that they are not responsible for what the AI chatbot says, then this goes for any chatbot on any site and they all become worthless. Every chatbot gets a disclaimer basically saying “this thing is a dirty liar and nothing it says matters.” People will start calling human customer service to confirm what the chatbot said, and the savings in employee costs are gone.

      Seems a bad long term strategy.

      • @[email protected]
        link
        fedilink
        1010 months ago

        Seems a bad long term strategy.

        It’s not a long term strategy. The person who made this decision is thinking about their quarterly or yearly bonus. By the time the problems hit, they’ve long since cashed out.

      • @[email protected]
        link
        fedilink
        710 months ago

        CS will be a multi modal chatbot too, just with a voice. I don’t think they want any human support at all. To a business, the only reason overhead exists is to cut it, and support has always been overhead.

      • @[email protected]
        link
        fedilink
        English
        610 months ago

        Way, way fewer people will call CS than will just ignore the warning.

        Once we become acclimated to things like this, we stop complaining, and let the greedy fuckers win.

  • @Skullgrid · 50 points · 10 months ago

    Air Canada argued that it could not be held liable for information provided by the bot.

    “The (probably legally required) system we set up just straight up lied; not our fault.”

    • @Skullgrid · 6 points · 10 months ago

      https://lemmy.world/comment/7546839

      I am assuming the customer should legally have a way to contact a company.

      Companies try to make this obligation cost less and less by using automation and self-service.

      Source: worked on the customer service platform for a Fortune 500 company.

  • @SamuelRJankis · 42 points · 10 months ago

    It’s amazing that a 7-billion-dollar company goes to court to fight someone over $800, aside from obviously being in the wrong.

    …awarding $650.88 in damages for negligent misrepresentation.

    $36.14 in pre-judgment interest and $125 in fees

    • @[email protected]
      link
      fedilink
      40
      edit-2
      10 months ago

      They’re not fighting for the $800. They’re fighting for the right to continue to use their shitty chatbot to reduce their support staff costs while not being liable for any bullshit it tells people.

      There will be cases like this in every jurisdiction.

      • @[email protected]
        link
        fedilink
        810 months ago

        Exactly.

        If the court had found any other way, then any time the chatbot made a mistake, they could just wash their hands of it and let the consumer take the hit.

        This means they are responsible for what the chatbot says, which is at least moderately sane.

        • @[email protected]
          link
          fedilink
          410 months ago

          If the court had found any other way, then any time the chatbot made a mistake, they could just wash their hands of it and let the consumer take the hit.

          It would have been just a matter of time before the chatbot started making “mistakes” that financially benefited the company more and more.

          This means they are responsible for what the chatbot says, which is at least moderately sane.

          Does this decision carry any precedent? It was a tribunal, not a court.

    • @[email protected]
      link
      fedilink
      1610 months ago

      Nothing to do with the money and everything to do with the precedent. Glad it didn’t work out for them.

    • @jasep · 20 points · 10 months ago (edited)

      Chatbot = basically free

      Yearly employee = probably about $100k incl benefits

      Should they hire a person? Absolutely. Will they if they can get away with it? Aww hell naw.

      • RBG · 7 points · 10 months ago

        You are pretty optimistic they would pay $100k for this job. It is probably far less, which makes this even worse: they are not saving that much money, really.

        • @[email protected]
          link
          fedilink
          410 months ago

          It costs your employer roughly 30% more to employ you than what you earn, so hiring someone for $75k will usually cost a company somewhere around $100k.

        • @jasep · 3 points · 10 months ago

          I didn’t say they would pay $100k. I said it would cost probably $100k including benefits: a full-time employee’s cost isn’t just their salary or hourly wage. There’s a lot of overhead cost to employing a person at a large company. Also keep in mind this is for a Canadian, so don’t be thinking in USD. In CAD in the Toronto area (for example), it isn’t unreasonable to think even a first-line, phone-based customer service agent’s salary would be between $65–70k, with the employer expenses on top of that.

            • @jasep · 0 points · 10 months ago

              Well, you’re just wrong about that. I worked for a company that employed front-line customer service in Toronto and paid $80k… in 2017! I’m not saying Air Canada does (I have no idea what AC wages are), but if their agents are part of a union, it’s definitely possible.

              But my point stands - if anyone thinks these companies are going to ‘do the right thing’ and hire real people when these AI chatbots exist and are so cheap, it’s just not going to happen.

        • Deceptichum · 3 points · 10 months ago (edited)

          It’s probably half that, but a chatbot can serve thousands of users whereas an employee can manage a few at a time.

          • @[email protected]
            link
            fedilink
            510 months ago

            Confirmed. As someone who has led customer operations at large companies, the scale of chatbots to address a userbase is absurd. Companies are more than willing to take the hit to their reputation and customer goodwill in exchange for not needing to hire as much staff, train them, manage their schedules or deal with benefits and performance reviews. Cutting all that cost is an instaboner to execs and a nightmare to support managers who actually care about quality.

            The number of $700 judgements that Air Canada would need to be hit with to make replacing humans with chatbots a losing proposition is too high. It’ll never happen.

            Sadly, in my decade of experience, I’ve yet to see any bots able to reliably handle much beyond ‘where’s my order?’.

        • Drusas · 5 points · 10 months ago

          Doesn’t matter to their shareholders. They’ll have already made their money.

    • @ikidd · 5 points · 10 months ago

      And they’ll cancel a flight on the slightest pretext. I’ve gone back to the hotel and waited three days, watching from the room window as flight after flight from other carriers took off.