• RedFox
    73
    10 months ago

    Sorry, but I love the double-sided hypocrisy here.

    Here’s a chatbot instead of a person, listen to it since we won’t take your calls. But, we don’t honor what it says!

    Thanks Canadian court for giving us a rare middle finger to the business.

    • 520
      33
      10 months ago

      Not only that, they set a precedent that will hugely discourage the use of LLM chatbots too. Great for us humans though

    • RedFox
      14
      10 months ago

      I bet they make so much money too…

      Overpaid lawyer 1: Fight this or settle?

      Overpaid lawyer 2: Let’s fight this, I have a good feeling about it…

      Overpaid lawyer 1: This won’t set a precedent or anything right…right…

    • @[email protected]
      13
      10 months ago

      Right?

      And the customer service benefit they would’ve gotten from just eating a few hundred dollars.

      But they were being extra greedy, and thinking they could establish precedent… Well they did, just not how they wanted.

  • @MeatsOfRage
    27
    10 months ago

    According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot’s misleading information because Air Canada essentially argued that “the chatbot is a separate legal entity that is responsible for its own actions,” a court order said.

    That’s some business-class horse shit right there. Glad they got taken to task over this.

  • @[email protected]
    25
    10 months ago

    Good on the guy for taking screenshots. I’m sure that if he hadn’t, and he claimed the AI chatbot told him something, the company would have mysteriously lost the logs.

  • AutoTL;DR
    1
    10 months ago

    This is the best summary I could come up with:


    On the day Jake Moffatt’s grandmother died, Moffatt immediately visited Air Canada’s website to book a flight from Vancouver to Toronto.

    In reality, Air Canada’s policy explicitly stated that the airline will not provide refunds for bereavement travel after the flight is booked.

    Experts told the Vancouver Sun that Moffatt’s case appeared to be the first time a Canadian company tried to argue that it wasn’t liable for information provided by its chatbot.

    Last March, Air Canada’s chief information officer Mel Crocker told the Globe and Mail that the airline had launched the chatbot as an AI “experiment.”

    “So in the case of a snowstorm, if you have not been issued your new boarding pass yet and you just want to confirm if you have a seat available on another flight, that’s the sort of thing we can easily handle with AI,” Crocker told the Globe and Mail.

    It was worth it, Crocker said, because “the airline believes investing in automation and machine learning technology will lower its expenses” and “fundamentally” create “a better customer experience.”


    The original article contains 906 words, the summary contains 176 words. Saved 81%. I’m a bot and I’m open source!