Company claimed its chatbot ‘was responsible for its own actions’ when giving wrong information about bereavement fare

Canada’s largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.

Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”.

Amid a broader push by companies to automate services, the case – the first of its kind in Canada – raises questions about how much oversight companies exercise over their chat tools.

  • @inb4_FoundTheVegan
    48 months ago

    Good. Let’s call it bug testing as we see what sort of deals various chatbots will agree to!