• RedFox@infosec.pub · 8 months ago

    Sorry, but I love the double-sided hypocrisy here.

    Here’s a chatbot instead of a person; listen to it, since we won’t take your calls. But we don’t honor what it says!

    Thanks, Canadian court, for a rare middle finger to the business.

  • Vanth@reddthat.com · 8 months ago

    Air Canada has some really dumb lawyers. They could have quietly paid the guy a couple hundred bucks and moved on. Now they’re all over the news, showing off how callous and idiotic they are.

    • 520@kbin.social · 8 months ago

      Not only that, they set a precedent that will hugely discourage the use of LLM chatbots too. Great for us humans, though.

    • BearOfaTime@lemm.ee · 8 months ago

      Right?

      And the customer service benefit they would’ve gotten from just eating a few hundred dollars.

      But they were being extra greedy, and thinking they could establish precedent… Well they did, just not how they wanted.

    • RedFox@infosec.pub · 8 months ago (edited)

      I bet they make so much money too…

      Overpaid lawyer 1: Fight this or settle?

      Overpaid lawyer 2: Let’s fight this, I have a good feeling about it…

      Overpaid lawyer 1: This won’t set a precedent or anything, right… right…

  • MeatsOfRage@lemmy.world · 8 months ago

    According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot’s misleading information because Air Canada essentially argued that “the chatbot is a separate legal entity that is responsible for its own actions,” a court order said.

    That’s some business-class horseshit right there. Glad they got taken to task over this.

  • jballs@sh.itjust.works · 8 months ago

    Good on the guy for taking screenshots. I’m sure if he hadn’t and claimed the AI Chatbot told him something, the company would have mysteriously lost the logs.

  • AutoTL;DR@lemmings.world (bot) · 8 months ago

    This is the best summary I could come up with:


    On the day Jake Moffatt’s grandmother died, Moffatt immediately visited Air Canada’s website to book a flight from Vancouver to Toronto.

    In reality, Air Canada’s policy explicitly stated that the airline will not provide refunds for bereavement travel after the flight is booked.

    Experts told the Vancouver Sun that Moffatt’s case appeared to be the first time a Canadian company tried to argue that it wasn’t liable for information provided by its chatbot.

    Last March, Air Canada’s chief information officer Mel Crocker told the Globe and Mail that the airline had launched the chatbot as an AI “experiment.”

    “So in the case of a snowstorm, if you have not been issued your new boarding pass yet and you just want to confirm if you have a seat available on another flight, that’s the sort of thing we can easily handle with AI,” Crocker told the Globe and Mail.

    It was worth it, Crocker said, because “the airline believes investing in automation and machine learning technology will lower its expenses” and “fundamentally” create “a better customer experience.”


    The original article contains 906 words, the summary contains 176 words. Saved 81%. I’m a bot and I’m open source!