
A chatbot used by Air Canada hallucinated an inaccurate bereavement discount policy, and a customer who flew based on this information sued the company over being misled. The small claims court sided with the deceived customer, finding that the chatbot was acting as an official agent of Air Canada, and that there was no reason a customer should have to double-check information from one part of the Air Canada website against other parts of the same website.

top 29 comments
[-] Infamousblt@hexbear.net 48 points 9 months ago

This was inevitable and will instantly kill AI chatbots. I tried to explain this to my marketing team when they were all excited about an AI chatbot on our company website. Wonder if this will change their tune

[-] viva_la_juche@hexbear.net 33 points 9 months ago

will instantly kill AI chatbots

inshallah

[-] 420blazeit69@hexbear.net 8 points 9 months ago

This case may have cost the airline a few grand. Sure, you'll get a few losses like this (and many more situations where the customer just eats it), but if the cost savings of the chatbot are big enough...

[-] carpoftruth@hexbear.net 42 points 9 months ago

According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot's misleading information because Air Canada essentially argued that "the chatbot is a separate legal entity that is responsible for its own actions," a court order said.

Prepare for more of that, applied to weaponized drones

[-] LeopardShepherd@hexbear.net 23 points 9 months ago

Oh that wedding? The drone just did that sorry

[-] carpoftruth@hexbear.net 15 points 9 months ago

Actually the human AI helper making $1/hr has legal responsibility for that strike

[-] Bloobish@hexbear.net 4 points 9 months ago

Is this how we also give AI chatbots personhood by accident?

[-] Frank@hexbear.net 37 points 9 months ago

It's not a "hallucination," you dorks, it's a random number generator that you used to replace labor.

[-] sexywheat@hexbear.net 10 points 9 months ago

Yeah, why is it a "hallucination" when the AI just makes shit up, but when a person does it they're either lying or just plain wrong?

[-] Frank@hexbear.net 10 points 9 months ago

I didn't lie on my tax returns, it was a hallucination due to poorly curated training material.

[-] SerLava@hexbear.net 5 points 9 months ago

Because the person knows and the AI is dumb as dirt

[-] SerLava@hexbear.net 7 points 9 months ago

I like to call it a hallucination, because while yes the thing isn't smart enough to experience thoughts, it really gets at how absolutely unreliable these things are. People are talking to the thing and taking it seriously, and it's just watching the pink dragons circle around

[-] Beaver@hexbear.net 3 points 9 months ago

Idea: how about we replace all our typists with a bunch of monkeys? They'll eventually type the right thing!

[-] PoY@lemmygrad.ml 35 points 9 months ago

Air Canada is a bunch of fucking thieves... my partner's parents had a flight cancelled and then rebooked, then cancelled and rebooked again 2 days later... Air Canada tried to send them $300 as compensation when the law clearly states that if it's delayed over 9 hours, they owe $1000. When her parents disputed the $300 compensation, they said it was too late, they'd already sent the $300. Now they have to go petition some air travel board or other. On top of that, when they finally rebooked the flight, they didn't add the extra bag from the original ticket and wouldn't let her dad board the plane, so my partner called the airline and gave them her credit card to add the goddamned bag again. Then when all was said and done, they had charged both of them for the same bag. When my partner called to dispute this, they said they only see one charge for the bag and tough luck.

Fucking scam artists.

[-] sexywheat@hexbear.net 10 points 9 months ago

IIRC they were rated the worst airline in all of North America a few years back, and it was totally well deserved.

What privatisation of state industries/services does to a mf.

[-] PKMKII@hexbear.net 34 points 9 months ago

I remember a state recently passed a law barring lawyers from using AI programs to generate legal documents, and this right here is why. It removes the possibility of lawyers appealing to "well, it's not our fault the document is wrong, the AI did it!"

[-] OutrageousHairdo@hexbear.net 27 points 9 months ago

I heard about someone doing that from Leonard French. Some old boomer thought the AI could actually search for court cases, got tricked into citing a bunch of non-existent caselaw, and ended up in a lot of trouble.

[-] D61@hexbear.net 10 points 9 months ago

I'm wishing so hard it was a Sovereign Citizen...

[-] RyanGosling@hexbear.net 9 points 9 months ago

The OG chatbot

[-] DamarcusArt@lemmygrad.ml 21 points 9 months ago

Oh please, I really hope we get more stuff like this. Nothing will kill this fad faster than companies realising they've been swindled by techbros threatening them with FOMO, and that this algorithm bullshit won't actually do anything useful for them.

[-] ProletarianDictator@hexbear.net 7 points 9 months ago

They'll kill the ability to sue for damages before sacrificing the cash cow

[-] M500@lemmy.ml 4 points 9 months ago

Then I’m not going to talk to them. If the information they give me may be incorrect and not binding, then what’s the point?

[-] ProletarianDictator@hexbear.net 1 points 8 months ago

then what’s the point?

wooing investors who bust a nut at the idea of inserting LLMs into shit that doesn't need them.

[-] NephewAlphaBravo@hexbear.net 17 points 9 months ago

I extremely approve of describing everything AI does as hallucinations, dreaming, etc

[-] SerLava@hexbear.net 9 points 9 months ago

HAHAHAHAHAH fucking amazing
