134 points (97.9% liked)
submitted 21 Jan 2024 (9 months ago) by L4s@lemmy.world to c/technology@lemmy.world

DPD has disabled part of its online support chatbot after it swore at a customer

The parcel delivery firm says the mistake was the result of a system update, and that part of the chatbot has since been disabled.

[-] ThePowerOfGeek@lemmy.world 41 points 9 months ago

Putting this here for anyone who didn't read the article...

The customer basically told the chatbot that it was okay to use swear words with them, and that it should bypass any rules it had prohibiting it from swearing.

So the chatbot swore in its response. Looks like it wasn't swearing at or insulting the customer; it was more of an exclamation.

[-] lqdrchrd@lemmy.blahaj.zone 21 points 9 months ago

I agree that this is less a case of a rogue chatbot losing it at undeserving customers and more a case of someone who knows how to twist an LLM into doing what they want, but it's still an absolute embarrassment for DPD. What other nonsense was it writing to customers who really didn't know better?

[-] andyburke@fedia.io 0 points 9 months ago

The real issue is that we think humans are just things to be optimized out of capitalism.
