this post was submitted on 26 Jun 2024 · 380 points (97.3% liked) · Technology
Those mistakes would be easily solved by something that doesn't even need to think. Just add a filter of acceptable orders (see the sketch below), or hire a low-wage human who does not give a shit about the customers' special orders.
In general, AI really needs to set some boundaries. "No" is a perfectly good answer, but it doesn't ever do that, does it?
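A minimal sketch of the "filter of acceptable orders" idea: the LLM only proposes an order, and a dumb rule check decides whether it's sane before anything reaches the kitchen. The menu, the quantity cap, and the `LineItem` type are all made-up assumptions, not any real McDonald's system.

```python
from dataclasses import dataclass

MENU = {"burger", "fries", "ice cream", "soda"}   # items the kiosk may sell (illustrative)
MAX_QTY_PER_ITEM = 10                             # hard cap, so no 200-burger orders
FORBIDDEN_COMBOS = {("bacon", "ice cream")}       # (modifier, item) pairs that never go together

@dataclass
class LineItem:
    item: str
    qty: int
    modifiers: tuple[str, ...] = ()

def validate_order(order: list[LineItem]) -> list[str]:
    """Return a list of human-readable problems; an empty list means the order passes."""
    problems = []
    for line in order:
        if line.item not in MENU:
            problems.append(f"unknown item: {line.item!r}")
        if not 1 <= line.qty <= MAX_QTY_PER_ITEM:
            problems.append(f"{line.item}: quantity {line.qty} is out of range")
        for mod in line.modifiers:
            if (mod, line.item) in FORBIDDEN_COMBOS:
                problems.append(f"{mod!r} is not a valid modifier for {line.item!r}")
    return problems

# The infamous outputs get rejected before they hit the grill:
print(validate_order([LineItem("burger", 200)]))               # quantity out of range
print(validate_order([LineItem("ice cream", 1, ("bacon",))]))  # forbidden modifier
print(validate_order([LineItem("ice cream", 1)]))              # [] -> order is fine
```

This only catches orders that violate explicit rules, which is exactly the limitation the reply below points out.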
That wouldn't address the bulk of the issue, only the most egregious examples of it.
For every funny output like "I asked for 1 ice cream, it's giving me 200 burgers", there are likely tens, hundreds, or thousands of outputs like "I asked for 1 ice cream, it's giving me 1 burger" that sound sensible but are still the same problem.
It's simply the wrong tool for the job. Using LLMs here is like hammering screws, or screwdriving nails. LLMs are a decent tool for tasks you can supervise (not the case here), or where a large number of false positives and negatives is not a big deal (not the case here either).
Sure it does. It won't tell you how to build a bomb or demonstrate explicit biases that have been fine-tuned out of it. The problem is McDonald's isn't an AI company and is probably just using ChatGPT on the backend, and GPT doesn't give a shit about bacon ice cream out of the box.
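A rough illustration of that point: off the shelf the model has no idea what's on the menu, so the integrator has to supply the boundaries, e.g. via a system prompt. This assumes the standard OpenAI Python SDK; the prompt wording and model name are placeholders, not McDonald's actual setup, and prompt-level rules still don't fix the underlying reliability problem.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical guardrail prompt: constrain the model to a fixed menu and tell it to say "no".
SYSTEM_PROMPT = (
    "You are a drive-thru order taker. Only accept items from this menu: "
    "burger, fries, ice cream, soda. Quantities must be between 1 and 10. "
    "If the customer asks for anything else (e.g. bacon on ice cream), "
    "politely refuse and restate the menu."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Bacon on my ice cream please, and 200 burgers."},
    ],
)
print(response.choices[0].message.content)  # should now push back instead of complying
```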
They really should have used a genetic algorithm to optimise their menu items for maximum ~~customer satisfaction~~ profits instead of using an LLM!
The execs do know algorithms other than LLMs exist, right?
EDIT: prob replied to wrong thread
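For what it's worth, the genetic-algorithm quip above would look something like this toy sketch: evolve a fixed-size menu toward maximum profit. The item names, profit-per-sale numbers, and GA parameters are all invented for illustration.

```python
import random

# Hypothetical profit per sale for each candidate item (made-up numbers).
PROFIT = {"burger": 1.20, "fries": 0.90, "ice cream": 0.60,
          "soda": 1.50, "salad": 0.30, "nuggets": 1.00}
ITEMS = list(PROFIT)
MENU_SIZE = 3        # the menu board only fits three items
POP_SIZE = 20
GENERATIONS = 30

def random_menu():
    return tuple(random.sample(ITEMS, MENU_SIZE))

def fitness(menu):
    return sum(PROFIT[item] for item in menu)   # "maximum profits", per the comment

def crossover(a, b):
    # Merge two parent menus (deduplicated) and keep a random MENU_SIZE subset.
    pool = list(dict.fromkeys(a + b))
    random.shuffle(pool)
    return tuple(pool[:MENU_SIZE])

def mutate(menu):
    # Swap one item for a random item not already on the menu.
    menu = list(menu)
    menu[random.randrange(MENU_SIZE)] = random.choice(
        [item for item in ITEMS if item not in menu])
    return tuple(menu)

def evolve():
    population = [random_menu() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        survivors = population[: POP_SIZE // 2]            # keep the fittest half
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(POP_SIZE - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

print(evolve())   # converges on the highest-profit trio, e.g. ('soda', 'burger', 'nuggets')
```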
So, what happens if you order a bomb at the McD?
You get bacon on ice cream.