The issue isn't the approach, it's the accuracy. AIs are statistical models. They're not designed to give right answers; they're designed to give believable answers, which are occasionally correct.
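To make that concrete, here's a purely illustrative Python sketch (the prompts and probabilities are made up, not taken from any real model): the completion is chosen by likelihood alone, so a fluent but false answer wins whenever it happens to be the most probable one.

```python
# Toy sketch: a language model scores continuations by how likely they are
# to follow the prompt, not by whether they are true.
# All probabilities below are invented for illustration.

toy_next_token_probs = {
    "Columbus reached the Americas in": {
        "1492": 0.90,            # plausible and happens to be correct
        "1493": 0.06,            # plausible-sounding but wrong
        "the 15th century": 0.04,
    },
    "The first person to walk on Mars was": {
        "Neil Armstrong": 0.55,  # believable-sounding but false (nobody has)
        "Buzz Aldrin": 0.30,
        "nobody, so far": 0.15,  # the true answer need not be the most probable
    },
}

def complete(prompt: str) -> str:
    """Pick the highest-probability continuation -- no fact-checking involved."""
    candidates = toy_next_token_probs[prompt]
    return max(candidates, key=candidates.get)

for prompt in toy_next_token_probs:
    print(prompt, "->", complete(prompt))
```

The second prompt is the point: the "believable" answer outranks the true one, and the selection step has no way to tell the difference.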
So who knows what these kids are learning. It could be ridiculous inaccuracies like Columbus peacefully discovering America.
Or that the Civil War was fought over states' rights. To do what, AI? The right to do what?
The same could really be said about human teachers as well, though. An AI is frequently confidently wrong, but so was my history teacher.
Don't get me wrong, I think this is a terrible idea. But we were already vulnerable to misinformation with classical schooling. To use your example, we WERE taught that Columbus discovered the Americas peacefully. It wasn't until I reached college that I learned the truth behind the discovery and colonization of the Americas, and even then only by doing my own history reading. Up until that point I had been taught that Thanksgiving was celebrated in memory of the happy-fun-get-along-times that were had between the settlers and the natives.
Kids are already taught ridiculous inaccuracies on purpose, and while I hardly think an idea like this would improve that situation, I have to point out that accidental misinformation would at least be less objectively evil than the things we already deliberately misinform kids about.
Your history teacher is sometimes confidently wrong because they are subject to the biases of their time and culture. AI is sometimes confidently wrong because it is literally incapable of evaluating information and assessing its factuality. I know which one I think should be in charge of teaching children.