Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Please don't post about US Politics. If you need to do this, try !politicaldiscussion@lemmy.world
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy: please phrase all post titles in the form of a proper question ending with a '?'.
3) No spam
Please do not flood the community with nonsense. Suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?'-type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions, check our partnered communities list or use the search function.
Reminder: The terms of service apply here too.
Sure, the guy was a murderer and somewhat nuts, but this quote of his has always rung true with me. This is, in a nutshell, the future: "But I am suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What I do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines' decisions. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide."
Not exactly an original thought, though. This has been a staple of SF writers for decades; E. M. Forster's "The Machine Stops" (1909) is a fine example.
And he certainly wasn't the last to use it: that's essentially the plot of WALL-E.
That 1909 story reached the *same* conclusion: complete dependence, with the machine enforcing it. Except, in the story, the repair machinery itself is malfunctioning.
Speaking of which, the other day I found this video, which might be useful both to those who've read the story and to those who don't have the time. Besides an analysis, it includes some fine SF artwork.
"The Machine Stops by E.M. Forster - Short Story Analysis" https://www.youtube.com/watch?v=k2XXkauk0eU
I remember watching the Netflix series about him. I was shocked because I actually agreed with a lot of what he had to say about the society we're living in right now. We're on the brink of AI becoming a force that is no longer controllable. We're seeing deepfakes and other shit that will ruin people's lives. And we're moving toward a society where everything can be replaced with code.
I myself grew up alongside these changes, and it's insane how much our world has changed over the last 30 years. It's shocking.
I think the next hundred years will bring more changes to humanity than the last 10,000 years have. We have devised methods to gaslight ourselves; we're moving into a world where the concept of truth is malleable and unknowable. The machines will get smarter, the rich will get richer. I'll be 66 years old in about a month. I have many more yesterdays than tomorrows. I'm not looking forward to leaving this world, but I'm not particularly interested in being a participant in what comes next.