Or they go to adtech
(lemmy.world)
A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.
Created as an evolution of White People Twitter and other tweet-capture subreddits.
The robot dystopia will not be caused by evil AI enslaving humanity.
No matter how advanced or how self-aware, AI will lack the ambition that is part of humanity, part of us due to our evolutionary history.
An AI will never have an opinion, only logical conclusions and directives that it is required to fulfil as efficiently as possible. The directives, however, are programmed by the humans who control these robots.
Humans DO have ambitions and opinions, and they have the ability to use AI to enslave other humans. Human history is filled with powerful, ambitious humans enslaving everyone else.
The robot dystopia is therefore a corporate dystopia.
I always roll my eyes when people invoke Skynet and Terminator whenever something uncanny is shown off. No, it's not the machines I'm worried about.
Have you met people with opinions? A lot of their opinions consist of preprogrammed responses that you could train a bot to regurgitate.
The ambition isn't the issue. It's a question of power imbalance.
The Paperclip Maximizing Algorithm doesn't have an innate desire to destroy the world, merely a mandate to turn everything into paperclips. And if the algorithm has enough resources at its disposal, it will pursue this quixotic campaign without regard for any kind of long term sensible result.
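The thought experiment above boils down to an objective function with exactly one term. A toy sketch, assuming purely hypothetical resources and conversion rates (none of this models any real AI system): the agent greedily converts everything it can reach into paperclips, because nothing else appears in its objective.

```python
# Toy sketch of the paperclip-maximizer thought experiment. All names and
# numbers are illustrative assumptions, not a real system.

world = {"spring_steel": 10, "factories": 3, "everything_else": 100}

def paperclips_from(resource, amount):
    # Assumed per-unit conversion rates, purely illustrative.
    rates = {"spring_steel": 50, "factories": 500, "everything_else": 1}
    return rates[resource] * amount

def maximize_paperclips(world):
    total = 0
    # Greedy loop: consume whichever remaining resource yields the most
    # paperclips per unit. Note there is no term for preserving anything.
    while any(world.values()):
        resource = max((r for r in world if world[r] > 0),
                       key=lambda r: paperclips_from(r, 1))
        total += paperclips_from(resource, world[resource])
        world[resource] = 0  # resource fully consumed; the objective never cared
    return total

print(maximize_paperclips(world))  # → 2100, with the whole world consumed
```

The point of the sketch is the shape of the loop, not the numbers: "destroy the world" is never written anywhere, it simply falls out of maximizing a single quantity with no competing terms.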
There is some argument that one is a consequence of the other. It is, in some sense, the humans who are being programmed to maximize paperclips. The real Roko's Basilisk isn't some sinister robot brain, but a social mythology that leads us to work in the factories that make the paperclips, because we've convinced ourselves this will allow us to climb the Paperclip Company Corporate Ladder until we don't have to make these damned things anymore.
Someone screwed up if a paperclip maximiser is given the equipment to take apart worlds, rather than a supply of spring steel.
That's the beauty of it. The maximizer would understand that creating a machine that breaks apart worlds would maximize the paperclip output. It will be a "natural" progression.
We're not even close to artificial general intelligence, so I'd like to see if you have anything to substantiate this claim.
(Not saying it's far fetched, though, just that it seems silly to be so sure at this point in time.)