[-] jsomae@lemmy.ml 1 points 2 days ago

I think you're talking about accelerationism. IMO, the main problem with unrestrained AI growth is that if AI turns out to be as good as the hype says it is, then we'll all be dead before the revolution occurs.

[-] LovableSidekick@lemmy.world 1 points 1 day ago

The trick is to judge things on their own merit and not on the hype around them.

[-] jsomae@lemmy.ml 1 points 1 day ago

In that case, you should know that Geoff Hinton (the guy whose lab kicked off the whole AI revolution last decade) quit Google in order to warn about the existential risk of AI. He believes there's at least a 10% chance that it will kill us all within 30 years. Ilya Sutskever, his former student and co-founder of OpenAI, believes similarly, which is why he quit OpenAI and founded Safe Superintelligence (yes, that basic HTML document really is their homepage) to help solve the alignment problem.

You can also find popular rationalist AI pundits like gwern, acx, yudkowsky, etc. voicing similar concerns, with P(doom) estimates ranging from low to laughably high.

[-] LovableSidekick@lemmy.world 1 points 19 hours ago* (last edited 19 hours ago)

Yes, I know: the robot apocalypse people seem so desperate to be afraid of is always just around the corner. Geoff Hinton, while a definite pioneer in AI, didn't kick anything off; he was one of a large number of people working on it, and one of a small number predicting armageddon.

[-] jsomae@lemmy.ml 1 points 19 hours ago

The reason it's always just around the corner is that there is very strong evidence we're approaching the singularity. Why do you sound sarcastic saying this? What probability would you assign to an AI apocalypse in the next three decades?

Geoff Hinton absolutely kicked things off. Everybody else had given up on neural nets for image recognition, but his lab's 2012 breakthrough renewed interest throughout the world. We wouldn't have deepdreaming slugdogs without him.

It should not be surprising that most people in the field of AI are not predicting armageddon, since doing so would be harmful to their careers. Hinton is also not predicting the apocalypse -- he's giving it a 10-20% chance, which is technically a prediction that it won't happen.

[-] LovableSidekick@lemmy.world 1 points 42 minutes ago

I'm sarcastic because I would assign the same probability as a zombie apocalypse. At the nuts-and-bolts level, I think they're both technically flawed Hollywood fantasies.

What does an AI apocalypse even look like to you? Computers launching nuclear missiles or what? Shutting down power grids?

this post was submitted on 16 May 2025
534 points (94.6% liked)

Memes
