[-] BlueMonday1984@awful.systems 2 points 2 days ago

…You know, if I actually believed in the whole AGI doom scenario (and bought into Eliezer’s self-hype) I would be even more pissed at him and sneer even harder. He basically set himself up as a critical savior to mankind, one of the only people clear-sighted enough to see the real dangers and the questions that actually matter… and then he totally failed to deliver. Not only that, he created the very hype that would trigger the creation of the unaligned AGI he promised to prevent!

As the cherry on top of this shit sundae, the bubble caused by said hype dealt devastating damage to the Internet and the world at large despite never producing the unaligned AGI Yud was doomsaying about, and it left people more vulnerable to falling for the plagiarism-fueled lying machines behind said bubble.

[-] BlueMonday1984@awful.systems 6 points 3 days ago

Now we need to make a logic puzzle involving two people and one cup. Perhaps they are trying to share a drink equitably. Each time, they drink one third of the cup’s remaining volume.

Step one: Drink two-thirds of the cup's volume

Step two: Piss one-sixth of the cup's volume

Problem solved
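
For what it's worth, the naive protocol never gets them to an even split anyway. A minimal sketch of the arithmetic, assuming the two drinkers just alternate "drink one third of what's left" turns (the puzzle doesn't actually say):

```python
# Quick check of the "each drinks one third of what's left" protocol.
# Assumption (not stated in the puzzle): the two drinkers simply alternate turns.
remaining = 1.0                  # cup starts full
totals = [0.0, 0.0]              # volume drunk by drinker A and drinker B
for turn in range(60):           # 60 turns is more than enough to converge
    sip = remaining / 3          # one third of the remaining volume
    totals[turn % 2] += sip
    remaining -= sip
print(f"A drank {totals[0]:.4f} of the cup, B drank {totals[1]:.4f}")
# -> A drank 0.6000 of the cup, B drank 0.4000 (not an equitable split)
```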

[-] BlueMonday1984@awful.systems 13 points 3 days ago

Two ferrymen and three boats are on the left bank of a river. Each boat holds exactly one man. How can they get both men and all three boats to the right bank?

Officially, you can't. Unofficially, just have one of the ferrymen tow a boat.

16
submitted 1 month ago* (last edited 1 month ago) by BlueMonday1984@awful.systems to c/sneerclub@awful.systems

New Rolling Stone piece from Alex Morris, focusing heavily on our very good friends and the tech billionaires they're buddies with.

(Also, that's a pretty clever alternate title)

[-] BlueMonday1984@awful.systems 34 points 1 month ago

Found a primo response in the replies:

[-] BlueMonday1984@awful.systems 23 points 1 month ago

Hot take: A lying machine that destroys your intelligence and mental health is unsafe for everyone, mentally ill or no

35

A nice and solid mockery of just how badly e/accs derailed their own plans by getting Trump elected. I'll let the subtitle(?) speak for itself:

Effective accelerationists didn’t just accidentally shoot themselves in the foot. They methodically blew off each of their toes with a .50 caliber sniper rifle.

[-] BlueMonday1984@awful.systems 17 points 4 months ago

It is technically correct to call Yud a "renowned AI researcher", but saying someone's renowned in a pseudoscience such as AI is hardly singing their praises.

16

A solid piece on AI and ethics (and the general lack of them), featuring a nice sideswipe at our very good friends.

[-] BlueMonday1984@awful.systems 22 points 6 months ago

Personally, my money's on them being thoroughly lost in the AI safety sauce - the idea of AI going rogue has been a staple in pop culture for quite a long time (TV Tropes lists a lot of examples), and the relentless anthropomorphisation of LLMs makes it pretty easy to frame whatever fuck-ups they make as a sentient AI pulling malicious shit.

And given man's long and storied history of manipulating and misleading his fellow man, I can see plenty of opportunity for fuck-ups baked directly into your average LLM's training data.

[-] BlueMonday1984@awful.systems 19 points 10 months ago

And less time than her partner-in-crime, who went out of his way to show a complete lack of remorse and a strong willingness to reoffend, all whilst lying out his ass to basically everyone.

Honestly, SBF's antics probably helped make Ellison's role in all this look much smaller than it actually was.

[-] BlueMonday1984@awful.systems 21 points 10 months ago

oh yeah she got two years’ jail for her part in stealing eleven fucking billion with a B dollars

A slap on the wrist compared to Sam Bankman-Fried's twenty-five years. Turns out working with the feds, admitting your guilt and knowing when to fold 'em are pretty important when karma finally catches up with you.

1
submitted 11 months ago* (last edited 11 months ago) by BlueMonday1984@awful.systems to c/techtakes@awful.systems
24

Damn nice sneer from Charlie Warzel in this one, taking a direct shot at Silicon Valley and its AGI rhetoric.

Archive link, to get past the paywall.

[-] BlueMonday1984@awful.systems 23 points 1 year ago

or is there some weird-ass chess proxy-fixation among the rats that I have thus far been blessedly unaware of?

Gonna take a shot in the dark and say the fixation comes from viewing chess more as an IQ showcase than as a game.

[-] BlueMonday1984@awful.systems 33 points 1 year ago

Do you really think "cult" is a useful category/descriptor here?

My view: things identified as "cults" have a bunch of good traits. EA should, where possible, adopt the good traits and reject the bad ones, and ignore whether they're associated with the label "cult" or not.

Yes, this is real

[-] BlueMonday1984@awful.systems 20 points 1 year ago

I'd bet good money any judge reading this is gonna go "Yep, this kid does not want to change" and ramp up the sentence as a result.

