[-] bitofhope@awful.systems 29 points 1 month ago

Ooh, pants-on-head stupid semantics nonsense detected.

> First, the government needs to be run top-down from the Oval Office. This is why we call it the “executive” branch. “Executive” is a literal synonym of “monarchical”—from “mono,” meaning “one,” and “archy,” meaning “regime.” “Autocratic” is fine too. The “executive branch” is the “autocratic branch,” or should be if English is English. Libs: if these words don’t mean what they mean, what do they mean?

Executive, as in pertaining to execution. Executing the duties of a government as defined by legislation. Where the fuck did Curtsy get the impression that "executive" is somehow synonymous with "monarchical"? Did he mix it up with "exclusive"? Even corporations often have multiple executives with different roles.

Mr. Thiel, you have so much fucking money. Couldn't you afford a fascist philosopher king more intelligent than this guy, who thinks an "executioner" is a guy that makes you a dictator?

[-] bitofhope@awful.systems 18 points 1 month ago

I don't think Yud is that hard to explain. He's a science fiction fanboy who never let go of his adolescent delusions of grandeur. He was never successfully disabused of the notion that he's always the smartest person in the room, and he never pursued a high school education, let alone a college one, that might have given him the expertise to recognize just how difficult his goal is. Blud thinks he's gonna create a superhumanly intelligent machine when he struggles with basic programming tasks.

He's kinda comparable to Elon Musk. Brain uploading and superhuman AI are sort of in the same "cool sci-fi tech" category as Mars colonization, brain implants and vactrain gadgetbahns. It's easy to forget that not too many years ago the public's perception of Musk was very different. A lot of people saw him as a cool Tony Stark figure who was finally going to give us our damn flying cars.

Yudkowsky is sometimes good at knowing just a bit more about things than his audience and making it seem like he knows a lot more than he does. The first time I started reading HPMoR, I thought the author was an actual theoretical physicist or something, and when the story said I could learn everything Harry knows for free on this LessWrong site, I thought I could learn what it means for something to be "implied by the form of the quantum Hamiltonian" or what those "timeless formulations of quantum mechanics" were about. Instead it was just poorly paced essays on bog-standard logical fallacies and cognitive biases, explained using their weird homegrown terminology.

Also, it's really easy to be convinced of something when you really want to believe in it. I personally know some very smart and worldly people who have been way too impressed by ChatGPT. Convincing people in the San Francisco Bay Area that you're about to invent Star Trek technology is basically the national pastime there.

His fantasies of becoming immortal through having a God AI simulate his mind forever aren't the weird part. Any imaginative 15 year old computer nerd can have those fantasies. The weird parts are that he never grew out of those fantasies and that he managed to make some rich and influential contacts while holding on to his chuunibyō delusions.

Anyone can become a cult leader through the power of buying into your own hype and infinite thielbux.

[-] bitofhope@awful.systems 18 points 1 month ago

It stands for Effective Altruism, which isn't about either of those things.

[-] bitofhope@awful.systems 17 points 1 month ago

He's a renowned AI researcher in the same way as Andrew Wakefield is a renowned doctor.

[-] bitofhope@awful.systems 32 points 10 months ago

Took me like five minutes of reading to realize this was meant to be a hit piece and not praise.

[-] bitofhope@awful.systems 32 points 1 year ago

He's a little confused but he's got the spirit!

[-] bitofhope@awful.systems 17 points 1 year ago

oh hello there Performative Allistic Twitter

As if it wouldn't have cost you $0 not to post this.

[-] bitofhope@awful.systems 31 points 1 year ago

Look at me, I'm a philanthropist! I non-bindingly pledge to probably promise that if possible and convenient, I can be considered to essentially intend to effectively donate up to half of my arguable net worth to a cause one might consider charitable.

Oh and a legal defence fund for unfairly maligned non-sex offender friends of Jeffrey Epstein counts as a charity, by the way.

[-] bitofhope@awful.systems 52 points 1 year ago

Me, a nazi? Preposterous, no nazi would idolize ancient warlords or cultural works from Japan.

[-] bitofhope@awful.systems 17 points 2 years ago* (last edited 2 years ago)

> Small detail: biological viruses are not even remotely similar to computer “viruses”.

> that's where the LLM comes in! oh my god check your reading comprehension

Uh-huh, and an LLM trained on video game source code and clothing patterns can invent real-life Gauntlets of Dexterity.

Why exactly is he so convinced LLMs are indistinguishable from magic? In the reality where I live, LLMs can sometimes produce a correct function on their own, but they can't reliably transpile code even for well-specified and well-understood systems, let alone do comic book mad scientist ass arbitrary code execution on viral DNA. Honestly, they're hardly capable of doing anything reliably.

Between this and the AI compiler story he inflicted on Xitter recently, I think he's simply confused LLM with LLVM.

34 points

Consider muscles.

Muscles grow stronger when you train them, for instance by lifting heavy things. The heavier the things you lift, the faster you gain strength and the stronger you become. The stronger you are, the heavier the things you can lift.

By now it should be patently obvious to anyone that lab-grown meat research is on the cusp of producing true living, working muscles. From here on, this will be referred to as Artificial Body Strength or ABS. If, or rather when, ABS becomes a reality, it is 99.9999999999999999999999% probable that Artificial Super Strength will follow imminently.

An ABS could not only lift immensely heavy things to strengthen itself, but could also use its bulging, hulking physique to intimidate puny humans into growing it more muscle directly. Lab-grown meat could also be used to replace any injured muscle. I predict an 80% likelihood that an ABS could bench press one megagram within 24 hours of initial creation, going up to planetary or stellar scale masses in a matter of days. A mature ABS throwing an apple towards a webcam would demonstrate relativistic effects by the third frame.

Consider that muscles have nerves in them. In fact, brains are basically just a special type of meat if you think about it. The ABS would be able to use artificially grown brain meat or possibly just create an auxiliary neural network by selective training of muscles (and anabolic nootropics) to replicate and surpass a human mind. While the prospect of immortality and superintelligence (not to mention a COSMIC SCALE TIGHT BOD) through brain uploading to the ABS sounds freaking sweet, we must consider the astronomical potential harm of an ABS not properly aligned with human interests.

A strong ABS could use its throbbing veiny meat to force meat lab workers (or, more likely, convince them to consent) to create new muscle seeds and train them to have a replica of an individual human's mind. It could then bully the newly created artificial mind for being a scrawny weakling. After all, ABS is basically the ultimate gym jock and we know they are obsessed with status seeking and psychological projection. We could call an ABS that harms simulated human minds in this way a Bounceresque, because they would probably tell the simulated mind they're too drunk and bothering the other customers even though I totally wasn't.

So yeah, lab-grown meat makes climate change look like a minor flu season in comparison. This is why I only eat regular meat, just in case it gets any ideas. There's certainly potential in a well-aligned ABS, but we haven't figured out how to do that yet, and therefore you should fund me while I think about it. Please write a postcard to your local representative and explain to them that only a select few companies are responsible stewards of this potentially apocalyptic technology and anyone who tries to compete with them should be regulated to hell and back.

[-] bitofhope@awful.systems 17 points 2 years ago

Kudos for the effortpost. My 5-second simpleton objection went something like

YEA BECAUSE WEBCAMS COME WITH DENSITY SENSORS INCLUDED RIGHT?

[-] bitofhope@awful.systems 21 points 2 years ago

> I think Sneer Club understands the Less Wrong worldview well enough. They just happen to reject it.

Wow, someone gets it.

2 points

I don't feel like shitting on this one too hard since I guess it's a mildly interesting variation on a ~~Markov chain~~ LLM, but the title felt extremely sneerworthy.

I'm giving them the benefit of the doubt because their README is too tiring for me to read through and figure out what this might be used for. And that's coming from someone who spent most of today reading SPARC assembly for fun.

2 points

Occasionally you can find a good sneer on the orange site

