[-] BigMuffin69@awful.systems 6 points 4 months ago

I will find someone who I consider better than me in relevant ways, and have them provide the genetic material. I think that it would be immoral not to, and that it is impossible not to think this way after thinking seriously about it.

Corporate needs you to find the difference between this^ and "our local cult leader is the sun god reborn; it's every woman's duty to carry his seed. It is immoral to deny his divine will."

[-] BigMuffin69@awful.systems 6 points 7 months ago* (last edited 7 months ago)

These kids really think if they pick up some trailer park rock candy they can become Paul Erdős. Hate to say it lads, he was simply built different.

[-] BigMuffin69@awful.systems 6 points 8 months ago* (last edited 8 months ago)

it's a compliment boo 😘 <3 u habibi

[-] BigMuffin69@awful.systems 7 points 9 months ago* (last edited 9 months ago)

Thanks for the info. I had never heard of this one before & didn't know I was shit posting on behalf of Thiel's ilk ;_;

[-] BigMuffin69@awful.systems 6 points 10 months ago* (last edited 10 months ago)

Not prying! Thankful to say, none of my coworkers have ever brought up ye olde basilisk, the closest anyone has ever gotten has been jokes about the LLMs taking over, but never too seriously.

No, I don't find the acausal robot god stuff too weird, b/c we already had Pascal's wager. But holy shit, people actually full-throatedly believing it to the point that they are having panic attacks, wtf. Like:

  1. Full human body simulation -> my brother-in-law is a computational chemist; they spend huge amounts of compute modeling simple few-atom systems. To build a complete human simulation, you'd be computing every force interaction for ~10^28 atoms. This is ludicrous.

  2. The chuckle fucks who are proposing this are suggesting that once the robot god can sim you (which, again, doubt), it's going to be able to use that simulation of you to model your decisions and optimize against you.
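For a sanity check on the ~10^28 figure in point 1, a back-of-envelope sketch (assuming a 70 kg body treated as pure water, which is only an order-of-magnitude approximation):

```python
# Back-of-envelope atom count for a human body.
# Assumptions (illustrative only): 70 kg body treated as pure water
# (H2O, ~18 g/mol, 3 atoms per molecule). Real bodies are ~60% water,
# so this is order-of-magnitude, not exact.
AVOGADRO = 6.022e23
body_mass_g = 70_000
moles_h2o = body_mass_g / 18.0
atoms = 3 * moles_h2o * AVOGADRO
print(f"{atoms:.1e}")  # ~7e27, i.e. order 10^28
```

So "approximately 10^28 atoms" holds up, and that's before you even get to the force interactions between them.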

So we have an optimization problem like:

min_{x,y} f(x)  s.t.  y ∈ argmin { g(x,y) : (x,y) ∈ X × Y }

where x and f(x) are the decision variable and objective function 🐍 is trying to minimize, and y and g(x,y) are the decision variable and objective of me, the simulated human, who has his own goals (don't get turned into paperclips).

This is a bilevel optimization problem, and it's very, very nasty to solve. Even in the nicest case possible, where somehow f and g are convex functions and X and Y are convex sets (which is an insane ask considering y and g entail a complete human sim), this problem is provably NP-hard.
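To see the leader/follower structure concretely, here's a toy bilevel problem brute-forced on a grid (f and g here are made-up convex toys chosen purely for illustration, nothing to do with simulating anyone):

```python
import numpy as np

# Toy bilevel problem: leader picks x, follower responds with
# y*(x) = argmin_y g(x, y), leader wants to minimize f(x, y*(x)).
def f(x, y):  # leader's (🐍's) objective -- hypothetical toy
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

def g(x, y):  # follower's objective -- hypothetical toy
    return (y - x) ** 2

xs = np.linspace(-5, 5, 201)
ys = np.linspace(-5, 5, 201)

best = None
for x in xs:
    # Solve the inner (follower) problem from scratch for each leader choice.
    y_star = ys[np.argmin([g(x, y) for y in ys])]
    val = f(x, y_star)
    if best is None or val < best[0]:
        best = (val, x, y_star)

print(best)  # follower tracks y ~ x, so the leader settles near x = y = 1.5
```

Even in this nicest-possible convex instance, the follower problem has to be re-solved for every candidate leader decision, and grid enumeration blows up exponentially with dimension; that's the gentle intuition behind the hardness, not a proof of it.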

Basically, to build the acausal god, first you need a computer larger than the known universe, and even that probably isn't sufficient.

Weird note: while I was in academia, I actually did some work on training an ANN to model the constraint that y is a minimizer of a follower problem, by using the ANN as a proxy for g(x,·) and then encoding a representation of the trained network into a single-level optimization problem... we got some nice results for some special low-dimensional problems where we had lots of data 🦍 🦍 🦍 🦍 🦍

[-] BigMuffin69@awful.systems 7 points 10 months ago

AH THE TSP MOVIE IS SO FUN :)

btw, as a shill for big MIP, I am compelled to share this site which has solutions for real world TSPs!

https://www.math.uwaterloo.ca/tsp/world/
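For a feel of why those real-world instances (tens of thousands of cities) need serious MIP machinery, here's a toy brute-force TSP with made-up coordinates; exact enumeration checks (n-1)!/2 distinct tours, which is hopeless beyond roughly a dozen cities:

```python
from itertools import permutations
import math

# Five made-up city coordinates, purely for illustration.
cities = [(0, 0), (1, 5), (4, 1), (6, 4), (3, 3)]

def tour_length(order):
    """Total length of the closed tour visiting cities in the given order."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Fix city 0 as the start so rotations of the same tour aren't recounted.
best = min(permutations(range(1, len(cities))),
           key=lambda rest: tour_length((0,) + rest))
print((0,) + best, tour_length((0,) + best))  # ~18.82 for these points
```

Solvers like Concorde get those giant instances via LP relaxations and cutting planes instead of enumeration, which is what makes the linked results so fun.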

[-] BigMuffin69@awful.systems 7 points 1 year ago

Like a model trained on its own outputs, Geoff has drunk his own Kool-Aid and completely decohered.

[-] BigMuffin69@awful.systems 6 points 1 year ago

doing the lorde's work^

[-] BigMuffin69@awful.systems 7 points 1 year ago

I'd never leave Donkey Kong, unless I found someone with 33.3 % (repeating of course) more funk, some sort of funky Kong if you will.

