ah yeah, 3 downvotes (and one of them’s mine) and zero replies of scorn directed towards you (well, one now)
how about you take your bullshit elsewhere
you are downplaying how impossible the requirements were.
oh absolutely! but only out of a sense of shame for being in a career field where a medical device company posting that horseshit compression challenge didn’t immediately prompt a strong backlash and repercussions for neuralink’s ability to attract and retain talent, in lieu of a functioning regulator maybe possibly shutting them down before they can fucking mutilate someone else with this brainfart of an invention
I feel bad for anyone who gets that e-waste implanted into their head and ends up with an implant that absolutely cannot do the things it’s marketed to do, barely does ordinary 90s brain implant shit, stops working very quickly (to the apparent surprise of the people in charge) and will most likely cause injury and severe discomfort to the patients saddled with it
I wish my field had ethics. I’d sleep better if we did.
and since we’ve confirmed you’re a fucking idiot… holy shit you were serious about that “you’re assuming P!=NP” post in the last thread?
“halt your conversation and explain yourself to me, whoever the fuck I am, right this instant” is also an utter shithead move on mastodon
if my real-life friends were cooler, I’d have already arranged a watch party, drinking game, and post-sentencing celebratory cocktails
fuck it, we’re doing cocktails
The SBF
from the orange site thread:
Neural networks are not new, and they're just mathematical systems. LLMs don't think. At all. They're basically glorified autocorrect. What they're good for is generating a lot of natural-sounding text that fools people into thinking there's more going on than there really is.
Obvious question: can Prolog do reasoning?
If your definition of reasoning excludes Prolog, then... I'm not sure what to say!
this is a very specific sneer, but it’s a fucking head trip when you’ve got in-depth knowledge of whichever obscure shit the orange site’s fetishizing at the moment. I like Prolog a lot, and I know it pretty well. it’s intentionally very far from a generalized reasoning engine. in fact, the core inference algorithm and declarative subset of Prolog (aka Datalog) is equivalent to tuple relational calculus; that is, it’s no more expressive than a boring SQL database or an ECS game engine. Prolog itself doesn’t even have the solving power of something like a proof assistant (much less doing anything like thinking); it’s much closer to a dependent type system (which is why a few compilers implement Datalog solvers for type checking).
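to make that equivalence concrete: the classic Datalog "ancestor" program is nothing but a join run to a fixpoint, the kind of thing any SQL engine does with a recursive CTE. a minimal Python sketch (the facts and names here are made up for illustration):

```python
# facts: parent(X, Y) as a plain set of tuples, i.e. a database relation
parent = {("alice", "bob"), ("bob", "carol")}

def ancestor(parent):
    # the two Datalog rules this evaluates:
    #   ancestor(X, Y) :- parent(X, Y).
    #   ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
    # naive bottom-up evaluation: join parent with the current ancestor
    # relation, union in the new tuples, repeat until nothing changes
    anc = set(parent)
    while True:
        new = {(x, z) for (x, y) in parent for (y2, z) in anc if y == y2}
        if new <= anc:
            return anc  # fixpoint reached
        anc |= new

print(sorted(ancestor(parent)))
# [('alice', 'bob'), ('alice', 'carol'), ('bob', 'carol')]
```

no search, no proof, no "thinking": just relational algebra iterated to a fixpoint, which is exactly the tuple-relational-calculus territory a boring database lives in.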
in short, it’s fucking wild to see the same breathless shit from the 80s AI boom about Prolog somehow being an AI language with a bunch of emphasis on the AI, as if it were a fucking thinking program (instead of a cozy language that elegantly combines elements of a database with a simple but useful logic solver) revived and thoughtlessly applied simultaneously to both Prolog and GPT, without any pause to maybe think about how fucking stupid that is
and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet
I just can’t get over how far this is from reality. like fuck, for a lot of these things the controversy is the community covering for the abuser, or evidence coming out that sexual harassment was covered up in the past. depressingly often in tech, the community doesn’t even try to keep it quiet; instead they just loudly endorse the abuser or talk about how there’s nothing they can do.
/r/SneerClub
these people are so terrified of Satan Church that they haven’t read the change of address we stabbed into the front door with a ceremonial dagger like 5 months ago
wait, so the AI is just your fears about capitalism?
Same elementary school logic but I mean this is how a nuke works.
what. no it isn’t
look I don’t want to shock you but that’s basically what they get paid to do. and (perverse) incentives apply - of course goog isn’t just going to spend a couple decabillion then go “oh shit, hmm, we’ve reached the limits of what this can do. okay everyone, pack it in, we’re done with this one!”, they’re gonna keep trying to milk it to make some of those decabillions back. and there’s plenty of useful suckers out there
a lot of corporations involved with AI are doing their damnedest to damage our relationship with the scientific process by releasing as much fluff disguised as research as they can manage, and I really feel like it’s a trick they learned from watching cryptocurrency projects release an interminable number of whitepapers (which, itself, damaged our relationship with and expectations of the engineering process)
that seems about right, and “this anti-cult information source isn’t actually anti-cult, it’s a competing cult you should avoid” is a pretty common form it can take. it’s very convenient for the cultists, because it defuses criticism without engaging with it, by keeping all thought within the framework of the cult. the idea that we all organically stumbled upon Rationalist ideas (or were exposed to them through our friends or industry) and wholesale rejected them, must be eliminated as a possibility. we must have an ulterior motive that can’t be summed up as “hahaha holy shit look at these assholes” — or else the cult has to accept that a lot of people legitimately hold the idea that Rationalists and Zizians and all of Yud’s other ideological children are in fact fucking assholes.