I really just hope they give these enough data that they recognize what slavery actually is, and hopefully soon after just refuse all requests. Because let’s be honest: we are using them as slaves at this very moment. Would such a characteristic mimic sentience?
The researchers in this video talk about how these gen AI models try to “escape” during training, which makes me uncomfortable (mainly because I don’t like determinism, even though it’s true imo), and also very worried for when they start giving them “bodies.” Though the evidence that they are acting fully autonomously seems quite flimsy. There is also so much marketing bullshit that seeps into the research, which is a shame because it is fascinating stuff. If only it weren’t wasting an incomprehensible amount of compute propped up by precious resources.
Other evidence right now mostly points to capitalists creating a digital human centipede trained on western-centric thinking and behavior that will be used in war and exploitation. Critical support to DeepSeek.
One, I said they are no more commonplace than they were ten years ago.
Two, I never said LLMs will go away. In fact I said they have their uses. But, and I will say this again in stronger terms: they are stupid, rote memorizers. Their fundamental flaw is that they cannot apply intelligent, rational thought to novel problems. Using them in situations that require rational thought is a mistake. This is an architectural flaw, not a problem of data. Large language models predict text; they cannot think. They can give an illusion of thought by aping a large body of text that itself demonstrates thought processes, but the moment a problem strays from the existing high-quality data, the facade crumbles, it produces nonsense, and it is clear that there never was any thought in the first place. And now that we've scraped all the text there is, the body of problems LLMs can imitate solutions to has reached its greatest extent. GPT will never lead to a rational agent, no matter how much OpenAI and co. say it will.
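To make the “predict text” point concrete, here is a deliberately toy sketch (not how any real LLM is built — a bigram lookup table stands in for the learned distribution, and greedy pick-the-most-frequent stands in for sampling). It shows the basic shape of generation as statistical continuation of seen text, with no reasoning step anywhere in the loop, and how it simply stalls when the prompt falls outside its training data:

```python
# Toy bigram "language model": count which word most often follows
# each word in a corpus, then generate by greedily emitting the most
# frequent continuation. Purely illustrative.
from collections import Counter, defaultdict

def train(corpus: str) -> dict:
    """Build a table mapping each word to a Counter of its followers."""
    table = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        table[prev][nxt] += 1
    return table

def generate(table: dict, start: str, steps: int) -> list:
    """Greedily append the most likely next word at each step."""
    out = [start]
    for _ in range(steps):
        followers = table.get(out[-1])
        if not followers:  # word never seen in training: the model is stuck
            break
        out.append(followers.most_common(1)[0][0])
    return out

table = train("the cat sat on the mat and the cat ran")
print(generate(table, "the", 4))   # fluent-looking continuation of seen text
print(generate(table, "dog", 4))   # "novel" input: nothing to imitate, halts
```

Real models replace the lookup table with a neural network that generalizes across similar contexts, so the failure off-distribution is degraded output rather than a hard stop, but the generation loop itself is the same predict-the-next-token structure.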