and now my life has been reduced from being perhaps the best engineer in the world
Yeah... I think we dodged a bullet when Linus didn't let his filesystem in.
Was thinking: you know it's bad when Linus thinks you're too toxic...
Which Linus are you referring to, exactly? Because the toxic one is the YouTuber and the comment seems to refer to Torvalds.
Torvalds literally had to take a hiatus because of how toxic he'd gotten. Luckily he didn't deny it when it was brought to his attention and he's gotten better now.
Does it do anything that isn't in response to a human's prompting? No? Then it can't be conscious. Consciousness requires having a sense of self, which implies having needs and desires that one acts to fulfill without needing prompting. Even a bacterium is more conscious than these things.
Nowhere in your unquoted definition do you state that the ‘sense of self’ must be present at all times. Humans can switch between conscious and unconscious states. When they’re unconscious they don’t have needs and desires.
tell that to the stains on my sheets, buddy.
fully conscious according to any test I can think of
There's no such thing as an actual test for consciousness in machines. We can do tests on animals to see if their sensory experience includes self-awareness, but we're already operating on the assumption that they have feelings and sensory experience because they have a brain and nervous system like us, and they're all directly related to us (as all organisms are). But that's totally different from designing a machine which mimics (or predicts/auto-completes) our observable behavior and then assuming that it "doesn't like" something or does anything "for fun."
What sucks is that some idiots are going to start falling for this. And eventually software will be given human rights, which actually means that the software's owners will have extra rights compared to the rest of us.
Yeah, that's as far as I've been able to go, thinking about this. To me, it's clear that non-human animals are conscious. But we treat them like raw materials, for reasons which fall apart immediately in debate. AI might not be conscious the way a pig or a duck is. But it seems more conscious than a cup of sand or a box of crayons.

"Seems" being the operative word here. But children think that Muppets are conscious. People lose their temper at self-checkout machines. The faithful of different religions attribute will and power to all sorts of idols and other inanimate objects, like supposed fragments of a specific cross. The most famous work of fantasy fiction is about a malevolent piece of jewelry. Humans are very good at attributing consciousness to non-conscious entities. We are easily fooled in this respect.
Even if some putative AI could be conscious, an LLM is just something that looks up words in a database with probability weights attached. This technology cannot lead to consciousness.
Are you talking about The Pearl, by chance? It's one I haven't read, yet, but if you're talking about another story, I'd like to read that, too!
I was referring to The Lord of the Rings.