Over half of all tech industry workers view AI as overrated
(www.techspot.com)
This abstract thinking… is pattern recognition. Patterns of behavior, patterns of series of actions, patterns of photons, patterns of patterns.
And there is, I think, only one concept of consciousness: that it is another layer of pattern recognition. A pattern recognizer that looks at the patterns of your own mind.
I’m unfortunately unsure how else to convey this because it seems so obvious to me. I’d need to take quite some time to figure out how to explain it any better.
Please do, but I don't understand why you believe that it changes things. Pattern recognition is the modus operandi of a brain, or rather of the connection between your senses and your brain. So it could perhaps be seen more as the way "brain data" is stored, its data type. The peculiarity is how that data type is used.
This may turn philosophical, but suppose you had the perfect pattern recognition apparatus. It would see one pattern, the ultimate pattern of how everything is connected. Does that make it intelligent?
For it to be called intelligent, you would want to be able to ask it about specific problems (much smaller chunks of the whole thing). While it may still be confined to that data type throughout, the scope of its intelligence would be defined by the way it uses the data.
See, I like this question, “what is intelligence?”
I feel way too many people are so happy to make claims about what is or isn’t intelligent without ever attempting to define intelligence.
Honestly, I'm not sure what constitutes "intelligence"; the best example I can come up with is the human brain. But when I try to differentiate the brain from a computer, I just keep seeing all the similarities. The differences that do exist seem like things a computer could reasonably be expected to replicate… eventually.
Anyway, I've been working off the idea that anything that reacts to stimulus is intelligent. It's all a matter of degree and type. I'm talking bacteria, bugs, humans, plants, maybe even planets.
I've had exactly this discussion with a friend recently. I share your opinion; he shared what seems to be the view of the majority here. I just don't see what the qualitative difference between the brain and a data-based AI would be. It almost seems as though people have trouble accepting that they are no more than biological machines. Like there must be something that makes them special, that gives them some sort of "soul", even in a non-religious, non-spiritual way. Some qualitative difference between them and the computer. I don't think there necessarily is one.

Look at how many things people get wrong. Look at how bad we are at simple logic sometimes. We have a better sense of some things, like plausibility, because we have a different set of experiences rooted in our physical lives. I think it's entirely possible that we will be able to create robots that are more similar to human beings than we'd like them to be. I even think it's possible that they would have qualia. I just don't see why not.
I know that there is a debate about machine-learning AI versus symbolic AI. I'm not an expert, to be fair, but I have not seen any convincing explanation as to why only symbolic AI would be "true" AI, even though many people seem to believe that.
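Roughly, the contrast could be sketched like this (a made-up toy spam-filter example, not any real system or library): the symbolic approach writes the rule down by hand, while the machine-learning approach derives a rule of the same shape from labelled examples.

```python
# Toy contrast between symbolic and learned rules. Purely illustrative.

# Symbolic approach: a human writes the rule explicitly.
def is_spam_symbolic(subject: str) -> bool:
    # Hand-coded knowledge: certain words signal spam.
    banned = {"winner", "free", "prize"}
    return any(word in subject.lower() for word in banned)

# Machine-learning approach: the "rule" is learned from examples.
def train_spam_scorer(examples):
    # examples: list of (subject, is_spam) pairs.
    # Count how often each word appears in ham vs. spam.
    scores = {}
    for subject, label in examples:
        for word in subject.lower().split():
            ham, spam = scores.get(word, (0, 0))
            scores[word] = (ham + (not label), spam + label)
    return scores

def is_spam_learned(subject: str, scores) -> bool:
    # Classify by whether the words were seen more often in spam than in ham.
    spam_votes = ham_votes = 0
    for word in subject.lower().split():
        ham, spam = scores.get(word, (0, 0))
        spam_votes += spam
        ham_votes += ham
    return spam_votes > ham_votes

if __name__ == "__main__":
    training = [
        ("claim your free prize now", True),
        ("you are a winner", True),
        ("meeting notes attached", False),
        ("lunch tomorrow?", False),
    ]
    scores = train_spam_scorer(training)
    print(is_spam_symbolic("free prize inside"))         # True, rule written by hand
    print(is_spam_learned("free prize inside", scores))  # True, rule learned from data
```

In both cases it's just pattern matching over the input; the only difference is where the pattern comes from. Which is part of why I don't see why one would count as "true" AI and the other not.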