Fate, it seems, is not without a sense of irony
(literature.cafe)
1. Be civil
No trolling, bigotry or other insulting / annoying behaviour
2. No politics
This is a non-politics community. For political memes please go to !politicalmemes@lemmy.world
3. No recent reposts
Check for reposts when posting a meme, you can only repost after 1 month
4. No bots
No bots without the express approval of the mods or the admins
5. No Spam/Ads/AI Slop
No advertisements or spam. This is an instance rule and the only way to live. We also consider AI slop to be spam in this community, and it is subject to removal.
A collection of some classic Lemmy memes for your enjoyment
When I heard that line I was like "Yeah, sure. We'll never have AI in my lifetime" and you know what? I was right.
What I wasn't expecting was for a bunch of tech bros to create an advanced chatbot and announce "Behold! We have created AI, let's have it do all of our thinking for us!" while the chatbot spits out buggy code and suggests mixing glue into your pizza sauce.
Unless you just died or are about to, you can't really confidently make that statement.
There's no technical reason to think we won't in the next ~20-50 years. We may not, and there may turn out to be a technical reason why we can't, but the big hurdles so far were the amount of compute needed and the fact that computers couldn't handle fuzzy pattern matching. Modern AI has effectively solved the pattern-matching problem, and current large models like ChatGPT already model more "neurons" than are in the human brain, let alone the power that will be available to them in 30 years.
There's no technical reason to think we will in the next ~20-50 years, either.
Was it? I thought it was always that we haven't quite figured out what thinking really is.
Other than that nobody has any idea how to go about it? The things called "AI" today are not precursors to AGI. The search for strong AI is still nowhere close to any breakthroughs.
I work in the gaming industry and every week I receive emails about how AI is gonna revolutionize my job, and get sent to time-wasting training about how to use Figma AI or other shit like that because it's the best thing ever according to HR... and it never is, obviously.
At best, it's gonna make middle-management jobs easier, but for devs like me, as long as the "AI" stays out of our engines and sticks to the equivalent of cooperative vision boards, it does nothing for me. Not once has it turned out to be actually useful when I've tried it. It's mediocre at best, and I can't believe there are game devs who actually try to code with it. Can't wait to see that hot garbage hit the market.
I was surprised how poorly they still did as chatbots vs ELIZA after over 50 years of potential progress, and how revered they are in certain contexts.
https://www.researchgate.net/publication/375117569_Does_GPT-4_Pass_the_Turing_Test
Given that the baseline is 66%, the GPT-4 results are fairly impressive.
You won't have general-purpose true AI until it can actually think and reason, and LLMs will never do that. At most they would be a way of interacting with an AI.
AI is an umbrella term that covers many things we've already had for a long time, including things like machine learning. This is not a new definition of AI, it's always been this definition.
You’re not going to achieve AI on classical computers; "AI" is simply a rebranding of machine learning, like how 5G was advertised as bringing a futuristic utopia back in 2020, only for 4K to end up being a premium feature behind paid subscriptions from 𝕏 (Twitter) to YouTube.
Quantum computers do exist, but they’re far from fitting in the palm of your hand.
Machine learning is a subcategory of AI, which is exactly what I mean by my previous comment.
You're confusing AI and AGI: https://en.wikipedia.org/wiki/AI_effect
AGI is what people mean, when they say "AI doesn't exist": https://en.wikipedia.org/wiki/Artificial_general_intelligence
While AI is a program that can do a task associated with human intelligence: https://en.wikipedia.org/wiki/Artificial_intelligence
AI is not supposed to be an artificial human being. AI just does a task that people associate with human intelligence (or at least did, before the definition of intelligence got readjusted once machines could do it).
A bot that plays chess is an AI.
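To make that concrete, here's a toy sketch of exactly that kind of program (my own illustration, not taken from any particular engine): a tic-tac-toe player that picks moves by plain minimax search. No learning, no language, just brute-force lookahead over the game tree, and it still falls squarely under the classic definition of AI as a program doing a task associated with human intelligence.

# Illustrative sketch only: a toy game-playing "AI" using plain minimax search.
# Game-tree search like this has been a textbook example of AI since the
# field began, long before anyone was talking about LLMs or AGI.

def winner(board):
    """Return 'X' or 'O' if someone has won, else None. board is a 9-cell list."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, best_move) where X maximizes and O minimizes."""
    won = winner(board)
    if won == 'X':
        return 1, None
    if won == 'O':
        return -1, None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None  # draw
    best_score = -2 if player == 'X' else 2
    best_move = None
    for move in moves:
        board[move] = player
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[move] = ' '  # undo the move
        if (player == 'X' and score > best_score) or (player == 'O' and score < best_score):
            best_score, best_move = score, move
    return best_score, best_move

board = [' '] * 9
score, move = minimax(board, 'X')
print(f"Best opening move for X: square {move} (game value with perfect play: {score})")

Classical chess engines are the same basic idea scaled up with better pruning and evaluation functions; nobody called that "not really AI" until recently.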
It used to be that AI was AI, and then when "AI" was co-opted by the stupid, they had to come up with AGI.
No, it just wasn't mainstream to talk about because the average person didn't encounter AI
People playing video games have been differentiating between AI and AGI for over 50 years, though, considering enemies in video games are all AI
Ya know, I can remember AI being used even in hella old games for enemies.
Which had even less to do with AI than LLMs do.
I genuinely do not understand these very obviously biased comments. By the very definition of AI, we have had it for decades, and suddenly people say we don't have it? I don't get it. Do you hate LLMs so much you want to change the entire definition of AI (and move it under AGI or something)? This feels unhinged and disconnected from reality, with bias so strong it looks like delusion.
What is delusional is calling a token generator intelligent. These programs don't know what the input is, nor do they understand what they put out. They "know" only which token is likely to follow a given sequence of tokens, based on previously supplied data.
They understand nothing. They generate nothing new. They don't think. They are not intelligent.
They are very cool, very impressive and quite useful. But intelligent? Pffffffh
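If anyone wants the "token generator" idea spelled out, here's a deliberately crude sketch (a toy word-level bigram model; real LLMs are neural networks over subword tokens, so the names and structure here are purely illustrative). It produces plausible-looking continuations purely from counts of what followed what, with no model of meaning anywhere.

# Illustrative sketch only: a toy "token generator" that picks the next word
# based on nothing but counts of what followed it in the training text.
import random
from collections import defaultdict, Counter

training_text = "the cat sat on the mat . the dog sat on the rug ."
tokens = training_text.split()

# Count which token follows which.
follows = defaultdict(Counter)
for current, nxt in zip(tokens, tokens[1:]):
    follows[current][nxt] += 1

def next_token(current):
    """Sample a likely successor for `current` from the observed counts."""
    counts = follows[current]
    if not counts:
        return "."
    choices, weights = zip(*counts.items())
    return random.choices(choices, weights=weights)[0]

# Generate a short sequence: plausible-looking output, zero understanding.
word = "the"
output = [word]
for _ in range(8):
    word = next_token(word)
    output.append(word)
print(" ".join(output))

Scale the counting up to billions of parameters over subword tokens and you get far better output, but the task being performed is still "predict a likely continuation."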
Why is it so hard for you to understand the word "artificial"? It seems like you even avoid it. Just like artificial everything, especially weed and flavours, it's not the real thing and was never meant to be the real thing, and yet you're essentially an old man yelling at a cloud because something artificial doesn't act like real human intelligence.
Hey, real quick, what has the thing controlling the enemies in video games been called for 50 years and would you equally call that delusional, or are you just specifically butthurt at LLMs?