AI Rule (lemmy.zip)
[-] scott@lemmy.org 24 points 4 days ago

AI does not exist. Large language models are not intelligent; they are language models.

[-] TranscendentalEmpire@lemmy.today 20 points 3 days ago

This can't be true...... Businesses wouldn't reshape their entire portfolios, spending billions of dollars on a technology with limited to no utility. Ridiculous.

Anyways, I got these tulip bulbs to sell, real cheap, just like give me your house or something.

[-] marcos@lemmy.world 9 points 3 days ago

Remember, investment in LLM infrastructure in the US is currently larger than consumer spending.

And they will cut interest rates soon, so expect the number to go up (the investment number, that is, not the value).

[-] Captain_Faraday@programming.dev 2 points 2 days ago

Can confirm, I’m an electrical engineer working on a power substation supplying power to a future datacenter (not sure if it's an AI project, there's more than one). Let’s just say money is no issue; the commissioning schedule and functionality are their priorities.

[-] yozul@beehaw.org 13 points 3 days ago

The phrase AI has never actually meant that though? It's just a machine that can take in information and make decisions. A thermostat is an AI. And not a fancy modern one either. I'm talking about an old bi-metallic strip in a plastic box. That's how the phrase has always been used outside of sci-fi, where it's usually used for superintelligent general intelligences. The problem isn't that people are calling LLMs AI. The problem is that the billionaires who run everything are too stupid to understand the difference.

[-] ozymandias117@lemmy.world 11 points 3 days ago

I would argue that, prior to chatgpt's marketing, AI did mean that.

When talking about specific, non-general techniques, people called them things like ML, etc.

After OpenAI co-opted AI to mean an LLM, people started using AGI to mean what AI used to mean.

[-] brisk@aussie.zone 2 points 3 days ago

That would be a deeply ahistorical argument.

https://en.wikipedia.org/wiki/AI_effect

AI is a very old field, and has always suffered from things being excluded from popsci as soon as they are achievable and commonplace. Path finding, OCR, chess engines and decision trees are all AI applications, as are machine learning and LLMs.

That Wikipedia article has a great line in it, too:

The Bulletin of the Atomic Scientists organization views the AI effect as a worldwide strategic military threat.[4] They point out that it obscures the fact that applications of AI had already found their way into both US and Soviet militaries during the Cold War.[4]

The discipline of Artificial Intelligence was founded in the 50s. Some of the current vibe is probably due to the "Second AI winter" of the 90s, the last time calling things AI was dangerous to your funding.

[-] ignotum@lemmy.world 5 points 3 days ago

To common people perhaps, but never in the field itself; much simpler and dumber systems than LLMs were still called AI.

[-] RandomVideos@programming.dev 3 points 3 days ago

Does that mean that enemy AIs that choose a random position near them and find the shortest path to it are smarter than chatgpt? They have been called AI for longer than I've been playing games with enemies. (A rough sketch of that kind of enemy logic is below.)

You can also disprove the argument by just using DuckDuckGo and filtering to results from before OpenAI existed: https://duckduckgo.com/?q=%22AI%22&df=1990-01-01..2015-01-01&t=fpas&ia=web
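
Not from the thread, just an illustration: a minimal sketch of the kind of enemy logic described above, assuming a simple grid map. It picks a random walkable cell nearby and then finds the shortest path to it with a breadth-first search; the helper names and the grid are made up for the example.

```python
import random
from collections import deque

def pick_wander_target(pos, radius, walkable):
    """Pick a random walkable cell near the enemy (hypothetical helper)."""
    x, y = pos
    candidates = [
        (x + dx, y + dy)
        for dx in range(-radius, radius + 1)
        for dy in range(-radius, radius + 1)
        if (x + dx, y + dy) in walkable
    ]
    return random.choice(candidates) if candidates else pos

def shortest_path(start, goal, walkable):
    """Breadth-first search over 4-connected grid cells; returns start..goal."""
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == goal:
            break
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in walkable and nxt not in came_from:
                came_from[nxt] = (x, y)
                frontier.append(nxt)
    if goal not in came_from:
        return []  # no route to the target
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return list(reversed(path))

# Tiny open 5x5 map: the "enemy" wanders toward a random nearby cell.
walkable = {(x, y) for x in range(5) for y in range(5)}
enemy = (0, 0)
target = pick_wander_target(enemy, radius=2, walkable=walkable)
print(target, shortest_path(enemy, target, walkable))
```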

[-] Klear@lemmy.world 1 points 3 days ago

Doom enemies had AI 30 years ago.

[-] Honytawk@feddit.nl 1 points 3 days ago

But those weren't generated using machine learning, were they?

[-] Klear@lemmy.world 1 points 3 days ago* (last edited 3 days ago)

So? I don't see how that's relevant to the point that "AI" has been used for very simple decision algorithms for a long time, and it makes no sense not to use it for LLMs too.

[-] Xavienth@lemmygrad.ml 1 points 3 days ago

People have called NPCs in video games "AI" for like, decades.

[-] Bronzebeard@lemmy.zip 1 points 3 days ago

A thermostat is an algorithm, maybe. It can be done mechanically. That's not much of a decision: "is the number bigger?"

[-] yozul@beehaw.org 0 points 2 days ago

Literally everything we have ever made and called an AI is an algorithm. Just because we've made algorithms for making bigger, more complicated algorithms we don't understand doesn't mean it's actually anything fundamentally different. Look at input, run numbers, give output. That's all there ever has been. That's how thermostats work, and it's also how LLMs work. It's only gotten more complicated. It has never actually changed.
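
For illustration only: a minimal sketch of the "look at input, run numbers, give output" point, using a simple bang-bang thermostat rule. The function name and thresholds are assumptions, not anything from the thread.

```python
def thermostat(current_temp_c, setpoint_c, hysteresis_c=0.5):
    """Bang-bang control: the entire 'decision' is a comparison with a dead band."""
    if current_temp_c < setpoint_c - hysteresis_c:
        return "heat_on"
    if current_temp_c > setpoint_c + hysteresis_c:
        return "heat_off"
    return "hold"

print(thermostat(19.2, 21.0))  # prints "heat_on"
```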

[-] Angry_Autist@lemmy.world 0 points 3 days ago

It's pretty funny you think LLMs are the only implementation of machine learning

this post was submitted on 06 Aug 2025
653 points (96.2% liked)
