Enshittification
What is enshittification?
The phenomenon of online platforms gradually degrading the quality of their services, often by promoting advertisements and sponsored content, in order to increase profits. (Cory Doctorow, 2022, extracted from Wiktionary)
The lifecycle of Big Internet
We discuss how predatory big tech platforms live and die by luring people in and then decaying for profit.
Embrace, extend and extinguish
We also discuss how naturally open technologies like the Fediverse can be susceptible to corporate takeovers, rugpulls and subsequent enshittification.
They don't need to know how computers work if Chromebooks are the only thing in existence.
They also don't need to know how to deal with Python dependencies if they can paste their code into an AI and ask "why isn't tkinter working?"
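For what it's worth, that tkinter case is exactly the kind of thing pasting code into an AI won't teach you: tkinter ships with Python's standard library but binds to the operating system's Tcl/Tk packages, so pip can't fix a missing tkinter. A minimal sketch, assuming a Debian/Ubuntu-style system (the package name varies by distro):

```python
# Minimal sketch of the "why isn't tkinter working?" situation.
# tkinter is part of the standard library, but it needs the OS's Tcl/Tk
# packages (e.g. python3-tk on Debian/Ubuntu), so "pip install tkinter"
# is not the fix.
try:
    import tkinter
except ImportError:
    raise SystemExit(
        "tkinter is missing: install your OS's Tk package "
        "(e.g. python3-tk on Debian/Ubuntu)"
    )

# This also needs a running display; Tk() raises TclError on a headless box.
root = tkinter.Tk()
tkinter.Label(root, text="tkinter works").pack()
root.mainloop()
```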
Craftsmen said the same thing about the industrial revolution.
You know how you end up knowing even less about computers? When you can't afford one at all.
That's why they only know what Chromebooks offer; they have them in school.
My kid's school doesn't have any kind of computer instruction, no computer lab, it's all Chromebooks.
Is it your genuine belief that your schools would have computer instruction and big easily accessible labs if not for Chromebooks?
I remember "teach kids computers" as an educational panacea during the 80s/90s. It made Micheal Dell very rich, but often at the expense of the biology, chemistry, and physics lab programs. "Nobody knows how to use a blowtorch / dissect an animal / build an engine anymore" was a refrain I heard all the through my high school years.
Has eliminating computer labs brought back the old 70s-era Space Race science programs? Or are we still just boiling away every ounce of the public system that costs money (except athletics, of course)?
That's honestly technology in a nutshell. Technological development leads to further abstraction, leading to less low level knowledge. It's always been this way. Is AI an abstraction step too far, or are we just the next generation of old man yelling at cloud?
AI has value, but first a reality check. Most of the time it produces code that doesn't work, and even when it does, the result is usually of terrible quality: inconsistent style, missing checks, poor security, etc. That's because there is no "thinking" in AI; it's a crank handle that runs training data through some RNG to shit out an answer.
If you know what you're doing it can still be a useful tool. I use it a lot but only after carefully reading what it says and understanding the many times it is wrong.
If you don't know how to program, everything might look fine. Except when it crashes, or fails on corner cases, or follows bad practice, or drags in bloated third-party libs, or runs out of memory on large datasets, or whatever. So don't trust anybody who blindly uses it or claims to be a "vibe" programmer, since that amounts to an admission of incompetence.
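To make the "looks fine until it doesn't" point concrete, here's a hypothetical before/after. The file-counting task and the function names are invented for illustration, not taken from any real AI output:

```python
# Hypothetical illustration of the failure modes described above.

# The kind of code that "looks fine" on a small test file.
def count_lines_naive(path):
    data = open(path).read()          # loads the whole file: blows up on large inputs
    return len(data.splitlines())     # also no error handling, and the file is never closed

# The same task with the corner cases handled.
def count_lines(path):
    count = 0
    try:
        with open(path, "rb") as f:   # binary mode avoids surprise decode errors
            for _ in f:               # streams line by line instead of loading everything
                count += 1
    except OSError as err:            # missing file, bad permissions, etc.
        raise RuntimeError(f"could not read {path}") from err
    return count
```

Both versions give the same answer on the happy path, which is exactly why the naive one passes a vibe check.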