it's crazy that we treat the most unbelievably advanced machines most people will ever have direct access to as completely disposable
Well that's certainly part of why they are disposable, it's much harder to repair and reuse
ive never considered that but you’re very right
Outside of gaming, the requirement for new hardware literally only seems to exist to cover the endless abstraction slop that businesses use to push out more features faster. Meanwhile the product I'm using hasn't changed at all in the last 10 years and continues to have the same bugs, yet now requires an NVMe SSD and a 500 core 800 watt CPU to deliver the same functionality as 10 years ago.
I can't think of a single way in which literally ANY of the software that I use has changed meaningfully since the 90s, other than getting bigger and slower.
Actually, games aren't even immune from this. Unreal Engine is prime abstraction slop, because shitting out crap matters more than making anything good. God forbid anyone get the time to make their own engine; no, just ram a barely working open world game into an engine that barely supports rendering more than 3 people on screen without shitting the bed.
Yes, Mr. Nvidia sir, I will buy your 1000 watt electric heater to run this mid ass game that hasn't innovated on shit all since the PS2 era. Ray tracing, you say? Wow, this really enhances my experience of point-to-point, repetitive, collect-20-boar-arses MMO ass gameplay ass shit, with AI that's still as rudimentary as the guards in Hitman 2: Silent Assassin.
Browsers have gone from document renderers to virtual machine app platforms over the last 30 years. It's a significant change, even if mostly not good.
And yet my use of the internet hasn't changed: GameFAQs, blogs, forums and YouTube. All stuff that rendered just fine before I needed to be shipped 900GB of React bollocks. My desire for web apps is in the negative; they almost always feel worse than native applications, but those are a dead art.
I can't point to anything browsers have done that has improved my life, and nothing that has been done couldn't already be done better elsewhere.
Full agreement from me.
I can't think of a single way in which literally ANY of the software that I use has changed meaningfully since the 90s, other than getting bigger and slower.
Lotus Notes got basically killed, so that's good
New Outlook is faster and more reliable, at the cost of crippled or removed basic functionality.
We have good videoconferencing now which we definitely couldn't do before. Miss me with that Teams bullshit, but hit me with that group family Facetime.
It's very deliberate and forced; the software-side justifications for requiring mid-to-high tier hardware produced in the last 5-6 years are incredibly weak.
Entirely related, hardware from said time period also suffers increasingly from planned obsolescence and expensive upselling.
Videogames and their communities are by far the worst offenders. Had a friend see me playing a game on my laptop at 720p resolution the other day and they pretended to be offended by it. "that's unplayable!" Like, says who? Tech influencers paid to set your expectations too high?
ive been this friend tbh, and for good reason. a friend of mine uses an old m1 macbook to play everything and she will sometimes complain about input lag. i tried her setup and no joke, it had about 50-80ms of extra input delay when running rhythm games through ARM wine. it was actually comical and a bit silly.
This is way worse with gamers. To a lot of them, a few years old is ancient, only good for playing some unoptimized slop made with Unreal Engine on the lowest settings.
ue5 is the bane of gaming to me. there’s not a single ue5 game that runs well. the entire engine is nothing but diminishing gains. “here we made lighting look slightly better but you need 16gb of VRAM and upscaling to make it work even with this setting off anyway”
i see people on reddit talking about gpus like anything that isn't the top top top of the line is “mid tier” and shouldn't expect good frame rates at high settings.
I hate UE5, but so far STALKER 2 is the only game I haven't been able to get running in an acceptable manner. Stutters and all that shit persist in UE5 no matter what I do, but with STALKER, even if I accept a (barely) consistent 30fps, it's still a blurry, ugly mess. Granted, I know they patched it a couple times since launch, but as far as I'm concerned the game isn't even coming out for another year. Maybe by then they'll get the fuzz off the trees.
thats not one i've tried personally but yeah thats most of what i hear about it. more than “finally a new stalker game! its awesome!” its just people complaining that it runs like shit because the engine sucks. apparently if you can get beyond that there's fun to be had tho
I put about 20 hours into it and then there were like 4 game breaking bugs that there was no workaround for. Not reinstalling until the game is, you know. Finished.
Ngl, if you have computer hardware 10-15 years old, be prepared to replace it anyway; that's about the longest I've kept a GPU alive. Though I do otherwise agree. I have an AMD RX 480 that is a perfectly serviceable card, except that AMD dropped support for it several years ago. Moreover, it's been 10 years and AMD's drivers are still ass. Every year I go 'oh, this will be the year of the AMD GPU' and every year I am shockingly wrong. My Nvidia RTX 3060 is starting to show its age a bit. Not too much, but enough that I am considering an upgrade. I wouldn't care so much if I could use the 480 for compute, but nope.
The 30 series cards still support 32-bit PhysX, so if you get a newer one you can still use your old one to run the Mirror's Edge glass effects, etc. The Nvidia drivers even support splitting certain loads off onto specific cards…
Won't fix the people saying this stuff, but in case you weren't aware, there are some Linux distros designed specifically to cater to older hardware. antiX Linux, for example: based on Debian stable, and it goes the extra mile to support older hardware and 32-bit systems.
That's a good shout, but I'm not even talking about me specifically. I've seen so many posts of people asking for tech support for things like how to get old closed-source Nvidia drivers working on the latest Linux kernel, or what to do to make Linux somewhat usable on a 1GB RAM Intel Atom notebook (in both of these examples I'm pretty sure it was the person's daily computer and they couldn't afford better). Now I've got tech bros cheering on Firefox dropping 32-bit support.
Like if people don't know how to help, fine, but having 10 people just tell them to upgrade isn't really helpful.
To play devil's advocate, it can be really hard to maintain an ever increasing list of hardware to support, especially when new hardware has new features that then require special care to handle between devices and further segment feature sets.
Supporting every device that came before also requires devs to have said hardware to test with, lest they release a buggy build and get more complaints. If devs are already strapped for time on the new and currently supported devices, spending any more time to ensure compatibility with a 10-15 year old piece of hardware that has a couple dozen active users is probably off the table.
See 32bit support for Linux. I get why they're dropping support but I also don't like it. In Linux, there's basically a parallel set of libraries for i386. These libraries need maintenance and testing to ensure they're not creating vulnerabilities. As the kernel grows, it essentially doubles any work to ensure things build and behave correctly for old CPUs, CPUs that may lack a lot of hardware features that have come to be expected today.
People also like to say "oh, but xyz runs just as well/badly on my 15-20 year old computer, why do we need NEW???" Power. It's power use. The amount of compute you get for the amount of power with modern chipsets is incomparable. Using old computers for menial tasks is essentially leaving a toaster oven running to heat your room.
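For a rough sense of scale, here's a back-of-the-envelope sketch; every wattage and price figure below is an assumption picked for illustration, not a measurement of any particular hardware:

```python
# Rough comparison of running a light 24/7 workload on an aging desktop
# vs a modern low-power machine. All figures are illustrative assumptions.
OLD_DESKTOP_WATTS = 120   # assumed average draw of an old tower at light load
MODERN_MINI_WATTS = 12    # assumed average draw of a recent low-power box
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.30      # assumed electricity price

for name, watts in [("old desktop", OLD_DESKTOP_WATTS),
                    ("modern mini PC", MODERN_MINI_WATTS)]:
    kwh = watts * HOURS_PER_YEAR / 1000
    print(f"{name}: {kwh:.0f} kWh/year, ~{kwh * PRICE_PER_KWH:.0f} per year")
```

Under those assumptions the old box burns roughly ten times the energy for the same background tasks, which is the whole point.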
Yeah, this really seems like a problem inherent to capitalist production. New hardware doesn't need to release every year. We could cut back on a lot of waste if R&D took longer and was geared toward longevity and repairing rather than replacing. Unfortunately, all the systems of production were set up when Moore's Law was still in full swing, so instead we're left with an overcapacity of production while at the same time approaching the floor of component miniaturization as Newton gives way to Schrödinger.
I agree with you completely.
I do want to add that I'd wager this overall scenario would be present regardless of how long the release schedules are, especially since people want to continue using their 10-15 year old hardware.
It's somewhat frustrating that many useful features are also in opposition to repair-ability. Most performance improvements we've seen in the last 10 years basically require components to get closer and closer. Soldered RAM isn't just the musings of a madman at Apple, it's the only way to control noise on the wires to get the speeds the hardware is capable of. Interchangeable parts are nice but every modular component adds a failure point.
Add in things like, say, a level of water resistance: it makes the device way more durable for daily life at the expense of difficult repairs. Or security: it's great that grandma won't get her phone hacked, but now we can't run our own code.
There's also a bit of an internet-commie brainworm that I'm still trying to pin down. Like you said, "We could cut back on a lot of waste if R&D took longer and was geared toward longevity and repairing rather than replacing", what does this actually look like? I think it's at odds with how most of us use technology. Do we want somewhat delicate, fully modular, bulky devices? What does it mean to be repairable if the entire main-board is a unit? If you need to make each component of the main board modular, the device will quadruple in size, making it overall worse to use than the small disposable device(more expensive too). The interconnects will wear out, making modules behave in unexpected ways. The level of effort required to support a dozen interconnected modules for years and years would be substantial. Not only that, the level of expertise to repair things at a level below a plug-and-play module is far higher.
I had a small wireless dongle die on me after about a year of use. It basically stopped connecting to my phone. I noticed that it would connect for a short period before disconnecting. When the device hadn't been used in a while, this period was longer. Due to my own repair experience, I knew this was a product of a cracked solder joint and expansion due to heat. I brought it into work and held a reflow gun to it for 5 or so minutes, heating up the PCB enough to reflow whatever bad joint was causing the issue. I looked this up after the fact, and found another person had found the same solution, basically told people to put the whole device in the oven for a period and hope it fixed it. People commented in disbelief that this seemingly magical ritual could revive such a black-hole of a device. They couldn't comprehend how someone could come across this solution. Of course it wasn't magic, it was fairly simple if you have encountered thermal expansion, cracked solder joints, and misbehaving electronics in the past. The point of this story is that, had this not been the solution, the device would have been e-waste, more so than it already is because even the most simple repair was magic to people unconcerned with technology. I've dedicated my life to technology and engineering, and even then I'm basically an idiot when it comes to a lot of things. Most people are idiots at most things and good at a couple things.
I understand people are upset when their technology stops working. It stops working because the people who are experts in it, don't have the time or funding to keep it working, and the people who want it to work don't understand how it works in the first place, because if they did and really wanted their old hardware working, they'd develop a driver for it. People do this all the time, and it takes months if not years of effort from a handful of contributors to even begin to match what a well funded team of experts can get done when it's their day job.
There's a fundamental disconnect between those who use and purchase technology and those who make it. The ones who make it are experts in tech and idiots at most things, and the ones who use it are likely experts in many things but idiots with tech, or simply have a day job they need to survive and don't have the time to reverse engineer these systems.
Even in an ideal communist state, resources need allocation, and the allocation would likely still trend towards supporting the latest version of things that currently have the tooling and resources, whose power consumption is lower and speed higher. We might get longer lifespans out of things if we don't require constant improvement to serve more ads, collect more data, and serve HD videos of people falling off things, but new technology will always come around, making the old obsolete.
There's also a bit of an internet-commie brainworm that I'm still trying to pin down.
Your post is great & I love it in its entirety, but I think this part kinda boils down to people thinking everything is capitalism's fault, when sometimes things are just exacerbated by capitalism. As you acknowledge later, there are real challenges with developing and implementing technology; it's not that it couldn't be done in a more responsible way, but I think many people not in technical fields get jaded and stop believing this.
I have (for the last 24 hours) heard so many people say linux is dropping 32 bit support.
Some (most) distros have dropped support for 32 bit, and firefox stopped providing a 32 bit version for linux, but the kernel still very much supports 32bit. I believe there recently were some talks of cutting out some niche functionality for certain 32bit processors, but I've not heard anything about actually gutting 32bit support from the kernel.
Idk, I'm probably too invested in this. Internets got me going nuts. I should prolly touch grass.
FWIW this isn't happening right now. There is a gradual reorganization where 32-bit i686 packages are being isolated; Wine's WoW64 mode allows running 32-bit applications on an otherwise 64-bit-only system, for example. Fedora will probably be the first to do this, but the change is contingent on Steam (which is a 32-bit application on GNU/Linux and Windows).
There will always be specialized distributions that cater to 32-bit systems, though. Also, Debian is probably never going to drop i686 until it physically stops compiling.
I think there was some news recently about maintainers wanting to call 32bit support over. That might be why a lot of folks are talking about it.
Yep, it's never just "upgrade your gpu". That's one bit. But for the new GPU to work, I've gotta get a new mobo, and the new mobo needs new RAM and a new CPU. Basically replacing everything but the damn case, and sometimes that's gotta go too.
It's honestly a miracle any of this shit worked in the first place.
I recently got a deal on RAM, which I later realized required a new motherboard, which I later realized had Wi-Fi 7, which required forcefully updating my fine Windows 10 install to the yucky Windows 11. Awful all around. Linux was fine, but goddamn, I bought the RAM to improve some games.
It really is the intersection of capitalism that makes this so difficult. Really many computer users in the global south have much older hardware than is available in the imperial core, but a majority of development happens in the imperial core in terms of software maintenance and so there's a purposeful disconnect between manufacturers and users.
Libre operating systems have been a lot better at this, but we see the shortage of labor and expertise a lot and when hardware companies make bone headed moves like nvidia restricting their newer drivers to newer cards there's really nothing we can do on our side besides keep packaging legacy drivers.
It doesn't help that newer computer products have become more hostile to user repairs and have built-in lifespans with soldered SSDs.
32-bit computers are going to have to go away at some point, though, due to the 2038 problem (unless the intention is to never connect those devices to the Internet anymore). So at the very least it's a legitimate hardware limitation.
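For reference, a minimal sketch of where the 2038 limit comes from (a signed 32-bit seconds-since-1970 counter running out of range):

```python
# The Year 2038 problem in one snippet: a signed 32-bit time_t counts seconds
# since 1970-01-01 UTC and tops out at 2**31 - 1.
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
INT32_MAX = 2**31 - 1

print(EPOCH + timedelta(seconds=INT32_MAX))   # 2038-01-19 03:14:07+00:00
# One second later the counter wraps around to -2**31, which a signed
# 32-bit time_t reads as a date back in 1901:
print(EPOCH + timedelta(seconds=-2**31))      # 1901-12-13 20:45:52+00:00
```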
IMO, this is less of a problem with the attitude of people and more of a problem of the logic of the whole computing sector. The idea that old parts will be deprecated in the future isn't just accepted, it's desired. The sector's goal is the fastest possible development, and development does happen. A graphics card from 15 years ago is, for all intents and purposes, a piece of electronic junk, 10 times slower and 5 times less energy efficient. This has to do with the extremely fast development of computer parts, which is the paradigm of the field.
In order to maintain 10-15 year-old parts, you'd have to stop or slow the development speed of the sector. There are arguments to be made for that, especially from an environmental point of view, but as the sector is, you simply cannot expect there to be support for something working 10 times slower than contemporary counterparts.
10-15 years
Try 10-15 months
God, is it really that bad these days?
I should start posting screenshots of unreal engine games @640x480 70% resolution scale that I've tweaked into ultra potato mode, only to be able to play them at 24fps. 😅 I enjoy them too!
At the end of the day, it takes a lot of effort to keep hardware device drivers up-to-date to work with newer software. Of course hardware should be designed for long term support, and at least have enough documentation to make that possible. But it's also hard when there aren't many people in the "community" who even have certain hardware to test out. And it takes work to even support any given hardware device in the first place, especially without documentation. And if you're talking about graphics cards specifically, they're really complex compared to most hardware devices, and have changed a lot over the last 10-20 years. But with Linux, there just isn't the necessary labor to support so much stuff.
I understand at some point, certain hardware just has to be abandoned because next to nobody is using it.
My issue is more that on a cultural level people are just a tad too quick to tell somebody they are SOL and that they should buy new hardware.
My issue is more that on a cultural level people are just a tad too quick to tell somebody they are SOL and that they should buy new hardware.
In many cases it’s just the truth though. It’s the reality that Silicon Valley created for us. Yes it fucking sucks, but it’s capitalism of course it does.
I'm still using an AMD 390X, which admittedly is almost 10 years old, but it still runs most almost-modern stuff pretty well. But I have started to run into a few games where it just straight up won't play at all because it doesn't support DX12 or something (looking at you, Marvel Rivals). Stopped getting updated drivers like 5 years back lol.
I just play chess in the browser, and I have this old HP laptop with an Intel Core i5 vPro (according to the sticker). Apart from the dead battery, the laptop is still usable; it also has partial passkey and experimental UEFI support. Aside from the passkey part, I'm using the experimental UEFI with no problems on Linux and BSD, and I think I'm going to use this laptop for 5 more years. Idk what obsolescence you're talking about. Blender, OBS and Godot work on my system, while on Windows, Blender and OBS showed errors due to hardware requirements.
I think its great that your hardware works for you! My point was more concerning people who encounter an issue with older hardware (ie a driver that is no longer included in the kernel, people on 32bit systems, people on low memory systems, etc) being told to "just upgrade".
Hardware just gets better really quickly; GPUs have been roughly doubling in power every two generations. Why wouldn't developers take advantage of that to make games that look amazing?
Honestly though, it's mostly just a gaming problem. I have, among other computers, a 16 year old laptop that does perfectly fine for everything that isn't running modern videogames (and even then I can still run older titles). Sure, 4GB RAM means I sometimes have to do things one at a time, but old hardware works surprisingly well. x86_64 came out 22 years ago, so if anything it's surprising that so much is only dropping 32 bit support now.
Phones, on the other hand, are a much more abysmal landscape than computers. A lot of Androids will only get like 3 years of updates, and even the longer supported phones like iPhones and Google Pixels will get a mere 6-8 years of updates.
I'm impressed your laptop is in such great shape. I had a decently high end Dell in 2010 that by 2018 could barely browse the web, overheated without a fan underneath, and had a battery that wouldn't hold a charge.
Tbh it was pretty high end back in its day, and it's had its battery replaced.
I would simply not play video games. For everything else, my 15yo ThinkPads are plenty snappy.
$250-ish is the median monthly wage in my country. I always cringe when they say this.
My first server was set up on decade-old hardware and ran like a champ. Old hardware is great if you don't get hung up on graphs trying to tell you it's not.