While moving from one nest to another (we're lemmings here; RP it a bit) I realized I still have all the computers I've ever bought or assembled, except for those that literally broke beyond any hope of repair.
Some are no longer used daily, but they all work. Being at a point in life where everything and anything in the nest needs to have a purpose or a function, I started thinking about what actually renders a computer useless or truly obsolete.
I was made even more aware of this because I'm in the market to assemble a new machine, and I'm seeing used ones - 3 or 4 years old - being sold at what is essentially store price, with specs capable of running newly released games.
Meanwhile, I'm looking at two LGA 775 motherboards I have and wondering how hard I can push them before they spontaneously combust, just to get some use out of them, even if only as typewriters.
So, per the title, what makes a computer obsolete or simply unusable to you?
Addition
So I felt it necessary to update the post and list the main reasons surfacing for a machine being rendered obsolete/unusable:
overall power consumption vs computational power
Linux rules!
I have this "rule", which might be a bit dated, that 1 watt of continuous draw costs roughly 1€ per year (and it's only getting worse).
So over, say, 5 years (a somewhat reasonable lifetime today, I think), a 180 W PC used 8 h/day averages 60 W, which works out to about 300€ in power.
An older PC with a power-hungry GPU could draw 400 W => about 666€.
A ThinkPad (ok, it doesn't have a gaming GPU) would be more like 50€, and a good used one can be had for 200-300€.
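The rule of thumb above can be sketched in a few lines of Python. The 30 W figure for the laptop is my assumption for a ThinkPad's average draw; it's what makes the 50€ figure come out.

```python
# Back-of-envelope power cost, assuming the "1 W of continuous
# draw ~= 1 euro/year" rule, scaled by the actual duty cycle.
def power_cost_eur(watts, hours_per_day, years, eur_per_watt_year=1.0):
    # hours_per_day / 24 converts nameplate draw to average continuous draw
    return watts * (hours_per_day / 24) * eur_per_watt_year * years

print(power_cost_eur(180, 8, 5))  # 180 W desktop, 8 h/day, 5 years -> ~300 euros
print(power_cost_eur(400, 8, 5))  # power-hungry GPU rig -> ~666 euros
print(power_cost_eur(30, 8, 5))   # ~30 W laptop (assumed) -> ~50 euros
```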
You can also get a 4th- to 8th-gen Dell tower for 40-140€, add a cheap GPU, and you'll have a Roblox or even Minecraft PC.
If you buy a brand new PC, then no, it (most probably) won't be an economical investment as far as power use goes. But old PCs suck (draw) power, and at some point it probably becomes economically viable to swap one for a more recent machine.
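Using the same rule of thumb, a rough break-even sketch. All numbers here are illustrative assumptions (a 400 W old rig replaced by a 180 W machine costing 300€), not measurements:

```python
# How long until a more efficient replacement pays for itself,
# under the "1 W continuous ~= 1 euro/year" rule of thumb.
def breakeven_years(old_watts, new_watts, hours_per_day, new_pc_price_eur):
    # euros saved per year from the reduced average draw
    yearly_savings = (old_watts - new_watts) * (hours_per_day / 24)
    return new_pc_price_eur / yearly_savings

print(breakeven_years(400, 180, 8, 300))  # ~4 years to recoup a 300 euro machine
```

So the crossover exists, but it's years out; it only makes sense when the old machine is a real power hog.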
Which reasonable PC uses 400W?