While moving from one nest to another (we're lemmings here; RP it a bit) I realized I still have all the computers I ever bought or assembled, except for those that literally broke beyond any hope of repair.
Some are no longer used daily, but they all work, and being at a point in life where everything and anything in the nest needs to have a purpose or a function led me to think about what actually renders a computer useless or truly obsolete.
I became even more aware of this as I'm in the market to assemble a new machine, and I'm seeing used ones - 3 or 4 years old - being sold at what can be considered store price, with specs capable of running newly released games.
Meanwhile, I'm looking at the two LGA 775 motherboards I have and wondering how hard I can push them before they spontaneously combust, just to get some use out of them, even if only as typewriters.
So, per the title, what makes a computer obsolete or simply unusable to you?
Addition
So I felt it necessary to update the post and list the main reasons that surfaced for rendering a machine obsolete/unusable:
overall power consumption vs. computational power
Linux rules!
At the physical level: capacitors age and blow up, batteries stop charging.
At the efficiency level: when the work you want to do uses more energy on an older platform than on a newer one (see the back-of-the-envelope sketch after this list).
At the convenience level: when a newer device is so convenient that you never use the old one; phone versus desktop is the example for most people.
At the reliability level: when you're constantly replacing parts on a unit, to the point where keeping it running becomes a part-time job.
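To make the efficiency point concrete, here's a minimal back-of-the-envelope sketch in Python. Every number in it is a made-up placeholder, not a measurement; swap in your own wattage, usage hours and electricity price to see when a more efficient machine actually pays for itself.

```python
# Payback estimate for replacing an old machine with a more efficient one.
# All values below are hypothetical placeholders - measure your own.

OLD_WATTS = 180.0         # assumed average draw of the old box under typical use
NEW_WATTS = 60.0          # assumed average draw of a modern replacement
HOURS_PER_DAY = 6.0       # assumed daily usage
PRICE_PER_KWH = 0.30      # assumed electricity price, in your currency
NEW_MACHINE_COST = 600.0  # assumed price of the replacement

# Energy saved per year, in kWh
kwh_saved_per_year = (OLD_WATTS - NEW_WATTS) / 1000.0 * HOURS_PER_DAY * 365

# Money saved per year, and years until the new machine pays for itself
savings_per_year = kwh_saved_per_year * PRICE_PER_KWH
payback_years = NEW_MACHINE_COST / savings_per_year

print(f"~{kwh_saved_per_year:.0f} kWh saved per year")
print(f"~{savings_per_year:.2f} saved per year on electricity")
print(f"payback in ~{payback_years:.1f} years")
```

With those placeholder numbers the payback is roughly seven to eight years, which is why efficiency alone tends to obsolete only the machines that run long hours every day.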
The longest-used devices tend to be embedded industrial devices. They have a job, they keep doing that job, and until they break they will keep doing that job forever. That's application-specific computing.
Most home users are general-purpose computer users, so they have a mix of different requirements, support needs and use cases. I, for one, still use a 10-year-old laptop, and it's totally fine.