I work in tech.
My dad was a teacher; his subject was computers, and at that time "computers" class was heavily programming. Basic stuff.
It seems that kids from Gen X and the millennial generation had the timing to learn tech before it "just works", so we're used to figuring things out as we go; there was no way to look it up on the internet, so we had to.
Zoomers and younger generations are largely "it just works" users; for them, getting things to plug and play was just how it was. If it didn't work, it was either "incompatible" or broken. So don't try to make it work, or you'll be sued for DMCA-related violations.
IMO, there's a sweet spot, somewhere from the late '70s or early '80s to about the early-to-mid 2000s, when people had to know something about tech to operate it. Anyone with an aptitude for tech who was born during that window is generally working in tech.
People born before that are generally the old-school pen-and-paper types, and anyone younger is generally from the plug-and-play digital era.
Of course, everyone is different, so the dates are liable to vary depending on the area, and each person may have different motivations, etc.
My generation (early millennials) is generally known for being the "tech" person to friends/family, and for ADHD; at least as far as I can see from my little bubble of friends, who mostly work in/with tech.
Yup, agree with this.
And this is why I'm teaching my kids computer stuff. We haven't gotten too crazy with it, but my kids have built some stuff in Scratch and helped me assemble my PC (they'll assemble their own), with me explaining what the main bits do. I also intend to do some basic Arduino-type stuff w/ them once I get started w/ home automation (I have a NAS and some apps, but no sensors or anything cool like that).
They'll probably never need that knowledge, but having the ability to reason about a problem using some foundational knowledge should be useful regardless of what they do (e.g. why isn't this working? I'll check the wires, run a simpler test, etc.).
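For what it's worth, the "run a simpler test" idea is exactly what I'm aiming for with the Arduino stuff. Here's a minimal sketch of the kind of wiring check I have in mind (assuming a generic Uno-style board with an LED on pin 13 and a button wired from pin 2 to ground; the pin numbers are just placeholders):

    // Minimal "is the wiring actually right?" check for an Uno-style board.
    // Assumes an LED on pin 13 and a pushbutton between pin 2 and ground.
    const int LED_PIN = 13;
    const int BUTTON_PIN = 2;

    void setup() {
      pinMode(LED_PIN, OUTPUT);
      pinMode(BUTTON_PIN, INPUT_PULLUP);  // internal pull-up, so unpressed reads HIGH
      Serial.begin(9600);                 // lets us watch what the board thinks is happening
    }

    void loop() {
      bool pressed = (digitalRead(BUTTON_PIN) == LOW);  // pressing pulls the pin to ground
      digitalWrite(LED_PIN, pressed ? HIGH : LOW);      // LED mirrors the button
      Serial.println(pressed ? "button pressed" : "button released");
      delay(100);
    }

If the LED doesn't follow the button, the problem is in the wiring rather than the code, and that kind of narrowing-down is the habit I'm really trying to teach.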
But do they have to set jumpers on the motherboard to choose the processor voltage?
And make sure the IRQs on their sound card and printer don't conflict?
"your sound card works perfectly"
It really whips the llama's ass
I think they open sourced that recently... I should take a look.
Kids these days don't know how good they have it...
They probably never will.
I don't think that's a bad thing. We made it easier, and they're reaping the benefits of our work.
The only issue I see is that when it breaks, nobody will know how to fix it; we've abstracted all the complexity away from users, so they don't understand the underlying processes that have to work for the thing to function.
Other than that, it just works.
That's a pretty big issue, and it's likely a huge contributor to problems like e-waste. If someone owns a computer and the memory goes bad, they buy a new computer instead of new RAM. Likewise with batteries on phones, capacitors on appliance circuit boards, etc. So much that used to be regularly repairable could still be repairable if people understood the basics of the tech they use. But when something stops working, the knee-jerk reaction is to replace it, not repair it, especially when it's generally cheaper to replace than to have a service tech come out (whereas 50 years ago, many people would just repair it themselves using the provided service manual).
I like to blame manufacturers here, but a large part of me has to acknowledge that a lot of people wouldn't bother even if they had all the documentation readily available. A little bit of knowledge about how things actually work can go a long way in reducing waste throughout society.
Yep, pretty much this. I grew up with computers. The first one I used was a C64 in school. We got our first family PC in 1996. I was 14 back then.
If you wanted to do basically anything, you had to figure it out or read an actual manual. We had to fight with drivers and such in order to get any game or device working. It was part of the fun; you had to be nerdy to want to do that.
Nowadays, even my completely tech illiterate dad can use an iPad to browse, e-mail, stream stuff and connect on social media.
To be clear: my dad phoned me this morning asking how he could set the time on his digital Casio watch. And he's using an iPad!! That's how easy we've managed to make tech; even a toddler can use it.
I feel very lucky that I grew up with tech and can solve most problems on my own.
Yep. With my dad teaching computers, we always had one in the house. I started on DOS, and I've used most versions of Microsoft operating systems since then.
I've built computers, upgraded, modified, tweaked and nerded out over low level settings and optimizations....
At this point, I can do all of that. I choose to simply buy something off a shelf because I can't be bothered to do everything that's needed to get my system working perfectly. Someone else has done the engineering to make their PCs operate efficiently, so I'll just let them do the hard work and pay slightly more for my system so I don't have to think about it.
Once the warranty is up, and something goes wrong, I'll be in there with a multimeter and soldering iron to fix it if I have to....
When I went to install a suite of emulators on the Steam Deck, it was one installer and some light configuration. Installing ROMs involved using an app that automatically digs up the box art and adds console collections to the Steam interface.
All of this largely just worked.
As a millennial, that was wild. I've never trusted things to just work before, but a bunch of open-source devs made it happen. That's what made me realize we live in different times (and why newer generations have no idea how to actually use computers).
BORN INTO WINDOWS 3, STAYED FOR 7, FORCED TO 11.
I've used most versions of Windows since 3.11. I didn't bother going backwards because, as far as I'm concerned, before 3.11 it was better to use DOS. Since then I've used 95, 98, ME, 2000, XP, Vista, 7, 8, 10, and of course 11.
About the only one I "missed" was NT, and I'm not unhappy about that. My notes are: 3.11 was basically just an application running on DOS, which was fine, but it wasn't really improving much; few applications supported Windows at that point, so there was little reason to have or use it. 95 was hot garbage at launch and did not improve much over time; however, it was such a drastic change from DOS/3.11 that it was the best we could have hoped for at the time. 98 was forgettable, with very little improvement over 95, at least until 98 SE came out and added USB support, which changed a lot of things. ME was fine for the most part; they put too much emphasis on making it look better without making significant improvements beyond that, but it was fine and stable after a few rounds of updates.
XP was the favorite for most; I saw it as Windows 2000 with makeup. That said, the biggest improvement here was the changeover of the consumer line to the NT kernel, something we still use today. Windows 2000 was a favorite of mine; it was visually simpler than ME/XP, but all the functionality you needed was there. It was fairly barebones, but that allowed Windows to take a back seat to whatever you were actually using the computer for.
Vista was hated, but not because it was actually bad. The problem with Vista was that the system requirements to run Windows shot up significantly with Aero. At the same time, Microsoft introduced new driver security requirements, so many older devices built for XP, which were more or less abandoned, never got drivers that met the security constraints added in Vista. Vista also launched around the netbook era, when "a computer for every child" was a thing. The hardware was trending towards less powerful, cheaper chips while Vista demanded much more from the hardware, creating a perfect storm of people buying Celeron systems pre-installed with Vista and having a very bad time. Anyone using a Core/Core 2/first-gen Core i* chip had a lot fewer problems.
When Windows 7 launched, most people had abandoned Celeron as a product, and most hardware manufacturers were distributing drivers with the extra security needed for Vista (which 7 also required), so everything went smoothly and 7 became the next favorite. I don't have any complaints about 7, and I would be happy to keep using it if it weren't abandonware.
Windows 8 was a solution looking for a problem. This was the era of Android Honeycomb, the odd version of Android made exclusively for tablets. Microsoft seemed to think it was a good idea to do the same; however, sales of Windows tablet systems were fairly paltry overall, so forcing everyone into a tablet-optimized interface proved to be a bad idea. They "fixed" it with 8.1, and nobody cared. I had purchased a Microsoft Surface Pro 3 at the time, which came pre-installed with Windows 8, and I found that it was fine, but it was both a lackluster tablet and a fairly bad laptop; it was an in-between hybrid that was (again) a solution looking for a problem. Despite having one of the "more powerful" Pro 3 units (I think I had the second-from-the-top SKU, Core i5), it frequently overheated, making it uncomfortable to use as a tablet, and due to thermal throttling it was not performant as a laptop. It was a nice idea, executed poorly, solving a problem that nobody had.
10, in my opinion, is the gold standard. At least, until they started loading Windows up with spyware. The tracking, advertising-ID garbage, and similar were basically the worst parts of Windows 10; everything else was essentially a return to form and function. To me it was like an evolution of Windows 2000: not many frills, and Windows mostly fades into the background so you can focus on what you're trying to accomplish.
11 is trying to overhaul your experience, and doing so badly. The Control Panel, apps, and even the right-click menu are all being done differently... They're pushing you to do it the "new" Microsoft way, and so far, I haven't met anyone who prefers anything that way.
IMO, 11 is a lot of Microsoft shoving terrible options in your face by default and whispering in your ear "you know you like it like that"
No, we don't. Fuck off with your bullshit, fuck "new" Teams, fuck "new" Outlook, fuck everything you're slapping a "new" label on. We don't want this.
Windows 11 is the best advertisement for Linux and Mac products so far.
My housemate is completely incapable of installing mods without using a mod manager, so when my subscription to Vortex lapsed he wanted my help, and I was like, "look.... just read the fucking instructions, man, odds are it'll tell you exactly what to do."
The old RTFM technique, classic.