2.4GHz WiFi is not suitable for two big reasons: interference and low bandwidth. In any kind of suburban or city environment, and sometimes even in rural areas, the 2.4GHz band will be congested with other networks, microwaves, and other appliances, causing massive speed degradation or fluctuations. The range of 2.4GHz is just too large for all the equipment that uses it in today's world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz WiFi networks, while at most 3 can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.
In the US at least, 2.4GHz only has three channels, 1, 6, and 11, that will not interfere with each other. If anyone puts their network between these three channels, it will knock out both the one below and the one above; channel 3, for example, would interfere with both channel 1 and channel 6. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels allowing for much higher throughput. 2.4GHz tops out at 40MHz-wide channels, which in isolation would offer ~400Mbps, but you will never see that in the real world.
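To make the channel math concrete, here's a minimal Python sketch. It assumes the standard 20MHz channel width and the 802.11 center-frequency spacing (channel n sits at 2407 + 5n MHz for channels 1-13); the function names are just mine:

```python
# Why only channels 1, 6, and 11 coexist in the 2.4GHz band.
CHANNEL_WIDTH_MHZ = 20

def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4GHz WiFi channel in MHz (channels 1-13)."""
    return 2407 + 5 * channel

def overlaps(ch_a: int, ch_b: int) -> bool:
    """Two channels interfere if their 20MHz-wide bands overlap at all."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < CHANNEL_WIDTH_MHZ

# Channel 3 vs. the three "clean" channels:
for ch in (1, 6, 11):
    print(f"channel 3 overlaps channel {ch}: {overlaps(3, ch)}")
# -> True, True, False: channel 3 knocks out both 1 and 6.

# The clean set itself is spaced 25MHz apart, so no overlap:
print(overlaps(1, 6), overlaps(6, 11))  # False False
```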
Personally, I think OEMs should just stop including it, or at least have it disabled by default and only enable it in an "advanced settings" area.
Edit: I am actually really surprised at how unpopular this opinion appears to be.
You sound like a US citizen. There are many places in the world where walls are made of concrete, and 5GHz doesn't penetrate concrete.
In such cases, the only way to get 5GHz into every room is to run Cat5 cable inside the walls and place an AP in each room.
Running cable through concrete walls requires a conduit that was placed there when the house was built! And in many cases, the conduits that do exist are too narrow for Cat5 and are already in use anyway.
So to fulfill your idea and still have WiFi, we would need to raze whole cities to the ground and rebuild them.
Unless you are footing the bill and taking care of the CO2 emissions, just learn to disable 2.4GHz on your own router.
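For anyone who wants rough numbers on this: below is a quick Python sketch comparing the two bands. The free-space term is the standard FSPL formula; the per-wall losses are illustrative ballpark values I'm assuming, since real concrete attenuation varies a lot with thickness and rebar.

```python
# Rough sketch of why 5GHz struggles through concrete.
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in meters, frequency in MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# Illustrative per-wall attenuation for a thick concrete wall (assumption).
WALL_LOSS_DB = {"2.4GHz": 12.0, "5GHz": 25.0}
FREQ_MHZ = {"2.4GHz": 2437.0, "5GHz": 5180.0}

distance_m, walls = 10.0, 2
for band in ("2.4GHz", "5GHz"):
    loss = fspl_db(distance_m, FREQ_MHZ[band]) + walls * WALL_LOSS_DB[band]
    print(f"{band}: ~{loss:.0f} dB total loss over {distance_m:.0f}m, {walls} walls")
# 5GHz starts ~6-7dB behind in free space and loses far more per wall,
# so two concrete walls can be the difference between usable and dead.
```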
Cat5 is essentially dead. I highly recommend Cat6/6a as a minimum, or Cat8. The world is beginning to switch to multi-gig Ethernet, and Cat5 is simply insufficient for that.
Yes, it will work at gigabit speeds, and most things you do will not require more than gigabit, but who knows what we will be running in 10 years. Cat6 can handle 10-gig over a pretty good distance, which should be sufficient until it needs to be completely replaced.
That being said, unless you are currently running a multi-gig Ethernet setup and hitting bandwidth limitations on Cat5 or Cat5e, there is no need to pull and replace what is already there. This advice is for new deployments.
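If it helps anyone planning a run, here's a little Python cheat sheet of the rated speeds per category (figures per the IEEE 802.3 standards); the recommend() helper just mirrors the advice above and is my own framing:

```python
# category: (rated speed, note) -- a planning sketch, not an install guide.
CABLE_GUIDE = {
    "Cat5e": ("1GbE at 100m (2.5GbE usually works)", "fine for existing runs"),
    "Cat6":  ("1GbE at 100m, 10GbE up to ~55m",      "good default for new runs"),
    "Cat6a": ("10GbE at 100m",                        "future-proof choice"),
    "Cat8":  ("25/40GbE up to ~30m",                  "short runs only"),
}

def recommend(new_deployment: bool, hitting_limits: bool) -> str:
    """Only re-pull cable when it actually matters."""
    if new_deployment:
        return "Pull Cat6 or Cat6a"
    if hitting_limits:
        return "Replace the limiting runs with Cat6a"
    return "Keep the existing Cat5e"

for cat, (speed, note) in CABLE_GUIDE.items():
    print(f"{cat:5s}  {speed:38s}  {note}")
print(recommend(new_deployment=True, hitting_limits=False))
```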
I agree with the sentiment, but I think Cat5 is enough for an at-home deployment. My edge device isn't saturating 1Gb now, and it won't need 10Gb in ten years. Mostly because it may be cheaper to replace the cable when needed than to deploy it now for future-proofing.
For offices and such I agree, as disrupting work for a few days may cost more than future-proofing the network.