[-] BlameThePeacock@lemmy.ca -3 points 2 days ago

I mean, the text on a website isn't what makes it unusable over 56k.

It's only images and video that take up real space, and the libraries websites use are all cached at this point, so they're hardly relevant to ongoing use of a site.

[-] ripcord@lemmy.world 3 points 1 day ago

You're underestimating the text part a lot. The sheer amount of stuff downloaded for most sites is insane. It's not just the raw data (although that's still pretty significant, especially when things haven't been cached yet).

But there are often HUNDREDS of resources loaded, each of which needs a GET request even just to revalidate the cache, and revalidation often fails. Some can be done in parallel. All of it requires a bunch of shitty slow HTML/CSS/JS compute. It's stupid. It's why loading a page on a 16-core system with a gigabit internet link still takes 5+ seconds instead of the ~200ms it should. Which adds up.
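To make the revalidation point concrete, here's a minimal sketch of how an HTTP conditional GET behaves: even when the cache is still valid, the client pays a full round trip, and on a mismatch it pays the round trip plus the whole body. The function, the ETag values, and the header-overhead figure are all illustrative assumptions, not measurements.

```python
# Sketch of HTTP cache revalidation cost (illustrative numbers).
# A conditional GET sends the cached ETag; the server answers 304
# (headers only) if it matches, or 200 with the full body if not.

def revalidate(client_etag: str, server_etag: str, body_bytes: int):
    """Return (status, approx bytes on the wire) for one conditional GET."""
    HEADER_OVERHEAD = 500  # rough combined size of request + response headers
    if client_etag == server_etag:
        return 304, HEADER_OVERHEAD            # cache hit: still one round trip
    return 200, HEADER_OVERHEAD + body_bytes   # cache miss: full re-download

print(revalidate('"abc123"', '"abc123"', 80_000))  # (304, 500)
print(revalidate('"abc123"', '"def456"', 80_000))  # (200, 80500)
```

Either way the client waits at least one round trip per resource, which is exactly why hundreds of GETs hurt even when almost everything is cached.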

Now do that over 56k.
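A back-of-envelope calculation shows what "over 56k" means for a typical modern page. The page size, request count, parallelism, and dial-up latency below are illustrative assumptions, not measurements of any particular site.

```python
# Rough load-time estimate for a modern page over a 56k modem.
# All figures are assumptions for illustration.

LINK_BPS = 56_000 / 8     # 56 kbit/s ~= 7,000 bytes/s of throughput
RTT = 0.15                # ~150 ms round-trip latency on dial-up
REQUESTS = 100            # resources fetched for one page
PARALLEL = 6              # concurrent connections per host
PAGE_BYTES = 2_000_000    # ~2 MB total transfer

transfer_time = PAGE_BYTES / LINK_BPS        # raw download time
request_time = (REQUESTS / PARALLEL) * RTT   # round trips, batched by parallelism
total = transfer_time + request_time
print(f"~{total / 60:.0f} minutes")          # roughly 5 minutes for one page
```

Under these assumptions the bandwidth alone costs nearly five minutes, and that's before any of the HTML/CSS/JS compute on the client.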

this post was submitted on 13 Dec 2025
45 points (89.5% liked)

Programming
