My guess is that it's a couple of watts while you're actively using the internet, mostly due to the extra CPU load a few bad ads cause when they're on your screen. Without having done the math, I expect all the servers, data transfer, etc. to be negligible on a per-user basis, because they serve so many users.
That's another interesting thing, btw. Most of the "internet thing X uses Y amount of electricity" claims are utter bullshit and massively exaggerated. What uses the most power on a desktop/TV is the screen. The second biggest consumer is likely your router, which is on whether you use it or not, but studies usually ascribe all of that standby usage to your active usage. That makes sense if you're asking "how much CO2 does all our digital stuff, including having an internet connection, cause?" but not if you're asking "how much extra CO2 does activity X cause, given that I already have an internet connection because I'm not going to live in a cave?"
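Here's a quick sketch of why the attribution method matters so much. The router wattage and hours below are made-up illustrative numbers, not measurements:

```python
# Illustrative assumptions only (not measurements):
# a router drawing a constant 10 W, and 1 hour of streaming per day.
ROUTER_WATTS = 10
HOURS_PER_DAY = 24
STREAMING_HOURS = 1

router_wh_per_day = ROUTER_WATTS * HOURS_PER_DAY  # 240 Wh/day, used or not

# "Total footprint" accounting: blame all router energy on the activity.
per_hour_total = router_wh_per_day / STREAMING_HOURS  # 240 Wh per streaming hour

# "Marginal" accounting: the router runs anyway, so the *extra* energy
# caused by one more hour of streaming is roughly zero (plus a tiny bit
# of extra load, ignored here).
per_hour_marginal = 0

print(f"blamed on streaming (total method):    {per_hour_total:.0f} Wh/h")
print(f"blamed on streaming (marginal method): {per_hour_marginal:.0f} Wh/h")
```

Same router, same usage, wildly different headline number depending on which question you're answering.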
Don't the fans use a lot of power? And wouldn't a datacentre or server need a lot of cooling?
The server uses a kilowatt of power or more (most of it in the CPU). But say the server handles 1000 active users concurrently, and only 5% of your time online is spent fetching ads: then 20,000 people staring at their screens get their ads from, let's say, 2 kW of server power, plus another 2 kW for all the equipment that gets the data there... for a total of about 0.2 watts per user (rough arithmetic sketched below).
These are completely eyeballed numbers, and could easily be off by an order of magnitude.
But your on-premises gear (screen, computer, router) is likely by far the biggest factor.
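The back-of-the-envelope estimate written out. All the inputs are the same eyeballed guesses from above, so treat the output as an order-of-magnitude figure at best:

```python
# Eyeballed assumptions from the comment above, not measured values.
server_watts = 2_000       # server power while serving ads
network_watts = 2_000      # routers/switches/links along the way
concurrent_users = 1_000   # users the server handles at any instant
ad_fraction = 0.05         # share of online time spent fetching ads

# If only 5% of each person's time hits this server, the same capacity
# effectively covers 20x as many people.
people_covered = concurrent_users / ad_fraction           # 20,000 people

watts_per_user = (server_watts + network_watts) / people_covered
print(f"{watts_per_user:.2f} W per user")                 # ~0.20 W per user
```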
One easy way to cross-check power usage claims is cost. It will only catch the most egregious bullshit, but it's easy. A random page I found claims that "According to the American Council for an Energy-Efficient Economy it takes 5.12 kWh of electricity per gigabyte of transferred data."
A 50 GB Steam game would thus consume 256 kWh. Compare: even if a 300-watt idle gaming rig, a 50-watt router, and a 150-watt screen showing the progress bar spend 2 hours on the download, that's only 1 kWh on your end. And even at 8 cents per kWh, 256 kWh means just downloading the game would cost someone (not you) over $20. Do you think Steam would let you delete and redownload a game you bought on sale for $10 as often as you want if, between them and your ISP, someone had to pay over $20 in electricity each time? Not the game rights, not the servers, not the connection, just power.
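The cross-check itself is a couple of lines of arithmetic. The 5.12 kWh/GB figure is the claim being tested, and the 8 ¢/kWh rate is a deliberately cheap assumption, so the implied cost under that claim would only be higher in practice:

```python
# Testing the claimed figure, not endorsing it.
claimed_kwh_per_gb = 5.12   # the "5.12 kWh per GB" claim under test
game_size_gb = 50           # a typical big Steam download
price_per_kwh = 0.08        # deliberately cheap electricity, in USD

energy_kwh = claimed_kwh_per_gb * game_size_gb    # 256 kWh
cost_usd = energy_kwh * price_per_kwh             # ~$20.48 per download

# Your own gear during a 2-hour download, for comparison:
local_kwh = (300 + 50 + 150) / 1000 * 2           # 1 kWh

print(f"claimed energy: {energy_kwh:.0f} kWh, implied cost: ${cost_usd:.2f}")
print(f"your rig/router/screen for 2 h: {local_kwh:.0f} kWh")
```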
Thanks, this makes a lot of sense.
Keep in mind, one of the reasons we use data centers is that cooling one big room of computers is cheaper than cooling 200 small rooms with computers.