submitted 3 months ago by sag@lemm.ee to c/opensource@lemmy.ml
[-] princessnorah@lemmy.blahaj.zone 2 points 3 months ago

It really depends. Once every 1-5 minutes, sure, maybe. Once every 1-5 hours tho? You're likely fine.

[-] MangoPenguin@lemmy.blahaj.zone 3 points 3 months ago* (last edited 3 months ago)

True, although once per hour would still be a lot of data.

For example, running a single fast.com test uses about 1.5GB of data for me, so around 1TB per month if run hourly.
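The back-of-the-envelope math above checks out; a quick sketch (assuming a 30-day month):

```python
# Quick check of the figures above: 1.5 GB per test, run hourly.
GB_PER_TEST = 1.5
TESTS_PER_DAY = 24
DAYS_PER_MONTH = 30  # assumption: 30-day month

monthly_gb = GB_PER_TEST * TESTS_PER_DAY * DAYS_PER_MONTH
print(monthly_gb)  # 1080.0 GB -- roughly 1 TB
```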

[-] princessnorah@lemmy.blahaj.zone 1 points 3 months ago

Once every six hours would only be about 180GB a month. A script that tests every six hours, but increases the frequency if the result drops below a certain threshold, could work well. I guess it all depends on how accurate you need the data to be.
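The adaptive schedule described above could be sketched like this. Note this is a hypothetical sketch: `run_speed_test` is a placeholder (swap in e.g. a call to the `speedtest` package or a CLI tool), and the threshold and intervals are made-up values.

```python
import time

BASE_INTERVAL = 6 * 60 * 60   # 6 hours, in seconds
FAST_INTERVAL = 1 * 60 * 60   # 1 hour while speeds look degraded
THRESHOLD_MBPS = 100          # assumption: below this, test more often

def next_interval(speed_mbps: float) -> int:
    """Shorten the wait between tests while speed is below the threshold."""
    return FAST_INTERVAL if speed_mbps < THRESHOLD_MBPS else BASE_INTERVAL

def monitor(run_speed_test):
    # run_speed_test is a placeholder returning measured Mbps.
    while True:
        speed = run_speed_test()
        time.sleep(next_interval(speed))
```

At roughly four tests a day, with occasional bursts of hourly tests during slowdowns, this stays well under the 180GB/month figure while still catching degraded periods in detail.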

this post was submitted on 12 Aug 2024
466 points (98.5% liked)
