
Hi,

I have a friend who is looking to run a few simulations he has implemented in Python and needs around 256 GB of RAM. He estimates it will take a couple of hours, but he is studying economics, so take that with a grain of salt 🤣

For this instance, I recommended GCP, but I felt a bit dirty doing that. So I was wondering if any of you have a buttload of memory he can borrow? Generally, would you lend your RAM for a short amount of time to a stranger over the internet? (assuming internet access is limited to a single SSH port, and other necessary safeguards are in place)

top 22 comments
[-] HelloRoot@lemy.lol 6 points 1 month ago* (last edited 1 month ago)

Why not get a 0.5 or 1 TB NVMe SSD and set it all as swap?

It will probably run about 10 times slower, but it's cheap and doable.
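The same "let the SSD stand in for RAM" idea can even be prototyped from inside Python with a disk-backed buffer. A minimal stdlib sketch (tiny size for demo purposes; on a real NVMe drive you'd scale `SIZE` way up), not a substitute for actual swap:

```python
import mmap
import os
import tempfile

# Hypothetical sketch: instead of OS swap, back a huge buffer with a file
# on the NVMe drive via mmap.  The OS pages data in and out on demand, so
# the resident working set stays small even if the file is hundreds of GB.
SIZE = 16 * 1024 * 1024  # 16 MiB for the demo; scale up on a real SSD

path = os.path.join(tempfile.mkdtemp(), "scratch.bin")
with open(path, "wb") as f:
    f.truncate(SIZE)  # sparse file: blocks are only allocated when written

with open(path, "r+b") as f:
    buf = mmap.mmap(f.fileno(), SIZE)
    buf[0:5] = b"hello"           # writes land in the page cache
    buf[SIZE - 1:SIZE] = b"\x01"  # touching the far end faults in one page
    print(buf[0:5])               # -> b'hello'
    buf.flush()
    buf.close()
```

The OS only keeps the touched pages resident, so the process can address far more data than fits in RAM, at SSD speed.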

[-] dgdft@lemmy.world 3 points 1 month ago* (last edited 1 month ago)

This is the way.

Depending on the nature of the sim, it could probably even be done with ~80 GB or less of existing SSD space using zram w/ zstd.
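Whether that ~80 GB figure holds depends entirely on how compressible the sim's state is. A rough stdlib sketch of the feasibility check (zlib standing in for zstd, since the stdlib has no zstd binding; zstd typically compresses at least as well and much faster; the sample data here is made up):

```python
import os
import zlib

# Back-of-envelope check: the compression ratio on a real sample of the
# simulation's state tells you whether ~256 GB could fit in ~80 GB of zram.
def estimated_ratio(sample: bytes) -> float:
    return len(sample) / len(zlib.compress(sample, 6))

ratio_zero = estimated_ratio(bytes(8_000_000))       # zero-filled: best case
ratio_rand = estimated_ratio(os.urandom(1_000_000))  # random: worst case
print(f"zeros: {ratio_zero:.0f}x, random: {ratio_rand:.2f}x")
```

Run it on an actual chunk of the sim's data; anything near 3x or better makes the ~80 GB estimate plausible.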

[-] cevn@lemmy.world 5 points 1 month ago

Needing that much RAM is usually a red flag that the algo is not optimized.

[-] scrubbles@poptalk.scrubbles.tech 8 points 1 month ago

Researchers always make some of the worst coders unfortunately.

Scientists, pair up with an engineer to implement your code. You'll thank yourself later.

[-] DaPorkchop_@lemmy.ml 5 points 1 month ago

True, but there are also some legitimate applications for hundreds of gigabytes of RAM. I've been working on a thing for processing historical OpenStreetMap data, and it is a few orders of magnitude faster to fill the database by loading the ~300 GiB of point data into memory, sorting it in memory, and then partitioning and compressing it into pre-sorted table files which RocksDB can ingest directly without additional processing. I had to get 24×16 GiB of RAM to do that, though.
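At toy scale, the pattern looks roughly like this (hypothetical record layout; the real pipeline writes RocksDB SST files, not TSV):

```python
import os
import tempfile

# Sketch of the workflow described above: one big in-memory sort, then
# pre-sorted partition files that a store like RocksDB can ingest without
# re-sorting them itself.
def partition_sorted(records, chunk_size, out_dir):
    records.sort()  # the step that wants everything in RAM at once
    paths = []
    for i in range(0, len(records), chunk_size):
        path = os.path.join(out_dir, f"part-{i // chunk_size:05d}.tsv")
        with open(path, "w") as f:
            for key, value in records[i:i + chunk_size]:
                f.write(f"{key}\t{value}\n")
        paths.append(path)
    return paths

demo = [(3, "c"), (1, "a"), (2, "b"), (5, "e"), (4, "d")]
parts = partition_sorted(demo, 2, tempfile.mkdtemp())
print([os.path.basename(p) for p in parts])
```

Doing the sort once, in memory, is what makes the output files globally ordered, and that's the property the database exploits at ingest time.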

[-] markstos@lemmy.world 0 points 1 month ago

Nope. Some algorithms are fastest when the whole data set is held in memory. You could design it to page data in from disk as needed, but it would be slower.

OpenTripPlanner, for example, holds the entire US road network in memory for fast driving directions, and it uses RAM in that ballpark.
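The speed argument is easy to see in miniature: with an in-memory adjacency dict, every step of the shortest-path search is a hash lookup rather than a disk seek. A toy network with made-up weights:

```python
import heapq

# Minimal sketch of why routing engines keep the whole graph in RAM.
# Tiny hypothetical road network: edges are (neighbor, travel cost).
graph = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}

def shortest(src, dst):
    """Dijkstra's algorithm over the in-memory graph."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return None

print(shortest("A", "D"))  # A -> C -> B -> D = 2 + 1 + 5 = 8
```

Scale the dict up to every road segment in the US and the memory bill lands in the hundreds of gigabytes, but each query stays fast.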

[-] cevn@lemmy.world 1 points 1 month ago

Sure, that is why I said usually. The fact that two people replied with the same OpenStreetMap data set kind of proves my point.

Also, do you need the entire US road system in memory if you are going somewhere 10 minutes away? Seems inefficient, but I am not an expert here. I guess it is one giant graph; if you slice it up, suddenly there are a bunch of loose ends that break the navigation.

[-] markstos@lemmy.world 1 points 1 month ago

I host routing for customers across the US, so yes I need it all. There are ways to solve the problem with less memory but the point is that some problems really do require a huge amount of memory because of data scale and performance requirements.

[-] maxwellfire@lemmy.world 4 points 1 month ago* (last edited 1 month ago)

That's kind of an insane amount of RAM for most simulations. Is this a machine learning thing? Is his Python code just super unoptimized? Is it possible he's making a bunch of big objects and then not freeing the references when he's done with them, so they're never garbage collected?
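For reference, that failure mode (and its fix) is easy to demonstrate with `tracemalloc`:

```python
import gc
import tracemalloc

# Sketch of the leak pattern: results accumulated in a long-lived list
# stay reachable, so their memory can never be reclaimed.
tracemalloc.start()

kept = [bytearray(1_000_000) for _ in range(20)]   # ~20 MB pinned by `kept`
held, _ = tracemalloc.get_traced_memory()

kept.clear()   # drop the references; CPython frees them immediately
gc.collect()   # only needed if the objects formed reference cycles
freed, _ = tracemalloc.get_traced_memory()

print(f"{held // 1_000_000} MB held -> {freed // 1_000_000} MB after clearing")
tracemalloc.stop()
```

If the sim accumulates per-step results it never reads again, `del`-ing or clearing them as it goes can shrink the peak dramatically.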

[-] rumba@lemmy.zip 3 points 1 month ago

AWS has an r4.8xlarge with 244 GB of RAM and 32 vCPUs for $2.13 an hour if they can handle Linux, or $2.81 an hour for Windows.

[-] squaresinger@lemmy.world 3 points 1 month ago

First, define what you are asking for.

Do you want someone to send you a cardboard box full of RAM? Then forget it. Nobody would be stupid enough to lend that much expensive hardware to someone on the internet.

Or are you asking for someone to let you run random code on their PC for a few hours? Then forget it. Nobody would be stupid enough to open "a single SSH port" to someone on the internet to run potential malware on their PC.

That's exactly what cloud platforms are there for; if you don't like Google, get any other cloud provider.

[-] Prime@lemmy.sdf.org 3 points 1 month ago

Apply for compute time at a university cluster. It is free and usually easy.

[-] irmadlad@lemmy.world 2 points 1 month ago

The computer I'm typing on has 96 GB of RAM. Most of my equipment is ancient in PC terms. This one I built about 14 years ago, and I fully stocked it with the cutting-edge tech of the day. My intent was to build an LTS PC, as it were. LOL Back then SLI was the thing, but I've since upgraded the GPU. I have some old stuff in the parts bin too, but it's ancient as well.

[-] WhyJiffie@sh.itjust.works 1 points 1 month ago

That's probably way too much for any sane Python algorithm. And if they can't run it, how do they even know how much is needed?

They should probably only make a prototype in Python and then reimplement it in a compiled language; that should reduce the resource usage massively.
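On the "how do they even know" point: one hedged way to estimate is to run a shrunk prototype at a couple of sizes and extrapolate. `simulate` here is a made-up stand-in for the real model:

```python
import tracemalloc

# Measure peak memory of a small run, then a 10x larger run; if growth is
# roughly linear, the full-size requirement can be extrapolated.
def simulate(n):
    return [(i, i * 0.5) for i in range(n)]  # placeholder workload

def peak_bytes(n):
    tracemalloc.start()
    simulate(n)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak

small, big = peak_bytes(10_000), peak_bytes(100_000)
print(f"10x the input took {big / small:.1f}x the memory")
```

If memory grows faster than linearly with input size, that's worth knowing before renting a 256 GB box.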

[-] metaStatic@kbin.earth 1 points 4 weeks ago

got a bag full of SIMMs, probably not a whole buttload but I don't think even that amount would add up to 256GB

[-] Railcar8095@lemmy.world 1 points 1 month ago

Tell your friend to open-source the algorithm. Somebody will surely point out an easy optimization. 100 others will just shit on your friend.

Borrow it from Newegg, then return it.

[-] OhVenus_Baby@lemmy.ml 3 points 1 month ago

Newegg isn't so bad. Do that to a shitty corporation like Best Buy.

[-] trk@aussie.zone 1 points 1 month ago

I've got 512GB of RAM in my server, and 128GB of RAM on my desktop cause you can never have too much.

[-] Outwit1294@lemmy.today 0 points 1 month ago
[-] HelloRoot@lemy.lol 1 points 1 month ago* (last edited 1 month ago)

put your butt on a scale, convert the result to RAM, duh

[-] paraphrand@lemmy.world 2 points 1 month ago* (last edited 1 month ago)

Yup, that’s some random ass memory.

this post was submitted on 01 Jul 2025
8 points (100.0% liked)

Selfhosted
