submitted 8 months ago by aCosmicWave@lemm.ee to c/asklemmy@lemmy.ml

Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

[-] kromem@lemmy.world 5 points 8 months ago* (last edited 8 months ago)

The assumption that it isn't designed around memory constraints isn't reasonable.

We have a hard limit on speed, so nothing can move fast enough to cause pop-in.

As you speed up, the slower your internal clock runs, so less processing is needed in spite of more stuff being covered (kind of like a frame-rate drop, but with a fixed total number of frames produced).
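To make the "frame-rate drop" analogy concrete, here's a minimal Python sketch of the special-relativistic time dilation being described; the constant names and the sample speeds are just illustrative, not anything from the thread:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def proper_time(coordinate_time_s: float, speed_m_s: float) -> float:
    """Proper time experienced by a traveler moving at speed_m_s,
    given coordinate_time_s elapsed for a stationary observer:
    dt_proper = dt * sqrt(1 - v^2 / c^2)."""
    beta = speed_m_s / C
    return coordinate_time_s * math.sqrt(1.0 - beta ** 2)

# The faster the traveler, the fewer internal "ticks" need to be
# simulated per second of coordinate time.
for fraction in (0.0, 0.5, 0.9, 0.99):
    dt = proper_time(1.0, fraction * C)
    print(f"{fraction:.2f}c -> {dt:.3f} s of proper time per coordinate second")
```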

The same thing happens as you get closer to denser collections of matter: clocks there run slower too.
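And a similarly rough sketch of the gravitational version (Schwarzschild time dilation): clocks deep in a gravity well tick at a fraction of the far-away rate. The mass and radius below are made-up, neutron-star-ish numbers chosen only to show the effect:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0  # speed of light, m/s

def gravitational_rate(mass_kg: float, radius_m: float) -> float:
    """Fraction of the far-away clock rate at which a clock runs
    at distance radius_m from a (non-rotating) mass:
    sqrt(1 - 2GM / (r c^2))."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (radius_m * C ** 2))

print(gravitational_rate(2.8e30, 1.2e4))  # ~0.81: noticeably slowed near dense matter
```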

And even at the lowest levels, the conversion from a generative function into discrete units for tracking stateful interactions is undone, and the discrete units discarded, if the permanent information about the interaction is erased (as in quantum eraser experiments), which looks a lot like a low-level optimization.
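As a loose computing analogy for that last point (my illustration, not the commenter's), think of a value that stays described by a cheap generative function until an interaction forces a discrete, tracked state, and that state being freed again once no record of the interaction survives:

```python
from typing import Callable, Dict

class LazyField:
    """Toy analogy: values live as a generative function and are only
    materialized into discrete, stateful records while information
    about an interaction persists."""

    def __init__(self, generator: Callable[[float], float]):
        self.generator = generator              # cheap, continuous description
        self._records: Dict[float, float] = {}  # expensive, tracked states

    def interact(self, x: float) -> float:
        # An interaction forces a discrete value to be stored and tracked.
        self._records[x] = self.generator(x)
        return self._records[x]

    def erase_record(self, x: float) -> None:
        # If the permanent information about the interaction is erased,
        # the tracked state is discarded and only the generator remains.
        self._records.pop(x, None)

field = LazyField(lambda x: x * x)
field.interact(3.0)      # state tracked while the record exists
field.erase_record(3.0)  # record erased -> memory reclaimed
```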

The scale is unbelievable, but it's very memory-considerate.
