This is one of the areas where GOG does way worse than Steam. Even with a 1 GB limit I still have to constantly remove save files from Pathfinder: WotR to make them fit inside the cloud sync. 200 MB is ridiculously small.
On Steam it's per-game configured by the devs, no? Crypt of the Necrodancer tells me it has nearly 100GB space left, while Deep Rock Galactic says it's capped at ~85MB.
Yeah devs get to set the limit. (Source: am a dev w/ a game on steam)
Why would you want to set a smaller limit?
No reason really; there's just no point in setting a super high limit if your save file is a 5 KB text file.
Also, Valve does review the game and might not like an absurdly high limit, but I don't know if they actually care or not.
like being seen by others not putting away the shopping cart; now everyone knows you're incapable of passing the basic decency test
When it's configured by the devs, they can set a limit appropriate to their game's save files. Pathfinder has massive save files (there are even mods that try to reduce the size) compared to most other games, especially linear ones. It seems like GOG is setting a global limit.
A heavily modded Skyrim/FO4 save with hundreds of hours on it can easily go above 200mb.
That's insane, what's making up all that data?
Poor optimization of save files probably
Must be, if there's no real limit then why would they bother?
Most games never hit anywhere near that, but some large open-world RPGs like Skyrim track the location of every single object in the game world. Like, you can drop a piece of cheese in the bottom left corner of the map, come back 500 hours later, and it'll still be there. Now imagine all of the objects you're buying, selling, and manipulating over those hundreds of hours. Now add in a shit ton of script mods and other stuff that may add even more objects. And add in all of the quest data and interaction data that gets saved, etc., and your save folder can easily hit multiple gigabytes, with each file approaching 200 MB.
It still feels like it should be orders of magnitude less. For example, if each piece of cheese has an ID number that maps to cheese, an ID for what area it's in, three coordinates for where exactly it is, and maybe a few more variables like how much of it you've eaten. Each of those variables is probably only a couple of bytes, so each item is probably only 20B or so, which means that even if you interacted with a million different items and there was no compression going on then that's still only 20MB of save data.
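That back-of-the-envelope math checks out. A quick sketch in Python (the field layout is invented for illustration, not any real engine's format):

```python
import struct

# Hypothetical packed record for one world object:
# item ID (u32), area ID (u16), x/y/z position (3 × i32), fraction eaten (u8)
ITEM_FORMAT = "<IH3iB"  # little-endian, no padding
record_size = struct.calcsize(ITEM_FORMAT)

packed = struct.pack(ITEM_FORMAT, 1234, 42, 13872, -17312, -20170, 0)
assert len(packed) == record_size

print(f"{record_size} B per object, "
      f"{1_000_000 * record_size / 1_000_000:.0f} MB for a million objects")
# → 19 B per object, 19 MB for a million objects
```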
Bold of you to assume the data in save files is packed binary and not something like JSON where { "x": 13872, "y": -17312, "z": -20170 } requires 40 bytes of storage.
Agreed. JSON solves:
- the 'versioning' problem, where the data fields change after an update. That's a nightmare with packed binary; you need to write so much code to handle it.
- makes debugging persistence issues easy for developers
- very fast libraries exist for reading and writing it
- actually compresses pretty damn well; you can pass the compress + write to a background thread once you've done the fast serialisation, anyway.
For saving games, JSON+gzip is such a good combination that I'd probably never consider anything else.
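A rough sketch of that JSON+gzip pipeline (the field names and object count are made up):

```python
import gzip
import json
import random

random.seed(0)
# Hypothetical save data: positions for 10,000 tracked world objects
objects = [
    {"id": i,
     "x": random.randint(-30000, 30000),
     "y": random.randint(-30000, 30000),
     "z": random.randint(-30000, 30000)}
    for i in range(10_000)
]

raw = json.dumps(objects).encode("utf-8")  # the fast serialisation step
compressed = gzip.compress(raw)            # compress + write can go to a background thread

print(f"JSON: {len(raw):,} B, gzipped: {len(compressed):,} B "
      f"({len(compressed) / len(raw):.0%} of original)")
```

The endlessly repeated key names are exactly the kind of redundancy gzip eats for breakfast.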
protobuf does all of these (well, except compression, which you don't need)
That's excusable in My First Game™ but surely professional AAAAA game would never cut corners and code something so lazily, eh?
It's not really laziness. Storing as JSON solves or prevents a lot of problems you could run into with something bespoke and "optimally packed", you just have the tradeoff of needing more storage for it. Even then, the increased storage can be largely mitigated with compression. JSON compresses very well.
The problem is usually what they're storing, not how they're storing it. For example, The Witcher (first one) has ~20MB save files. These are mostly a bespoke packed binary format, but contain things like raw strings of descriptions in multiple localisations for items being carried, and complete descriptors of game quests. Things that should just be ID values that point to that data in the game files. It also leads with like... 13KB of zero-padding for some reason.
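The difference is easy to show in miniature (a sketch with invented item data, not The Witcher's actual format):

```python
import json

# Static game data shipped with the install; the save only needs the key
ITEM_DB = {
    "sword_steel": {
        "name": {"en": "Steel Sword", "de": "Stahlschwert", "pl": "Stalowy miecz"},
        "desc": {"en": "A sturdy, well-balanced blade.",
                 "de": "Eine robuste, gut ausbalancierte Klinge.",
                 "pl": "Solidna, dobrze wyważona klinga."},
    },
}

# Embedding every localisation inline, per carried item:
bloated = json.dumps({"item": ITEM_DB["sword_steel"], "count": 1})
# Storing just an ID that points back into the game files:
lean = json.dumps({"item": "sword_steel", "count": 1})

print(len(bloated), "bytes vs", len(lean), "bytes per inventory entry")
```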
Good points!
looking at you x3
and rimworld
Save bloat is more often related to excess values not being properly discarded by the engine, if I remember right. So it's not that the objects themselves take up a lot of space, but the leftover data gets baked into the save and can end up multiplying if the same scripts/references/functions get called frequently.
It was a lot worse with Skyrim's original engine, and got better in Fallout 4 and Skyrim SE. The worst bloat happens with heavy modlists, of course, as they're most likely to have poor data management in some mod.
Aha, so unexpectedly it's bad/inefficient code that's ultimately to blame
I wouldn't say bad, but inefficient might be fair. Unoptimized I think is more representative.
Inefficient/unoptimized would be an accurate description. I think it's important to add, for bethsoft games specifically, that the save includes all changes to objects, even if the player themselves didn't interact with them (e.g. physics interactions, explosions moving things, NPCs bumping stuff around), and also includes all NPC changes. Master files (ESMs) get loaded, then the save loads the changes it has baked in to the databases. So, when you load up a save that has traveled the world and loaded a lot of things into save memory, the engine has to sit there and reconcile all the changes with the ESMs, which can add up quick if you're playing modded.
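That load process can be pictured as base records plus saved deltas. A toy sketch (not the actual ESM record format):

```python
# Master file (ESM): baseline state of every placed object
base_records = {
    "cheese_01": {"pos": (100, 200, 10)},
    "crate_07":  {"pos": (340, -90, 0)},
}

# Save file: only the changes baked in since the masters were authored
save_deltas = {
    "cheese_01": {"pos": (105, 198, 10)},  # nudged by physics, never touched by the player
}

def load_world(base, deltas):
    """Reconcile saved changes against the master records on load."""
    world = {ref: dict(rec) for ref, rec in base.items()}
    for ref, change in deltas.items():
        world.setdefault(ref, {}).update(change)
    return world

world = load_world(base_records, save_deltas)
print(world["cheese_01"]["pos"])  # the saved position wins over the ESM baseline
```

The more objects a playthrough (or a badly behaved mod) has touched, the bigger that delta table gets, which is why heavily modded saves balloon.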
Each object also needs the orientation, possibly also velocity and angular rates.
Yeah that's why I rounded up a bit. But even if there's triple the amount of cheese data then a million cheeses is still only 60MB
Variables.
Of course! But wouldn't it save space if these variables used zeroes instead of ones?
They would remain 0 until flipped. They did say heavily modded, so if you added more quests to the game, these all need some way of knowing which legs have been completed. The more modded quests completed, the bigger the save. And that's just one thing a save would keep track of.
Is it? Pathfinder seems more like the exception than the rule. I've got a big library on GOG and none of my games even reach a quarter of the 200 MB limit.
I should probably have been clearer that it's extremely small for Pathfinder. And since GOG is setting a global limit and they're selling Pathfinder on their storefront, their global limit is too small.
I'd more argue that the game company should be finding ways to reduce their save file size. 1GB seems ludicrous, though I don't know the system enough to know the technical reasons behind that. This is still a strange business decision for GOG as they don't have the market share to move the needle, the games affected will just sell less on their store until the game company doesn't even bother with it.
While I do agree that Owlcat could do a better job with their save file system, from the point of view of the consumer it shouldn't be their problem. If GOG sells their games and offers cloud sync, they should provide an adequate amount of space. Storage is relatively cheap.
Are you forgetting the files are also taking up space on the consumers' drives?
Edit: additionally, a larger save file size typically (but definitely not always) means longer loading times. There are tangible consumer benefits to reducing save file size.
I think you misunderstood me; I never claimed that excessive file size can't be a problem for the end user. I was saying that, in regards to cloud saves, large file sizes shouldn't be a problem for the end user. It's a problem GOG should take up with the developers they allow to sell on their storefront with GOG's advertised feature set.
I'd like to challenge you on that edit of yours though. On an SSD the time it takes to load a save file into memory is negligible, and in almost all cases less than the game assets the game loads up when you start a game. The complexity of the game world is the dominant factor.
Fair points on both; we're on the same page for the first point. GOG should be doing exactly that to mitigate the issue, and I hope they have been, even if they haven't been successful with it. I'll give them the benefit of the doubt.
For the second, I agree that the majority of the issue is the storage space itself; the others are tangential concerns. To me, a company that struggles to limit their file size has a poor take on how they implement features; it's a red flag that there are likely much bigger performance issues in the code. One doesn't mean the other has to exist, of course, but they show up together fairly often.
I'm personally tired of game companies just throwing shit at the wall and not caring about the performance. They (well AA and bigger companies mainly) seem to have completely lost interest in doing anything other than the bare minimum. Does it work on the absolute latest hardware? Must be good to ship.
Save files have a ton of variance. They can be as small as a few KB or they can be full save states of an entire open world. Back in the mid-00s, I had save folders that were larger than the install directory, like The Witcher and Prey (2006).
Do I even have cloud saves if I just use the DRM free installers and not GOG Galaxy? 🤔
Nope.
I don't believe so
There's a FOSS launcher called Playnite that combines all your other launchers. It can run scripts before/after you launch a game; I just set up an rclone script to back up that game's save directory.
The funny thing in my case is that the only game I have going over that limit right now is Cyberpunk 2077.