• @[email protected]
      link
      fedilink
      English
      155 months ago

      Most games never hit anywhere near that, but some large open-world RPGs like Skyrim track the location of every single object in the game world. You can drop a piece of cheese in the bottom-left corner of the map, come back 500 hours later, and it’ll still be there. Now imagine all of the objects you’re buying, selling, and manipulating over those hundreds of hours. Now add in a shit ton of script mods and other stuff that may add even more objects, plus all of the quest data and interaction data that gets saved, etc., and your save data can easily hit multiple gigabytes in total, with individual files approaching 200 MB.

      • @[email protected]
        link
        fedilink
        English
        75 months ago

        It still feels like it should be orders of magnitude less. For example, each piece of cheese might have an ID number that maps to ‘cheese’, an ID for what area it’s in, three coordinates for where exactly it is, and maybe a few more variables like how much of it you’ve eaten. Each of those variables is probably only a couple of bytes, so each item is probably only 20 B or so, which means that even if you interacted with a million different items and there was no compression going on, that’s still only 20 MB of save data.
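A back-of-the-envelope sketch of that packed layout (the field sizes here are my assumption, not Skyrim’s actual save format):

```python
import struct

# Hypothetical packed record for one tracked object: item ID (u32),
# area ID (u16), x/y/z as 32-bit floats, and one state byte.
# "<" = little-endian with no padding bytes.
ITEM_FORMAT = "<IHfffB"

record = struct.pack(ITEM_FORMAT, 1337, 42, 103.5, -88.25, 12.0, 3)
per_item = struct.calcsize(ITEM_FORMAT)
print(per_item)                    # 19 bytes per item
print(per_item * 1_000_000 / 1e6)  # ~19 MB for a million items
```

So the 20 B estimate holds up for a tightly packed record.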

        • @[email protected]
          link
          fedilink
          English
          145 months ago

          Bold of you to assume the data in save files is packed binary and not something like JSON where { “x”: 13872, “y”: -17312, “z”: -20170 } requires 40 bytes of storage.

          • @[email protected]
            link
            fedilink
            English
            95 months ago

            Agreed. JSON:

            • solves the ‘versioning’ problem, where the data fields change after an update; that’s a nightmare on packed binary, where you need to write so much code to handle it
            • makes debugging persistence issues easy for developers
            • has very fast libraries for reading and writing it
            • actually compresses pretty damn well; you can pass the compress + write to a background thread once you’ve done the fast serialisation, anyway

            For saving games, JSON+gzip is such a good combination that I’d probably never consider anything else.
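A rough illustration of how well repetitive JSON compresses (the record layout is made up; real ratios depend on the data):

```python
import gzip
import json

# Fake save data: lots of structurally similar item records.
save = {"items": [{"id": i % 500, "x": i * 0.5, "y": -i * 0.25, "z": 0.0}
                  for i in range(10_000)]}

raw = json.dumps(save).encode("utf-8")
packed = gzip.compress(raw)
print(len(raw), len(packed))  # repeated keys and similar values compress heavily
```

The repeated field names that make raw JSON bulky are exactly what makes it compress so well.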

            • @cactusupyourbutt · 3 · 5 months ago

              protobuf does all of these (well, except compression, which you don’t need)

          • @[email protected]
            link
            fedilink
            English
            25 months ago

            That’s excusable in My First Game™, but surely a professional AAAAA game would never cut corners and code something so lazily, eh?

            • @[email protected]
              link
              fedilink
              English
              55 months ago

              It’s not really laziness. Storing as JSON solves or prevents a lot of problems you could run into with something bespoke and “optimally packed”, you just have the tradeoff of needing more storage for it. Even then, the increased storage can be largely mitigated with compression. JSON compresses very well.

              The problem is usually what they’re storing, not how they’re storing it. For example, The Witcher (first one) has ~20MB save files. These are mostly a bespoke packed binary format, but contain things like raw strings of descriptions in multiple localisations for items being carried, and complete descriptors of game quests. Things that should just be ID values that point to that data in the game files. It also leads with like… 13KB of zero-padding for some reason.
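The fix being described, sketched in Python (names and layout are illustrative, not The Witcher’s actual format): the save stores only an ID, and the localised strings stay in the game’s own data files.

```python
# Loaded once from the game's data files; never written into a save.
ITEM_DB = {
    101: {"en": "Rusty sword", "de": "Rostiges Schwert"},
    102: {"en": "Wheel of cheese", "de": "Käserad"},
}

def serialize_item(item_id: int) -> dict:
    return {"id": item_id}  # the save holds a few bytes, not every string

def describe(saved: dict, locale: str) -> str:
    return ITEM_DB[saved["id"]][locale]  # look the text up at load time

print(describe(serialize_item(102), "en"))  # Wheel of cheese
```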

        • @tehevilone · 9 · 5 months ago

          Save bloat is more often related to excess values not being properly discarded by the engine, if I remember right. So it’s not that the objects themselves take up a lot of space, but the leftover data gets baked into the save and can end up multiplying if the same scripts/references/functions get called frequently.

          It was a lot worse with Skyrim’s original engine, and got better in Fallout 4 and Skyrim SE. The worst bloat happens with heavy modlists, of course, as they’re most likely to have poor data management in some mod.

            • @[email protected]
              link
              fedilink
              English
              45 months ago

              I wouldn’t say bad, but inefficient might be fair. Unoptimized I think is more representative.

            • @tehevilone · 3 · 5 months ago

              Inefficient/unoptimized would be an accurate description. I think it’s important to add, for bethsoft games specifically, that the save includes all changes to objects, even if the player themselves didn’t interact with them (e.g. physics interactions, explosions moving things, NPCs bumping stuff around), and also includes all NPC changes. Master files (ESMs) get loaded, then the save loads the changes it has baked into the databases. So, when you load up a save that has traveled the world and loaded a lot of things into save memory, the engine has to sit there and reconcile all the changes with the ESMs, which can add up quick if you’re playing modded.
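A toy model of that reconciliation step (purely illustrative; real ESM records are far more complex): the master files provide defaults, the save provides deltas, and loading merges the two per object.

```python
# Defaults as defined by the master files (ESMs).
masters = {
    "cheese_01": {"x": 0.0, "y": 0.0},
    "barrel_07": {"x": 5.0, "y": 2.0},
}

# What the save actually stores: only the objects that changed.
save_deltas = {"cheese_01": {"x": 120.5}}

# Loading = master defaults overridden by saved changes, per object.
world = {oid: {**base, **save_deltas.get(oid, {})}
         for oid, base in masters.items()}

print(world["cheese_01"])  # {'x': 120.5, 'y': 0.0}
print(world["barrel_07"])  # unchanged, straight from the masters
```

The more objects a playthrough has touched, the bigger `save_deltas` gets and the more merging the engine does on load.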

          • @[email protected]
            link
            fedilink
            English
            15 months ago

            Yeah, that’s why I rounded up a bit. But even if there’s triple the amount of cheese data, a million cheeses is still only 60 MB.

        • They would remain 0 until flipped. They did say heavily modded, so if you added more quests to the game, these all need some way of knowing which legs have been completed. The more modded quests completed, the bigger the save. And that’s just one thing that a save would keep track of.