For me, I recently had to revamp something because I was taught to use PlayerPrefs for saving all game data. I had to move everything to JSON to make cloud saves work, or even just to transfer save files between devices.
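Something like this shape, if it helps anyone; sketched in Java with libgdx’s Json class, but any JSON serializer (Unity’s JsonUtility included) works the same way, and the file name and fields here are just examples:

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.utils.Json;

public class SaveSystem {
    // One plain data object holds everything needed to restore a session,
    // instead of scattering values across engine key/value prefs.
    public static class SaveData {
        public int level;
        public float playtimeSeconds;
        public String[] unlockedItems;
    }

    private static final Json json = new Json();

    public static void save(SaveData data) {
        // A single file is trivial to sync to a cloud service
        // or copy to another device.
        Gdx.files.local("save.json").writeString(json.toJson(data), false);
    }

    public static SaveData load() {
        return json.fromJson(SaveData.class, Gdx.files.local("save.json").readString());
    }
}
```

The win is that the whole save is one serializable object, so cloud sync or device transfer is just moving a file around.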
I’d say abstract, but only abstract as needed. It’s easy to get deep in the weeds future-proofing, but spending all your time on interfaces is a surefire way to burn out.
Definitely. You’ll probably be able to tell ahead of time if what you’re building should be reusable in different contexts/projects (e.g. an input system). But for more gameplay-specific code? Just make it work for whatever your game needs it to do right now. Who the hell knows if you’ll ever actually use it again, and if you do, it’ll probably still need tweaking anyways.
Don’t extensively hardcode strings of dialogue into your game! I’m having to think about localisation for the first time and it is really really arduous to replace those thousands of strings with IDs to call from the localisation database.
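If you’re starting fresh, the pattern to aim for is that code only ever touches an ID and the actual text lives in data. A minimal sketch with libgdx’s I18NBundle (the keys and file paths are made up):

```java
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.utils.I18NBundle;

public class DialogueDemo extends ApplicationAdapter {
    private I18NBundle strings;

    @Override
    public void create() {
        // Text lives in properties files, one per language:
        //   i18n/strings.properties    -> intro_greeting=Hello, {0}!
        //   i18n/strings_fr.properties -> intro_greeting=Bonjour, {0} !
        strings = I18NBundle.createBundle(Gdx.files.internal("i18n/strings"));

        // Code only ever references the ID, so adding a language is pure data work.
        Gdx.app.log("dialogue", strings.format("intro_greeting", "Ada"));
    }
}
```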
I feel this too much! I’m fortunate that most of it isn’t hard-coded in, but still…
Test if your build runs before you upload it, even if you think you didn’t make any changes that could break anything -_-
The voice of bitter experience right here.
Haha, not really bitter, but embarrassed
Profilers for diagnosing performance issues.
I had an experience where my basic rendering knowledge (lots of draw calls / polys = bad) made me complacent about solving performance problems. I saw low FPS, so I started simplifying meshes. But that’s not always the cause: there can be runaway code, memory issues, specific render passes, etc.
In my case, I was trying to get a Unity game to run on a PS4 devkit but it kept crashing on a certain level. I wasted a lot of time simplifying the meshes used in that scene before jumping on a call with our tester (who had the devkit and was also inexperienced) and remotely profiling the game to determine the root causes.
This turned out to be a memory overload. The amount of functional RAM/VRAM you have on a PS4 is actually pretty limited compared to a desktop PC. In our case, there were several things ramping it up and over the limit:
- Unity’s static batching creating new combined meshes which added to memory cost
- Like batching, mipmaps also generate new copies of a texture which take up memory
- Excessively high-resolution textures for simple patterns (we hadn’t considered Texel Density at all for that project)
- Erroneous use of high-memory textures where they weren’t necessary (e.g. a visual effect was being driven by a 4K pure-white texture instead of just a vector colour; quick math on that below)
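For a sense of scale: a single uncompressed 4096×4096 RGBA texture is 4096 × 4096 × 4 bytes ≈ 64 MB, and a full mip chain adds roughly another third on top, so one stray 4K texture can quietly eat ~85 MB.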
So now, while my knowledge has significantly improved from experience, I make use of profiling wherever possible to confirm what a problem actually is. As the saying goes: you don’t want to just mindlessly create solutions, you want to identify and solve problems.
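The exact tool depends on your stack (this was Unity’s profiler over a remote connection), but the habit carries over anywhere. For instance, here’s a minimal sketch of the same idea in libgdx, using its GLProfiler to count what a frame actually did (the logging is just illustrative):

```java
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.profiling.GLProfiler;

public class ProfilerDemo extends ApplicationAdapter {
    private GLProfiler profiler;

    @Override
    public void create() {
        profiler = new GLProfiler(Gdx.graphics);
        profiler.enable(); // wraps the GL calls with counting versions
    }

    @Override
    public void render() {
        // ... draw the scene as usual ...

        // Real numbers instead of guesses about what the frame did:
        Gdx.app.log("perf", "draw calls: " + profiler.getDrawCalls()
                + ", texture binds: " + profiler.getTextureBindings()
                + ", shader switches: " + profiler.getShaderSwitches());
        profiler.reset(); // counters are per-frame
    }
}
```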
All libgdx related:
- not working with the asset manager
- not working with an atlas for assets
- not referencing textures, but initializing them every time
As a libgdx user myself:
> not referencing textures, but initializing them every time
I’m fairly certain that if you use the asset manager, you get textures and other assets passed by reference by default. I think this kind of kicked my butt at first with particles, since I then needed to do something with particle effect pools, instead of just loading them as a particle effect in the asset manager to begin with.
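E.g. something like this (asset paths made up, and the blocking finishLoading() is just to keep the sketch short):

```java
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.ParticleEffect;
import com.badlogic.gdx.graphics.g2d.ParticleEffectPool;

public class AssetDemo extends ApplicationAdapter {
    private AssetManager manager;
    private ParticleEffectPool explosions;

    @Override
    public void create() {
        manager = new AssetManager();
        manager.load("sprites/player.png", Texture.class);
        manager.load("effects/explosion.p", ParticleEffect.class);
        manager.finishLoading(); // or pump manager.update() in your loop

        // Repeated get() calls hand back the SAME instance -- no duplicate GPU upload.
        Texture a = manager.get("sprites/player.png", Texture.class);
        Texture b = manager.get("sprites/player.png", Texture.class);
        // a == b is true

        // The loaded effect is shared too, so treat it as a template and
        // obtain()/free() per-instance copies through a pool.
        ParticleEffect template = manager.get("effects/explosion.p", ParticleEffect.class);
        explosions = new ParticleEffectPool(template, 4, 32);
        ParticleEffectPool.PooledEffect fx = explosions.obtain();
        // ... update/draw fx each frame, then when it completes:
        fx.free();
    }
}
```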
> not working with an atlas for assets
While I understand what an atlas is, I’m not sure how large of an atlas I should use or try to get away with. At the moment I usually put every different ‘object’ as its own sprite sheet.
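For anyone else wondering, the atlas route looks roughly like this with the asset manager (file and region names made up). A common rule of thumb is to let TexturePacker target 2048×2048 pages, which practically every GPU handles; it spills onto additional pages automatically when things don’t fit:

```java
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureAtlas;
import com.badlogic.gdx.graphics.g2d.TextureRegion;

public class AtlasDemo extends ApplicationAdapter {
    private AssetManager manager;
    private SpriteBatch batch;
    private TextureRegion player;

    @Override
    public void create() {
        manager = new AssetManager();
        manager.load("game.atlas", TextureAtlas.class); // pack file from TexturePacker
        manager.finishLoading();

        TextureAtlas atlas = manager.get("game.atlas", TextureAtlas.class);
        player = atlas.findRegion("player");
        batch = new SpriteBatch();
    }

    @Override
    public void render() {
        // Regions on the same atlas page draw without texture switches,
        // so the batch doesn't have to flush between different "objects".
        batch.begin();
        batch.draw(player, 100, 100);
        batch.end();
    }
}
```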