Our game, Gravitas: The Arena, uses a custom high-level networking solution built on top of Lidgren.Network. We’ve had a few lingering networking bugs for a while now that had gotten buried under other priorities. Today I sat down to try to solve all of them and, luckily enough, managed to do so.
Aside from a few minor bugs, there were two remaining major issues that I wanted to address today:
1. Intermittent, extreme packet loss over LAN.
2. Packets over the internet only arriving after a notable delay when using UDP hole punching (but not when using Hamachi).
I’ve been working on a miniature game engine (lovingly called Mingine) as a little side-project lately. It’s very barebones right now – pretty much just a platform layer based on SDL with a sprinkle of WinAPI, a mostly-functional graphics API abstraction layer (with an OpenGL backend), some image loading code, input handling…
Note: The really interesting bit is at the end of the article. So if you don’t stick around for the whole journey, at least skip to the end! That being said, this is all useful stuff so you should, y’know, stick around…
Unity is a pretty solid game engine that at least a few people are using, but it doesn’t play very nicely with Git – my preferred source control mechanism – by default. While I can’t speak for everyone, there are three major issues I’ve always wanted to address:
The default setup all but guarantees that having two different users push changes to the same scene, prefab, ScriptableObject, etc. will render it totally unusable and force you to roll back (quite painfully) to an old version.
Git, by design, doesn’t play very nicely with binary data. Its diffing tool doesn’t support binary files and, as such, Git stores the entirety of each revision of a binary file for every commit. As a lot of game content is stored as binary data – FBX files, textures, etc. – this presents a bit of a problem for storage.
After playing with Unity Collaborate a bit, it became extremely evident that the UI/UX for interaction between traditional source control and the Unity editor could be drastically improved.
Lately, I’ve been working on a fast-paced VR multiplayer shooter game in Unity with some classmates. Since we’ve had negative experiences with UNET in the past and knew we would need to optimize the netcode pretty carefully if the project were to play well, we decided to build a custom networking layer on top of the fantastic Lidgren UDP library. Most of my time has gone into building the networking layer from the ground up (which has been a total blast). Continue reading “Optimizing Networked Strings as a Prebuild Step”
I almost called this “Why you should always remember that games are illusions”, but that sounded way too pretentious and, honestly, kind of sad.
This weekend I participated in Global Game Jam 2017, where the theme was “Waves”. Our game, Bermuda Dieangle, involved controlling the ocean surface to make boats crash into each other.

Initially, we planned on running a complex wave simulation to make the boat and water physics extremely realistic, and we even had code in place to approximate it (one iteration on the CPU running on a background thread, which ended up being too slow, and another using the GPU to propagate the wave forces, which we never got working quite right). But in the end, given our relatively limited experience in physical simulation and extremely limited time, we decided to fake it.

Without going into too much detail, each wave was essentially represented by a sphere that grew until it had no more energy. Each boat checked whether it was within the sphere of influence of any of these waves and, if so, applied a force directed away from the wave’s origin. Since there were so few waves per frame, this was really cheap to compute and easy to write.

For the visualization, we took this same data and generated a heightmap of the ocean surface by rendering each wave to a render texture as the sine of (the distance from the wave’s center / the wave’s current radius) + time + a random value unique to each wave. (Specifically, we had an orthographic camera pointing down towards the fluid surface and drew a radial texture for each wave, with the extra math done in the fragment shader.) This was also extremely cheap and ended up looking really, really good, especially for the low-poly aesthetic we were trying to achieve.

Not only that, we were able to implement it in a couple of hours, compared to the 10+ hours we spent trying to figure out how to run a more physically accurate fluid simulation, which never actually worked. Furthermore, it should theoretically run on mobile or the web, which would have been totally impractical with either of the more complex solutions.
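The fake-wave model above fits in a few lines of code. Here’s a rough Python sketch of the idea – not our actual Unity/C# code, and the growth speed, decay rate, and force scaling are all made-up illustrative values:

```python
import math

class Wave:
    """An expanding sphere of influence whose energy decays over time."""
    def __init__(self, origin, energy, growth_speed=5.0, decay=0.5):
        self.origin = origin            # (x, z) position on the ocean plane
        self.energy = energy
        self.radius = 0.0
        self.growth_speed = growth_speed
        self.decay = decay
        self.phase = 0.0                # random per-wave offset in the real game

    def update(self, dt):
        """Grow the sphere and drain energy; returns False once the wave dies."""
        self.radius += self.growth_speed * dt
        self.energy -= self.decay * dt
        return self.energy > 0.0

def wave_force_on(wave, boat_pos):
    """Force pushing a boat away from the wave's origin, or None if the
    boat is outside the wave's sphere of influence."""
    dx = boat_pos[0] - wave.origin[0]
    dz = boat_pos[1] - wave.origin[1]
    dist = math.hypot(dx, dz)
    if dist > wave.radius or dist == 0.0:
        return None
    scale = wave.energy / dist          # push outward, scaled by remaining energy
    return (dx * scale, dz * scale)

def surface_height(wave, point, time):
    """The visualization formula from the post:
    sin(distance / current radius + time + per-wave phase)."""
    d = math.hypot(point[0] - wave.origin[0], point[1] - wave.origin[1])
    return math.sin(d / wave.radius + time + wave.phase)
```

In the actual game, the force would be applied to each boat’s rigidbody every physics step, and the height function was evaluated per-pixel in a fragment shader rather than on the CPU.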
And since all of the computations are so simple, buoyancy becomes feasible without copying any data from the GPU to the CPU, because you only have to sample the surface at a few points per ship, per wave (even though our buoyancy was broken at the end, it should be pretty simple to get in and fix).
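Because the surface height is a closed-form function, the CPU can evaluate it directly at a handful of hull points instead of reading the heightmap back from the GPU. A hypothetical sketch of that sampling (the hull points and force scaling are invented for illustration, not from our jam code):

```python
import math
from dataclasses import dataclass

@dataclass
class Wave:
    origin: tuple   # (x, z) on the ocean plane
    radius: float
    phase: float

def ocean_height(waves, point, time):
    """Sum each live wave's analytic contribution at one point,
    mirroring the GPU heightmap with no readback."""
    total = 0.0
    for w in waves:
        if w.radius > 0.0:
            d = math.hypot(point[0] - w.origin[0], point[1] - w.origin[1])
            if d <= w.radius:   # only inside the wave's sphere of influence
                total += math.sin(d / w.radius + time + w.phase)
    return total

def buoyancy_force(waves, hull_points, hull_y, time, strength=9.8):
    """Average upward force from sampling the surface at a few hull points:
    the deeper a point sits below the local surface, the harder it pushes up."""
    force = 0.0
    for p in hull_points:
        depth = ocean_height(waves, p, time) - hull_y
        if depth > 0.0:         # this sample point is underwater
            force += depth * strength
    return force / len(hull_points)
```

The key design point is that sampling cost scales with (hull points × waves), both of which stay tiny, rather than with heightmap resolution.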
So in short, don’t bother computing anything that you don’t have to. And for the love of all that is holy, if you’re a bunch of college kids with very little experience in wave dynamics and not enough time to do thorough research, don’t try to write a real-time wave simulation/propagation system in a hackathon where that’s only a small part of the project. Just don’t.