Ventures of an ex indie game developer

Network and portability

Today I conducted a networking test with our newly purchased iPad 2 (which btw can be bought at half price due to tax legislation if you're running your own business in Sweden). It's far from perfect yet, but it will be good enough eventually.


This was my setup: my Intel Atom Ubuntu box ran the server, and I had one Windows desktop PC, one laptop, one Mac mini, one iPad 2, and one iPhone 3GS as clients. I was surprised by the performance of the iPad 2; it felt smoother than my main PC! The iPhone 3GS rendering performance is really crappy, so I'll have to optimize some. But before I do that, I'll add some vehicles, levels, envmapping and beauty. (I really liked the gfx in CSR Racing, and apparently I was not alone.)



I had hard-coded the server address into them little buggers, and everything just worked - with a few glitches. For instance, running two split-screen clients on the same machine, but without a local server, didn't work too well. The network positioning packets for the "opposing vehicle" shouldn't be handled by a split-screen client (in the same way you would - mostly - disregard any incoming positioning packets for your own vehicle/avatar, after sanity checking of course).
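
Roughly, that check looks like the sketch below. It's a minimal illustration with made-up names (PositionPacket, SplitScreenClient), not the actual engine code: positioning updates for objects a client already controls locally are dropped, and everything else is sanity checked before it is applied.

    #include <cmath>
    #include <set>

    struct PositionPacket {
        int object_id;
        float x, y, z;
    };

    class SplitScreenClient {
    public:
        void SetLocallyControlled(int object_id) { local_objects_.insert(object_id); }

        void OnPositionPacket(const PositionPacket& p) {
            if (local_objects_.count(p.object_id)) {
                return;  // our own avatar, or the other local client's vehicle
            }
            if (!IsSane(p)) {
                return;  // garbage or non-finite position; don't trust the network
            }
            ApplyRemotePosition(p);
        }

    private:
        static bool IsSane(const PositionPacket& p) {
            return std::isfinite(p.x) && std::isfinite(p.y) && std::isfinite(p.z);
        }
        void ApplyRemotePosition(const PositionPacket& /*p*/) {
            // ...nudge the rendered object toward the server's position...
        }

        std::set<int> local_objects_;
    };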

There's always a stack of new things popping up when you solve the old ones in this type of portable environment. This game is server-client only (even when you're playing on your own), which means I need to handle the suspend/resume states that happen on touch devices (but never on computers). Hadn't seen that one coming. And there's plenty more, in all areas of porting, where that came from. Here are some porting problems I've run into lately:
  • iOS devices have an Airplane mode; computers don't. In that mode, the iOS device has no IP address, not even a bogus one.
  • OpenGL quads are not supported in GLES 1, so I had to use triangle strips for the particle system (sketched after this list).
  • Apple doesn't think it's valuable or useful for a developer to know whether you're running on an iPad or an iPad mini. They have the same resolution, but one is mighty smaller. Using the standard UI library, components lay out fine, but rolling your own GL they won't. It took me some hours to find out how to tell the two apart and implement it.
  • Windows has a case-insensitive file system; Mac & iOS are case-insensitive in the file system, but case-sensitive in (some) apps; Linux is case-sensitive, period. This was more annoying than time-consuming.
  • Input is very different on these three systems; forcing them all through the same funnel took some schtick-poking.
  • Socket implementations differ between my four target systems in minute details (a portability-layer sketch follows after this list).
  • Using localhost and using a remote host isn't the same thing, especially when running the server in-process. Fixing these things takes time.
  • Some years ago I added a type of "steering unit" (i.e. a motor) that operates relative to the camera's orientation. Good luck running that code on the server. Even if you could, you wouldn't want a camera on the server. So I had to rewrite it, and its uses in my games: KillCutie, Push and the upcoming "HoverTank", or something like that.
  • Network jitter must be considered in the most unexpected places. My bomb planes drop their payload at a steady pace on the server, but if the bombs show up at a different pace on the client, due to network jitter, they collide with each other and blow up next to the bomber! My solution was to use a longer delay... :-o
  • If you load an object, say a level, and know it's using a different coordinate system, you transform it before use. If the in-process server and the client each do that separately, fine. But when you move the loading into the server (so the level can be dynamically replaced) and then send it to the client, the client had better not transform it again before use. Yep, happened to me, and similar things with objects and vehicles. IMHO you really need to be stringent with transformations. That's especially true for me, as I use quaternions rather than matrices for rotations, and composing rotations is not commutative.
  • I try to use physics simulation as much as possible, so instead of taking position/velocity shortcuts I set force+torque. I somehow imagine this makes the game more "real". Most people would regard this as a bad idea, and it's costing me a lot of time to get halfway to decent. It also causes porting problems between in-process server/client and remote server/client.
  • A touch screen's physical size is important to know if you're going to place user controls on it. What you actually need to know is the DPI, as you probably want similarly sized controls on all touch platforms (see the DPI sketch after this list).
  • Prototyping OpenGL is harder on touch devices as GLES doesn't support immediate mode.
  • If you're running a distributed simulation, you want the physics simulation to tick at the same pace everywhere, or the peers quickly start to diverge. That also helps you reduce network traffic. A friend who worked at DICE long ago informed me that on consoles you either manage 60 FPS, or 59 means stepping down to 30, because console manufacturers don't want jittery game experiences for their end-users. That's the graphical part. The simulation side works the same way: either the server and all clients manage to run at full throttle, or they all divide the FPS by two and run at that rate (see the fixed-timestep sketch after this list). This was a pain to implement for the simulation side, and I still just run the gfx at the same FPS as the simulation. I figure that by the time I reach that level of quality in the intestines of my code, I'd better have delivered a few titles that paid for it.
  • My resource system is the one major part that has felt shaky, and I almost gave up on it a year or two ago when it was causing me grief. I originally wrote it to support asynchronous loading, so the entire game wouldn't stall when you walk up to a big building. I also chose old-school callbacks, which can easily cause threading issues. I haven't had any bugs there in some time now, but it feels like I've spent forever fixing bugs there in the past, so perhaps I should have picked something like Stackless Python from the start. Not sure; most systems have their drawbacks. I did roll some Python into the engine before, but found no good use for it. Eve Online runs Stackless, and they think it's good, but who knows.
  • When you're running a game, you're running a physics engine and a rendering engine. But if you're running an in-process server and two split-screen clients, that for me means three instances of the "scripting engine", three consoles, three network interfaces, and so forth. So for instance, your three scripting instances can't all tell the physics simulation engine "go push object xyz", as the push would be three times as hard. The realize-find-fix iteration in this type of scenario is neither quick nor clean.
  • On a touch device, both split-screen clients use touch controls, so lay the controls out accordingly. It's easy to overlook this in the first iterations of a game, and it takes some time to port if you hack your way there. This time around it was much easier - not mainly because I'd done it before, but because I didn't hack it together this time. I used proper transformations instead of "if 'left side AND on touch device' move the left touch area N pixels north and assume input in the U direction means move forward, or 'if right side AND on computer'"...
  • Touch controls are a bit awkward at first. A button in traditional UIs is something you press-hold-release; you don't use it to repeat an action such as "shoot", but on a touch device you do. A normal joystick doesn't have a conical space to move in, even though the designer wants you to think that; the space is pyramidal. But rendering and using a square control area on a touch device both looks and feels totally wrong, so a circular control area will somehow have to map onto a square one, and I'm not yet sure exactly how best to do that in a normal game (one candidate mapping is sketched after this list). In my current game it's fine though, as the hovercraft can be moved in all directions.
  • Sound quality, amplitude and pitch all feel different on the iOS devices, which needs to be compensated for. I will mostly do it programmatically, but I have only touched upon it so far.
  • A touch device needs fewer degrees of freedom. One of the most popular iOS games ever, Tiny Wings, has one single, boolean control. That's it! My guess is that's why all the 3D physics/gfx games are doing so poorly on the platform.
  • A touch device can be turned upside down. And the player doesn't expect that to go unnoticed.
  • I use quite a few 3rd party libs. Some of these, or their upgrades, won't be portable out-of-the-box to, say, iOS without some manual fixes.
  • The physics simulation runs in a separate thread from the graphics - if there is more than one CPU in the system, that is (otherwise they run sequentially). I like this kind of thing, so of course I had to write the code to check the CPU count myself, on Windows, Linux, Mac and iOS (a sketch follows after this list). That's a few hours of my life I'm not getting back. Oh well, I'm going to live forever so I don't mind.
  • The combinatorial explosion of most normal things - levels, vehicles, weapons, controls, rendering, sound, music, physics simulation, network, server/client, resources, menus, AI, PvP, path finding, balance, in-app purchases, and so forth - is more than plenty to deal with, as they intertwine in the most elaborate ways. Adding a game engine spanning four different operating systems and four different "libc"s on top of that is one heck of a job to overcome.
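
A few of the sketches referenced in the list above follow here. First, the GLES 1 particle quads: since there is no GL_QUADS in GLES, each particle becomes four vertices in a triangle strip, and consecutive quads are stitched together with degenerate triangles. This is an illustrative sketch with my own names, simplified to positions only; not the engine's actual particle code.

    // Emit camera-facing particle quads as one triangle strip, since GLES 1
    // has no GL_QUADS. Consecutive quads are joined with degenerate triangles
    // (repeated vertices that produce zero-area triangles).
    #include <vector>

    struct Vec3 { float x, y, z; };

    // 'right' and 'up' are the camera's right/up vectors scaled to half the
    // particle size, so c +/- right +/- up spans the quad's corners.
    void AppendParticleQuad(std::vector<Vec3>& strip, const Vec3& c,
                            const Vec3& right, const Vec3& up) {
        const Vec3 bl{c.x - right.x - up.x, c.y - right.y - up.y, c.z - right.z - up.z};
        const Vec3 br{c.x + right.x - up.x, c.y + right.y - up.y, c.z + right.z - up.z};
        const Vec3 tl{c.x - right.x + up.x, c.y - right.y + up.y, c.z - right.z + up.z};
        const Vec3 tr{c.x + right.x + up.x, c.y + right.y + up.y, c.z + right.z + up.z};
        if (!strip.empty()) {
            strip.push_back(strip.back());  // repeat previous quad's last vertex...
            strip.push_back(bl);            // ...and this quad's first: degenerate stitch
        }
        // Strip order for one quad: bottom-left, bottom-right, top-left, top-right.
        strip.push_back(bl);
        strip.push_back(br);
        strip.push_back(tl);
        strip.push_back(tr);
    }

    // The whole strip is then drawn in one call, roughly:
    //   glEnableClientState(GL_VERTEX_ARRAY);
    //   glVertexPointer(3, GL_FLOAT, 0, strip.data());
    //   glDrawArrays(GL_TRIANGLE_STRIP, 0, (GLsizei)strip.size());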
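
Next, the socket portability layer. The most visible difference is that Winsock needs WSAStartup()/WSACleanup() and closes sockets with closesocket() rather than close(); error reporting and some option types differ in similarly small ways. A minimal sketch of hiding that behind a few helpers - illustrative, not my real wrapper:

    // A thin portability layer over the socket differences.
    #ifdef _WIN32
    #include <winsock2.h>
    typedef SOCKET SocketHandle;
    inline bool SocketStartup() {
        WSADATA data;
        return WSAStartup(MAKEWORD(2, 2), &data) == 0;  // Winsock wants explicit init
    }
    inline void SocketCleanup() { WSACleanup(); }
    inline void CloseSocket(SocketHandle socket) { closesocket(socket); }
    #else  // Linux, Mac, iOS
    #include <sys/socket.h>
    #include <unistd.h>
    typedef int SocketHandle;
    inline bool SocketStartup() { return true; }  // nothing to initialize on POSIX
    inline void SocketCleanup() {}
    inline void CloseSocket(SocketHandle socket) { close(socket); }
    #endif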
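
Then the DPI point: if a thumb control should have roughly the same physical size on every touch device, you compute its pixel size from the screen's DPI. The 15 mm figure below is just an example:

    // Convert a physical control size in millimeters to pixels using the
    // device's reported DPI (dots per inch); 1 inch = 25.4 mm.
    inline float MillimetersToPixels(float millimeters, float dpi) {
        return millimeters * dpi / 25.4f;
    }

    // Example: a ~15 mm wide thumb pad is ~96 px on an iPhone 3GS (~163 DPI)
    // and ~78 px on an iPad 2 (~132 DPI).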
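
The fixed-timestep idea from the FPS bullet looks roughly like this: the physics always steps with the same fixed step length on the server and all clients, and if any peer can't keep up, everyone halves the tick rate. How the peers agree to halve is omitted, and the names are invented for the sketch:

    // Fixed-timestep physics with an accumulator. Every peer uses the same
    // step length; if any peer can't keep up, all of them halve the tick rate
    // (e.g. 60 Hz -> 30 Hz) so the simulations stay in lockstep.
    #include <functional>

    class FixedStepper {
    public:
        explicit FixedStepper(float full_hz) : step_length_(1.0f / full_hz) {}

        void HalveTickRate() { step_length_ *= 2.0f; }  // 60 Hz -> 30 Hz, and so on

        // Call once per rendered frame with the elapsed wall-clock seconds.
        void Update(float frame_seconds, const std::function<void(float)>& step_physics) {
            accumulator_ += frame_seconds;
            while (accumulator_ >= step_length_) {
                step_physics(step_length_);  // identical step on server and clients
                accumulator_ -= step_length_;
            }
        }

    private:
        float step_length_;
        float accumulator_ = 0.0f;
    };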
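
For the round-joystick-to-square-input problem, one candidate mapping is to stretch each point radially so the circle's edge lands on the square's edge in the same direction. It distorts the diagonals a little, and it's only one of several possible mappings, not necessarily the one I'll end up with:

    // Map a point in the unit disc (the round on-screen joystick) to the unit
    // square [-1,1]x[-1,1], preserving direction. A point at fraction r of the
    // way to the circle's edge lands at fraction r of the way to the square's
    // edge along the same ray.
    #include <algorithm>
    #include <cmath>

    struct Stick { float x, y; };  // input assumed to satisfy x*x + y*y <= 1

    inline Stick DiscToSquare(Stick p) {
        const float m = std::max(std::fabs(p.x), std::fabs(p.y));
        if (m <= 1e-6f) {
            return Stick{0.0f, 0.0f};  // dead center; avoid dividing by zero
        }
        const float len = std::sqrt(p.x * p.x + p.y * p.y);
        const float scale = len / m;  // 1 along the axes, sqrt(2) on the diagonals
        return Stick{p.x * scale, p.y * scale};
    }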
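
Finally, the CPU-count check on Windows, Linux, Mac and iOS. Simplified, this is roughly what those hours boiled down to:

    // Query the number of logical CPUs, so the physics thread is only spawned
    // when there is more than one core to run it on.
    #ifdef _WIN32
    #include <windows.h>
    inline int CpuCount() {
        SYSTEM_INFO info;
        GetSystemInfo(&info);
        return (int)info.dwNumberOfProcessors;
    }
    #elif defined(__APPLE__)  // Mac OS X and iOS
    #include <sys/sysctl.h>
    inline int CpuCount() {
        int count = 1;
        size_t size = sizeof(count);
        sysctlbyname("hw.ncpu", &count, &size, nullptr, 0);
        return count;
    }
    #else  // Linux and other POSIX systems
    #include <unistd.h>
    inline int CpuCount() {
        long count = sysconf(_SC_NPROCESSORS_ONLN);
        return count > 0 ? (int)count : 1;
    }
    #endif

(C++11's std::thread::hardware_concurrency() reports the same thing nowadays, for what it's worth.)
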
But now, I'm really close to getting somewhere! YEAH, THUS FAR I DID GOOD!
