Another one in the series of more technically oriented blog posts: this time I thought I’d write about the mask image system I’ve developed for my game. Mask images are handy for various things, especially in point-and-click adventure games.

I’m using a mask image as a floor map to separate the walkable areas from the areas the player character cannot enter, as well as to set the scale of the player sprite at each floor position to mimic depth and perspective. To better integrate the main character into each scene and background, the mask also works as a kind of lightmap, containing the data used to set the brightness of the sprite to match the lighting in each part of the scene. Finally, the mask marks out hotspots in a scene, mapping the active areas to the responses assigned to the different user actions (for objects that change or move, sprites are also used).
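As a rough illustration, here is a minimal sketch of what a single per-position mask lookup can feed into. All names (PlayerSprite, FloorInfo, handleTap) are hypothetical, not the actual code from the game, and the order of the checks is my assumption:

```swift
import CoreGraphics

// Hypothetical sketch: what one mask lookup per scene position can drive.
struct PlayerSprite {
    var position: CGPoint = .zero
    var scale: CGFloat = 1.0       // smaller further "back" in the scene
    var brightness: CGFloat = 1.0  // multiplier applied to the sprite's color
}

struct FloorInfo {
    var walkable: Bool        // floor map: can the character stand here?
    var playerScale: CGFloat  // depth scale at this position
    var brightness: CGFloat   // lightmap value, 0...1
    var hotspotID: Int        // 0 = no hotspot at this position
}

// When the player taps the scene, the same mask lookup decides whether the tap
// hit a hotspot (trigger its response) or the floor (walk there).
func handleTap(at point: CGPoint,
               player: inout PlayerSprite,
               mask: (CGPoint) -> FloorInfo) {
    let info = mask(point)
    if info.hotspotID != 0 {
        // trigger the response assigned to this hotspot for the current action
        return
    }
    guard info.walkable else { return }   // not floor, not hotspot: ignore
    player.position = point               // pathfinding omitted in this sketch
    player.scale = info.playerScale       // mimic depth perspective
    player.brightness = info.brightness   // match the scene's lighting
}
```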

All in all, quite essential things for my adventure game. And the best part is, for all this, I only need one mask image per scene.

In the early phase, I used to have two separate mask images per scene: one for hotspots and one for the floor map. When I started thinking about adding lightmaps, I got really concerned about the memory used by several images – something I had struggled with earlier when creating sprite sheets for player animations. In the case of player animations, I was able to constrain memory use with image compression, compromising a little bit of image quality but none of the animation quality (large sprites with the number of frames needed for smooth animations are really bad news on memory-constrained devices). But this time, compression was out of the question, as any artifacts in the masks would result in all kinds of gameplay glitches and incorrect behavior.

In the end, the solution was quite obvious and simple – for each mask, I’m using a different color channel of the same image: the floor map with player scale values in the blue channel, hotspots in green, and lightmap values in red. I could even add yet another variable to the mix using the alpha channel, but at the moment I don’t have any use for it. There was a bit of confusion before I realized the iPhone uses a different byte order for pixel data than the iPhone simulator does, but hey – that’s part of the game (err… the coding game, not the game I’m developing).
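To make the channel packing concrete, here is a minimal sketch of how such a mask could be read on iOS. The type and field names are hypothetical, and the way it sidesteps the device/simulator byte-order surprise is simply to redraw the mask once into a bitmap context with an explicitly requested pixel format, instead of reading the image’s own backing store:

```swift
import CoreGraphics

// Hypothetical names; one R, G, B, A byte quadruple per mask pixel.
struct MaskSample {
    var lightLevel: UInt8   // red:   lightmap brightness
    var hotspotID: UInt8    // green: hotspot index, 0 = none
    var floorScale: UInt8   // blue:  0 = not walkable, otherwise player scale
}

struct SceneMask {
    private let width: Int
    private let height: Int
    private let pixels: [UInt8]   // 4 bytes per pixel: R, G, B, A

    init?(image: CGImage) {
        let w = image.width
        let h = image.height
        var buffer = [UInt8](repeating: 0, count: w * h * 4)

        // Redraw the mask into a context whose pixel format we choose ourselves,
        // so the channel order is identical on the device and in the simulator.
        // The mask is assumed to be fully opaque, so premultiplication is harmless.
        let drawn = buffer.withUnsafeMutableBytes { raw -> Bool in
            guard let context = CGContext(
                data: raw.baseAddress,
                width: w,
                height: h,
                bitsPerComponent: 8,
                bytesPerRow: w * 4,
                space: CGColorSpaceCreateDeviceRGB(),
                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
                    | CGBitmapInfo.byteOrder32Big.rawValue
            ) else { return false }
            context.draw(image, in: CGRect(x: 0, y: 0, width: w, height: h))
            return true
        }
        guard drawn else { return nil }

        width = w
        height = h
        pixels = buffer
    }

    // Sample the packed mask at a pixel coordinate (top-left origin).
    func sample(x: Int, y: Int) -> MaskSample? {
        guard x >= 0, x < width, y >= 0, y < height else { return nil }
        let offset = (y * width + x) * 4
        return MaskSample(lightLevel: pixels[offset],      // R
                          hotspotID: pixels[offset + 1],   // G
                          floorScale: pixels[offset + 2])  // B
    }
}
```

Pinning the format with a one-time redraw means the sampling code never has to care which byte order the platform’s native backing store happens to use.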

For me, the constant stream of smaller and bigger problems and challenges, and the joy of coming up with solutions, is the thing I enjoy most about programming.