Unity — The Untold Story of 16-Bit Textures
Several months ago, while working on a work-for-hire mobile game, I ran into a build size problem. The game was, at that point, too big to be downloaded without a Wi-Fi connection. And it wasn’t even finished, so the build size would only keep growing. I had to find a solution.
Optimization is often overlooked during development, but it has to be done at some point. After inspecting the app size through the report Unity writes to the editor log after each build, I found that textures were a big part of the problem. In my case, Asset Bundles were one solution, but the app size could be drastically reduced by other means, without the hassle of setting up the whole new pipeline that bundles would have required. So, let’s find out how to handle those textures efficiently.
Texture Compression to the Rescue
OK. Textures take up a lot of space. That’s nothing new. Let’s compress them! Or not.
In my case, texture compression was a solution for SOME of the textures, but not all of them. Let’s take a look at the textures suitable for compression.
Here is a background texture. Its original size is 2048x1152 pixels: a 2048-wide texture with a 16:9 ratio and no alpha channel. In the figure below, you can see how it looks uncompressed, then compressed with its size adjusted to the larger power of two, and to the smaller power of two (texture compression generally requires specific texture dimensions; in the case of ETC, dimensions that are powers of two).
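To make the trade-off concrete, here is a small sketch (plain Python, not Unity code) that computes the two power-of-two candidates for this 2048x1152 background and the uncompressed RGBA32 storage each choice would take:

```python
# Sketch: power-of-two candidates for a non-power-of-two texture,
# and the uncompressed RGBA32 (4 bytes per pixel) cost of each.

def pow2_below(n: int) -> int:
    """Largest power of two <= n."""
    return 1 << (n.bit_length() - 1)

def pow2_above(n: int) -> int:
    """Smallest power of two >= n."""
    return n if n & (n - 1) == 0 else 1 << n.bit_length()

w, h = 2048, 1152  # the background texture from the article
for label, f in (("smaller", pow2_below), ("larger", pow2_above)):
    nw, nh = f(w), f(h)
    mib = nw * nh * 4 / 2**20  # 4 bytes per pixel at RGBA32
    print(f"{label}: {nw}x{nh} -> {mib:.0f} MiB uncompressed")
```

Rounding 1152 down gives 2048x1024 (8 MiB uncompressed, before ETC compression shrinks it further), while rounding up gives 2048x2048 (16 MiB), which is why the choice of power of two matters as much as the compression itself.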
If you’re reading this story on a desktop PC, the compression artifacts are very likely noticeable. But on mobile devices, with their higher pixel density screens, the image remains good enough, especially with moving objects in the foreground.
The Good, the Bad and the Ugly
Background images are OK, but they represent only 5% of the game’s textures. What about UI textures? Well… Not so simple.
In the above figure, we can clearly see compression artifacts on the edges of this button image. That is unacceptable. ASTC with 4x4 blocks provides much better results, but it requires an OpenGL ES 3.2 GPU (even if some OpenGL ES 3.0 GPUs support it). In my case, it would have left behind many older phones and tablets*.
*When the target device does not support the format, Unity decompresses the textures at load time to be able to use them, but they then consume a lot more memory on the device, which is not acceptable either. On iOS, ASTC is supported on devices with an A8 processor or later.
Back to the Basics
Looks like we are in trouble. Storage size is a priority. Memory consumption is also a priority. And we cannot sacrifice quality for either. What could possibly go wrong?
What if there were a way to maintain quality while decreasing storage size? 32 bits per pixel is too much; maybe 16 bits per pixel will suffice.
Let’s try it!
In the figure below, you have the same texture set as RGBA8888 (RGBA 32 bits in Unity) on the left and as RGBA4444 (RGBA 16 bits in Unity) on the right.
As you can clearly see, the quality of the right texture is not acceptable either. But the size is quite good: obviously, halving the bit count per pixel halved the storage size accordingly. Now, is there a way to keep quality on par (or almost)? No, but YES!
No, because Unity has poor support for 16-bit textures: its conversion algorithm performs a simple bit-count reduction to the nearest color.
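To illustrate what such a plain nearest-color reduction looks like, here is a minimal sketch. This is an assumption about the general technique, not Unity’s actual implementation: each 8-bit channel is rounded to the nearest of the 16 four-bit levels, and nothing compensates for the error.

```python
# Sketch: naive nearest-color RGBA8888 -> RGBA4444 conversion.
# NOT Unity's exact code; an illustration of plain bit reduction.
# Each pixel drops from 4 bytes to 2, halving storage.

def to_rgba4444(r: int, g: int, b: int, a: int) -> tuple:
    """Quantize each 8-bit channel to 4 bits by rounding to the
    nearest of 16 levels, then expand back to 8 bits for display."""
    def q(c):
        level = round(c / 255 * 15)  # nearest 4-bit level, 0..15
        return level * 255 // 15     # expand 4 bits back to 8 bits
    return tuple(q(c) for c in (r, g, b, a))

print(to_rgba4444(200, 100, 50, 255))  # → (204, 102, 51, 255)
```

Every channel snaps to one of only 16 values, which is exactly why smooth areas band so visibly in the RGBA4444 version above.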
But yes, a big YES, thanks to Texture Packer. Other tools with this capability may exist, but I’m biased: Texture Packer is packed with so much useful stuff. And it has a command-line interface for easy automation!
Texture Packer lets you choose among several algorithms to convert your 32-bit textures to formats such as RGBA4444 (16-bit with alpha), RGBA5551 (16-bit with punch-through alpha) or RGB565 (16-bit without alpha), among others, with or without dithering. In this case, I used the FloydSteinbergAlpha algorithm, which produces dithering. Of course, the above picture exhibits artifacts related to this dithering. But in the reduced versions shown below (or on high pixel density screens), you can clearly see the benefits and barely perceive the differences, while the “nearest color” version still looks bad.
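To show the idea behind dithered quantization, here is a minimal Floyd-Steinberg sketch on a single 8-bit channel reduced to 4 bits. This is a generic illustration of the technique, not Texture Packer’s actual code (its FloydSteinbergAlpha mode also diffuses error through the alpha channel): instead of silently dropping the rounding error at each pixel, the error is pushed onto the neighbours, so the average intensity of a region is preserved.

```python
# Sketch: Floyd-Steinberg error diffusion, 8-bit channel -> 4 bits.
# Generic illustration of the technique, not Texture Packer's code.

def dither_fs_4bit(pixels, width, height):
    """Quantize an 8-bit channel to 4-bit levels, diffusing the
    quantization error with the classic 7/16, 3/16, 5/16, 1/16
    Floyd-Steinberg weights."""
    buf = [float(p) for p in pixels]  # working copy, accumulates error
    out = []
    for y in range(height):
        for x in range(width):
            i = y * width + x
            old = buf[i]
            level = max(0, min(15, round(old / 255 * 15)))
            new = level * 255 / 15    # value the 4-bit level stands for
            out.append(level)
            err = old - new
            # spread the error to not-yet-visited neighbours
            if x + 1 < width:
                buf[i + 1] += err * 7 / 16
            if y + 1 < height:
                if x > 0:
                    buf[i + width - 1] += err * 3 / 16
                buf[i + width] += err * 5 / 16
                if x + 1 < width:
                    buf[i + width + 1] += err * 1 / 16
    return out

# A flat gray that falls between two 4-bit levels: dithering mixes
# the two nearest levels so the region's average stays close to 120.
print(dither_fs_4bit([120] * 16, 4, 4))
```

The alternating levels are exactly the grain you see in the dithered picture above: up close it looks noisy, but at high pixel density the eye averages it back to the original tone.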
Other Use Case — Pixel Art
It is well known that texture compression does not play nicely with pixel art. The very nature of texture compression goes against the aesthetics of pixel art.
Indeed, these compression algorithms tend to average blocks of pixels to reduce the amount of data needed for storage. In the figure below, you can see the type of artifacts to expect when compressing pixel art textures (or even worse artifacts with RGBA PVRTC 4 bits).
In the case of pixel art, 16-bit textures are a great deal. You might, however, notice small color variations between the RGBA8888 and RGBA4444 textures. That’s because the original color palette of Kara, the displayed character, is not properly aligned with 16-bit colors. But if you plan your production accordingly, your textures will be exactly the same while halving your storage size. Pretty neat, isn’t it?
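Planning a palette this way can be checked mechanically. A 4-bit level expands back to 8 bits as level × 17, so a channel value survives the 8 → 4 → 8 bit round trip exactly when it is a multiple of 17. A small sketch (the palettes here are made-up examples, not Kara’s actual colors):

```python
# Sketch: check whether a palette survives RGBA4444 quantization
# unchanged. The 16 representable channel values are 0, 17, 34,
# ..., 255 (multiples of 17), since a 4-bit level expands back to
# 8 bits as level * 17.

def survives_4444(color):
    """True if every channel maps to itself through 8 -> 4 -> 8 bits."""
    return all(c % 17 == 0 for c in color)

safe_palette = [(0, 0, 0, 255), (34, 170, 221, 255)]  # multiples of 17
unsafe_palette = [(30, 160, 220, 255)]                # arbitrary values

print(all(survives_4444(c) for c in safe_palette))    # → True
print(any(survives_4444(c) for c in unsafe_palette))  # → False
```

Running a check like this over your source palette before production tells you up front whether the RGBA4444 build will be pixel-identical to the 32-bit one.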
There are good reasons not to use 16-bit textures, notably with gradients and in contexts where color accuracy is critical.
However, there are plenty of situations where 16-bit textures will save you. As shown above, most pixel art games would gain a lot by keeping this in mind, as would many mobile games.
Less is more, especially if more is only more.