Textures are an integral part of a game. They are also an area where artists have direct control over game performance. In this best practices guide, you will find a number of texture optimizations that will help your games run smoother and look better. The overall goal of the best practices guide series is to make games perform better on mobile platforms.
Contents of this guide include:
- Texture atlas
- Texture filtering
- Texture mipmap
- Texture size
- Texture color space
- Texture compression
- UV unwrap
- UV visual
- UV channel packing
- Other best practices related to textures
Texture Atlas, Filtering and Mipmap
A texture atlas is an image that contains data from multiple smaller images that have been packed together. Instead of having one texture per mesh, we have a larger texture that is shared by multiple meshes. An atlas can be authored before making the asset, which means the asset will be UV unwrapped to follow the texture atlas; this requires early planning when creating the texture. It can also be authored after the asset is finished by merging textures in painting software, but the UV islands will then need to be rearranged to match the atlas.
Why should texture atlasing be used?
This technique enables batching on multiple static objects that share the texture atlas and the same material. Batching naturally reduces the number of draw calls.
- A lower number of draw calls results in better on-device performance if the game is CPU bound.
- Unity has a feature that batches objects marked as static, without having to manually merge them. Further information on this can be found here.
- In Unreal Engine, batching needs to be done manually, either by bringing objects into 3D software and merging them or by using the UE4 Actor Merging tool. This tool can also create the texture atlas automatically. Further information on this can be found here.
Texture atlasing also means fewer textures inside the game/app, as they are packed together, which in turn reduces the overall effort of making the game.
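To make the idea concrete, each mesh that shares the atlas has its original 0-1 UVs remapped into the sub-rectangle its texture occupies. The function below is a minimal Python sketch of that remapping; the offset and scale values are hypothetical, not an engine API:

```python
def remap_uv_to_atlas(u, v, offset_u, offset_v, scale_u, scale_v):
    """Map a mesh's local 0-1 UV coordinates into its sub-rectangle of the atlas."""
    return (offset_u + u * scale_u, offset_v + v * scale_v)

# A mesh whose texture occupies the top-right quarter of the atlas:
atlas_uv = remap_uv_to_atlas(0.5, 0.5, offset_u=0.5, offset_v=0.5,
                             scale_u=0.5, scale_v=0.5)  # (0.75, 0.75)
```

This is why atlasing done after the fact requires rearranging the UV islands: every island must land inside its assigned sub-rectangle.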
Texture filtering is a method used to improve texture quality in a scene. Without texture filtering, textures suffer from artifacts such as aliasing and in most cases will look worse. Popular game engines offer several texture filtering options.
Nearest/Point - When seen up close, the texture will look blocky.
- This is the simplest and cheapest texture filtering.
Bilinear - The texture will be blurrier up close.
- The 4 nearest texels are sampled and then averaged to color the main pixel. This gives the texture a smooth gradient rather than the blocky look of nearest filtering.
Trilinear - Like bilinear, but with added blending between mipmap levels.
- This filtering removes the noticeable change between mipmap levels by adding a smooth transition.
Anisotropic - Makes textures look better when viewed at oblique angles, which is good for ground-level textures.
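To make the bilinear option concrete, here is a minimal Python sketch of the filter, operating on a small grayscale texture. Real GPUs do this in hardware; this is only illustrative:

```python
def bilinear_sample(texture, u, v):
    """Sample a 2D grayscale texture (list of rows) at normalized coords u, v in [0, 1]."""
    h, w = len(texture), len(texture[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Blend the 4 nearest texels: first horizontally, then vertically.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```

Sampling halfway between a black and a white texel returns mid-gray, which is exactly the smooth gradient that hides the blockiness of nearest filtering.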
Texture filtering best practices
- Use bilinear for a balance between performance and visual quality.
- Trilinear costs more memory bandwidth than bilinear and should be used selectively.
- Bilinear + 2x anisotropic will usually look and perform better than trilinear + 1x anisotropic, so this combination can be a better solution than using trilinear.
- Keep the anisotropic level low. Using a level higher than 2 should be done very selectively, for critical game assets.
- This is because a higher anisotropic level costs a lot more bandwidth and will affect device battery life.
Why do artists need to care about texture filtering?
Using texture filtering makes textures look better and less blocky. In most cases this makes the game look better.
Texture filtering comes at some performance cost, which is natural, as better quality means more processing. Finding a good balance between performance and visuals is key to success here. Bilinear and trilinear filtering sample more texels and require more computation.
On a more technical note - Texture filtering may account for up to half of the GPU energy consumption, so choosing simpler texture filters is an excellent way to reduce application energy demands.
Mipmapping means LOD (level of detail) for textures. Mipmaps are copies of the original texture that are saved at lower resolutions. Based on how much texture-space a fragment occupies, an appropriate level will be selected for sampling. When an object is further from the camera the lower resolution texture will be applied and vice versa.
Make sure to use mipmapping!
- Using mipmapping will improve GPU performance, as the GPU won't need to sample full-resolution textures for objects further away from the camera.
- Mipmapping will reduce texture aliasing and improve final image quality. Texture aliasing will cause a flickering effect on areas further from the camera.
- In Unreal Engine 4, make sure texture dimensions are powers of 2 (e.g. 512x1024, 128x128, 2048x2048, etc.) to use mipmaps. The mipmap chain will not be generated in Unreal when the dimensions are not powers of 2. Textures do not need to be square; for example, a 512x1024 texture will have its mipmaps generated.
- Unity automatically creates mipmaps on import and rescales textures that are not power of 2. More information on this can be found here.
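The memory cost of mipmapping is modest: a full mip chain adds roughly a third on top of the base texture. The sketch below sums the chain for an uncompressed texture, assuming 4 bytes per texel (RGBA8):

```python
def mip_chain_memory(width, height, bytes_per_texel=4):
    """Total bytes for a texture plus its full mip chain down to 1x1 (uncompressed)."""
    total, w, h = 0, width, height
    while True:
        total += w * h * bytes_per_texel
        if w == 1 and h == 1:
            break
        # Each mip level halves both dimensions, clamped at 1.
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

base = 1024 * 1024 * 4               # 4 MiB base level
full = mip_chain_memory(1024, 1024)  # ~5.33 MiB with the whole chain
```

The chain converges to 4/3 of the base size, so the aliasing reduction and bandwidth savings come at about 33% extra texture memory.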
Texture size, color space and compression
For textures that are not shared with other meshes, it is best to keep them as small as possible, or just big enough to achieve the desired quality. Having a large texture atlas containing multiple textures that is shared between many meshes is also favourable.
Not all textures need to be the same size.
- Reducing the size of certain textures (those with less detail) will help bandwidth.
- For example, the diffuse texture may be set to 1024x1024 and the roughness/metallic map to 512x512.
- This needs to be done selectively; observe whether there is a noticeable visual impact when applying it.
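The saving from this is easy to quantify: an uncompressed texture's footprint scales with width x height, so halving both dimensions cuts memory to a quarter. A quick sketch, assuming 4 bytes per texel (RGBA8):

```python
def texture_bytes(width, height, bytes_per_texel=4):
    """Uncompressed texture footprint in bytes, assuming RGBA8."""
    return width * height * bytes_per_texel

diffuse = texture_bytes(1024, 1024)   # 4 MiB
roughness = texture_bytes(512, 512)   # 1 MiB - a quarter of the diffuse map
```

Dropping the roughness/metallic map one resolution step saves 3 MiB per texture in this example, which is why less detailed maps are the first candidates for downsizing.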
Texture color space
Most texturing software (Adobe Photoshop, Substance Painter) works and exports using the sRGB color space.
- Diffuse textures should be in sRGB color space.
- Textures that are not processed as color should NOT be in sRGB color space (such as metallic, roughness, normal maps, etc).
- This is because these maps are used as data or units, not color.
- Using sRGB on these maps will result in a wrong look on the material.
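The reason this matters: when a texture is flagged as sRGB, the engine converts each sampled value to linear before shading. Applied to a data map such as roughness, that conversion shifts the stored values. A sketch of the standard sRGB-to-linear transfer function (IEC 61966-2-1):

```python
def srgb_to_linear(c):
    """Convert one sRGB channel value in [0, 1] to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# A mid-gray roughness value of 0.5 would be read back as roughly 0.21,
# noticeably changing the material's appearance.
shifted = srgb_to_linear(0.5)
```

A diffuse texture wants this conversion, because the artist authored its colors in sRGB; a data map does not, which is why it must be set to linear.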
Texture compression is image compression applied to reduce texture data size while keeping the impact on visual quality minimal. During development we export textures in common formats (such as TGA, PNG, etc.), as they are more convenient to work with and widely supported by various software. These formats are not used in final rendering, as they are slower to access/sample than specialized formats. For mobile (Android) there are several options to go with, such as ASTC, ETC1 and ETC2.
Texture compression best practices
Use ASTC compression.
- We may get better quality at the same memory size as ETC.
- Or the same quality at a smaller memory size than ETC.
- ASTC takes longer to encode than ETC and might make the game packaging process take longer. Due to this, it is better to use it for the final packaging of the game.
- ASTC allows more quality control by letting us set the block size. There is no single best block size, but 5x5 or 6x6 is generally a good default.
In some cases it might be faster to use ETC during development to quickly deploy on device. However, we can also use ASTC with fast compression settings, since the regular ASTC encoding time can otherwise be long. For the final build, ASTC is the better option in terms of balance between visual quality and file size.
Texture compression is handled by the game engine when we package the game, but we can choose which format to use.
- In Unity and Unreal, texture compression is handled entirely by the engine at packaging time, and we need to choose the format to go with, so this step is hard to skip.
- Unreal has a feature to package with all texture formats, and the engine will choose which format to use based on the device. This results in a larger packaged file but avoids compatibility issues.
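The block-size trade-off mentioned above is easy to quantify: ASTC stores every block in a fixed 128 bits regardless of block dimensions, so larger blocks mean fewer bits per texel and lower quality. A quick sketch:

```python
def astc_bits_per_texel(block_w, block_h):
    """ASTC uses a fixed 128 bits per block, so bpp depends only on block size."""
    return 128 / (block_w * block_h)

astc_bits_per_texel(4, 4)  # 8.0 bpp - highest quality, largest size
astc_bits_per_texel(6, 6)  # ~3.56 bpp - a common default
astc_bits_per_texel(8, 8)  # 2.0 bpp - smallest size, lowest quality
```

This is why 5x5 or 6x6 is a reasonable default: it sits between the quality of 4x4 and the small footprint of 8x8.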
ASTC and ETC image comparison
UV Unwrap, Visual and Channel Packing
It is widely considered best practice to keep the UV island as straight as possible.
Reasons for this are:
- It will make packing UV islands easier, and less space will be wasted.
- Straight UVs help reduce the staircase effect on textures.
- On mobile platforms texture space is precious, as texture sizes are usually smaller than on console or PC. Good UV packing ensures that we get the most resolution from our texture.
- It might be worth having slightly distorted UVs in order to keep the islands straight, for better texture quality.
Place UV seams in the right places to make them less visible. This is for visual quality purposes, as texture seams will look bad on a model.
Split UV islands where the edges are sharp, and give some space between the islands. This will help later when baking, to create a better normal map.
Create details that can be seen (phone screens are small, so there is little point in making an intricately detailed model that can't be seen in the final image). Take this into account when creating textures. For example, we don't need a 4K texture with lots of detail for a barely visible chair in the corner of the room.
In certain cases we will need to exaggerate edges (adding extra highlights) and shading to improve shape readability.
- Mobile platforms generally use smaller textures, so it might be hard to capture all of the detail within them.
Bake as much detail as possible
- Phone screens are small, and some details are better baked into the diffuse texture itself to make sure they are visible.
- Elements such as ambient occlusion and small highlights/specular can be baked and added to the diffuse.
- This also lets us rely less on shader and engine features to get specular and ambient occlusion.
When possible, use grayscale textures and apply color tinting in the shader. This saves texture memory at the cost of creating a custom shader that enables the tinting.
- This needs to be done selectively, as not all objects will look good using this method. It is easier to apply to an object that has a uniform/similar color.
- Another way to do this is to use an RGB mask and apply textures based on the color ranges of the mask.
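The tinting itself is just a per-pixel multiply. A minimal Python sketch of what such a custom shader would compute (the names are illustrative, not an engine API):

```python
def tint(gray, tint_rgb):
    """Multiply a grayscale texel value by a tint color, as the shader would per pixel."""
    return tuple(gray * c for c in tint_rgb)

# The same grayscale texture rendered with two different tints:
red_pixel = tint(0.5, (1.0, 0.2, 0.2))
blue_pixel = tint(0.5, (0.2, 0.2, 1.0))
```

One grayscale texture can then serve several differently colored variants of an object, which is where the memory saving comes from.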
Texture channel packing
Use texture channels to pack multiple textures into one.
- This helps save texture memory, as we can pack 3 textures into 1 using this technique, meaning far fewer texture samplers.
- This packing technique is commonly used to pack roughness/smoothness and metallic into 1 texture, but it can be applied to any texture mask.
Use the green channel to store the more important mask.
- The green channel usually has more bits, because human eyes are more sensitive to green and less sensitive to blue. Further reading on this can be found here. The roughness/smoothness map usually has more detail than metallic, and so should be placed in green.
Set these textures to linear/RGB instead of sRGB color space.
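As a sketch of what the packing amounts to, three grayscale maps become the R, G and B planes of a single texture. Here occlusion goes in R, roughness in G and metallic in B, matching the green-channel advice above; in practice this step is done in texturing software, not in code:

```python
def pack_channels(occlusion, roughness, metallic):
    """Pack three same-sized grayscale maps (lists of rows) into one RGB texture."""
    return [
        [(o, r, m) for o, r, m in zip(o_row, r_row, m_row)]
        for o_row, r_row, m_row in zip(occlusion, roughness, metallic)
    ]

# A 1x1 example: three separate masks collapse into one RGB texel.
packed = pack_channels([[0.9]], [[0.5]], [[0.0]])  # [[(0.9, 0.5, 0.0)]]
```

The shader then reads one sampler and splits the channels back out, instead of fetching three separate textures.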
Other Texture Best Practices
Using alpha channel
Adding an alpha channel to a texture should be done selectively and only when really needed. Alpha can make a texture larger (as it turns the texture into 32 bit) and this will impact overall bandwidth.
Another way to store an alpha mask is in a spare channel of the roughness/metallic texture, instead of adding an alpha channel to the diffuse texture.
- In Unreal/Unity we usually only use 2 of this texture's 3 channels, roughness (G) and metallic (B), leaving 1 channel free. An ambient occlusion map can usually be baked (subtly) into the diffuse map instead.
- By using the free channel to store the alpha mask, we can keep the diffuse texture at 16 bit, with a smaller file size.
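The saving can be sketched with a quick calculation, using the uncompressed formats the guide mentions (a 16-bit RGB diffuse without alpha versus a 32-bit RGBA one):

```python
def alpha_cost_bytes(width, height):
    """Extra bytes from moving a texture from 16-bit RGB (no alpha) to 32-bit RGBA."""
    rgb16 = width * height * 2    # 16 bits per texel, no alpha channel
    rgba32 = width * height * 4   # 32 bits per texel, with alpha
    return rgba32 - rgb16

extra = alpha_cost_bytes(1024, 1024)  # 2 MiB extra for one 1024x1024 texture
```

Stashing the mask in an already-paid-for spare channel avoids that doubling of the diffuse texture's footprint.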
A normal map is a good way to make a 3D object appear to have more detail. It is best used to add smaller details such as wrinkles, bolts and anything else that would need lots of triangles to model. Whether to use normal mapping may depend on the type and art direction of the game.
In most of our internal projects we've been using normal mapping, and there has been no noticeable performance impact from it. However, we target high-end devices for most of our demos, so your mileage may vary.
Using normal mapping does come with a cost, even if it might not be noticeable.
- A normal map is 1 extra texture, meaning more texture fetches, which results in more bandwidth being used.
- Use normal maps sparingly when targeting lower-end devices.
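For reference, a normal map stores a tangent-space direction remapped from the [-1, 1] range into the 0-255 range of each channel. A minimal sketch of the decode step a shader performs on each texel:

```python
def decode_normal(r, g, b):
    """Decode 8-bit normal-map texel values back to a unit tangent-space vector."""
    # Remap each channel from [0, 255] to [-1, 1], then renormalize.
    x, y, z = (c / 255.0 * 2.0 - 1.0 for c in (r, g, b))
    length = (x * x + y * y + z * z) ** 0.5
    return (x / length, y / length, z / length)

# The typical 'flat' normal-map blue (128, 128, 255) decodes to roughly (0, 0, 1):
flat = decode_normal(128, 128, 255)
```

This decode plus the extra texture fetch is the per-pixel cost the bullets above refer to, on top of the bandwidth for the map itself.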
Normal map baking best practices
Use a cage when baking.
- A cage is basically a larger (pushed-out) version of our low-poly model. It needs to encompass the high-poly model for the baking to work well.
- It is used to limit the raycast distance during normal map baking.
- A cage will also solve problems with split normal seams on the normal map.
Bake using match-by-mesh-name (if the baking software supports it).
- This mitigates wrong normal map projection. When objects are too close to each other, they may unexpectedly project their normal map onto the wrong face. This method ensures that baking is only done on the right surface, with a matching name.
- Further information on Substance Painter can be found here along with the Marmoset Toolbag tutorial
Explode the mesh (if we can't use match-by-mesh-name for baking).
- Exploding a mesh means moving its parts away from each other, so that normal maps don't project onto unwanted surfaces. This also helps with wrong normal map projection.
- We might need to do a separate bake for ambient occlusion with this solution.
Split UV on hard edges
- Continuous UV on hard edges will cause visible seams.
Set the smoothing groups on the mesh.
- One simple rule of thumb is that edges with angles of 90 degrees or more should be assigned different smoothing groups.
- Triangles on either side of a UV seam should be in different smoothing groups.
Have you read our guide on Geometry Best Practices?