Rendering Overview

The focus of the rendering work has been to make it all run well on older iOS devices (the minimum target is the PowerVR SGX543, an old OpenGL ES 2 chip). I originally meant to release this game about a year ago, so the target is already quite obsolete. If I were starting now, I would naturally do many things differently (and target higher-performance devices anyway), but such is life!
I'm not a rendering expert and this is not meant as any kind of ideal way to do things; this is just the way things ended up as I made it up as I went. This is also a high-level overview, so if something here is interesting and you want me to go into more detail, just let me know!
The .gifs are poorly compressed; the smearing and low color fidelity are not really there in the game...
Layers

Offscreen render targets

First I render all the offscreen render targets. All of these are top-down views of the level.
1. Level tilemap color

The multiple tile layers of the entire level from Tiled are cached into one offscreen target. This is only updated when needed, such as when the level first comes into view or when tiles are changed dynamically during gameplay (like adding some damaged terrain on top).
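The caching logic boils down to a dirty flag on the offscreen target. A minimal sketch of the idea (the names and structure are my own, not from the actual code):

```cpp
// Hypothetical cache wrapper: the composited Tiled layers live in an
// offscreen target that is only re-rendered when marked dirty.
struct TilemapCache {
    bool dirty = true;   // first use always renders
    int renders = 0;     // how many times we actually re-composited

    // Gameplay calls this when a tile changes (e.g. damaged terrain decal).
    void invalidate() { dirty = true; }

    // Called before the combine pass each frame; a cheap no-op most frames.
    void ensureUpToDate() {
        if (!dirty) return;
        // ... bind the offscreen target and draw all tile layers here ...
        ++renders;
        dirty = false;
    }
};
```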
2. Ambient shadows

A one-channel target (low resolution on some devices) where I render ambient shadow sprites for all entities that have one and are in view.
3. Point light shadow maps

All point lights that have shadows enabled render a 1D shadow map of their occluders into one horizontal line in the point light shadow map texture.
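A 1D shadow map like this maps each direction around the light to one texel holding the distance to the nearest occluder; shading then compares a pixel's distance against that texel. Here is a CPU sketch of the idea, assuming a simple angle-to-texel mapping and point occluders (the real version rasterizes occluder geometry with a shader, and all names here are my own):

```cpp
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

const float kPi = 3.14159265358979f;

// Map a direction from the light to a texel index in a row of `width` texels.
static int angleToTexel(float dx, float dy, int width) {
    float angle = std::atan2(dy, dx);                    // [-pi, pi]
    return (int)((angle + kPi) / (2.0f * kPi) * width) % width;
}

// Build one row of the shadow map texture: each texel holds the distance to
// the nearest occluder in that direction (maxDist = "no occluder").
std::vector<float> buildShadowRow(
        float lx, float ly,
        const std::vector<std::pair<float, float>>& occluders,
        int width, float maxDist) {
    std::vector<float> row(width, maxDist);
    for (const auto& p : occluders) {
        float dx = p.first - lx, dy = p.second - ly;
        float dist = std::sqrt(dx * dx + dy * dy);
        int i = angleToTexel(dx, dy, width);
        row[i] = std::min(row[i], dist);                 // keep the nearest
    }
    return row;
}

// A pixel is lit if it's closer to the light than the stored occluder
// distance in its direction (what the point light shader would check).
bool isLit(const std::vector<float>& row,
           float lx, float ly, float px, float py) {
    float dx = px - lx, dy = py - ly;
    float dist = std::sqrt(dx * dx + dy * dy);
    return dist <= row[angleToTexel(dx, dy, (int)row.size())];
}
```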
4. Combined level color and lighting

This is where I combine all the previous targets into the final level mesh texture. Only the parts in view (roughly culled) are rendered again.
First the tilemap color is rendered with the level mesh (combining some ambient/diffuse lighting into it). Then ambient shadows are rendered on top. Then point lights are added; these are additive sprites with a shader doing shadowing based on the shadow maps. Lastly, "ground splat sprites" go on top, used for some indicators like the player's front direction arrow or some impact warning effects.
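Per texel, the passes above amount to multiplicative ambient and shadow terms followed by additive light contributions. A sketch of the blend math for one texel; the exact factors and clamping are my guess, not the game's actual shader:

```cpp
#include <algorithm>

struct RGB { float r, g, b; };

// One texel of the combine pass, in the same order the passes run:
// tile color with ambient lighting, multiplied by the ambient shadow term
// (1 = unshadowed, 0 = fully shadowed), then additive point lights.
RGB combineTexel(RGB tile, float ambient, float ambientShadow, RGB lights) {
    RGB c = { tile.r * ambient * ambientShadow,
              tile.g * ambient * ambientShadow,
              tile.b * ambient * ambientShadow };
    c.r = std::min(c.r + lights.r, 1.0f);   // lights blend additively,
    c.g = std::min(c.g + lights.g, 1.0f);   // clamped here for the sketch
    c.b = std::min(c.b + lights.b, 1.0f);
    return c;
}
```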
5. Water depth map

A very low resolution target where I render a top-down view of the level depth (used for coarse depth checks in water rendering).
Background

Now we start to render to the actual screen. First I render the background "plasma" and stars. I don't fill the whole background; instead I generate a shape that fits the area the level doesn't overlap, with just enough extra so I get nice alpha blending on the borders. The shader is a bit heavy, so I cut features based on whether the mobile GPU is powerful enough. The .gif shows the individual triangles of the background mesh toggling on and off in the final frame.
Level to screen

Then it's time to render the level mesh to the screen. This is a runtime-generated mesh textured with the combined level texture render target we prepared before. The parts that will be fully below opaque water are not rendered.
Water

A generated mesh for the water with a specific water shader. The shader does the effect with a few scrolling noise textures, foam for the shores based on the water depth map, and coloring from some light parameters and the level bottom color.
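One plausible way to get shore foam from the coarse depth map is to fade a foam term in as the water gets shallow. The smoothstep-style falloff below is an assumption on my part, not the game's actual formula:

```cpp
// Foam amount for a water pixel given its sampled depth: full foam right at
// the shoreline (depth 0), fading out completely by `foamDepth`.
float foamFactor(float depth, float foamDepth) {
    float t = 1.0f - depth / foamDepth;
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return t * t * (3.0f - 2.0f * t);   // GLSL-style smoothstep shape
}
```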
Level borders

A separately generated mesh for the level "drop borders". It uses two textures, one for the basic ground color and another for a bit of grass on top, which again samples the level color map to match the lighting of the level.
Then there's also a separate mesh and shader for the parts where there's water on the level border.
Sprites

This is by far the largest task on the CPU side. All in-view particles, Spine meshes, sprites and grass sprites are gathered and sorted back to front. Then I generate one large mesh to render them all in one draw call. All of these use the same texture (a combined 4k sprite map).
There are a few bits of extra GPU processing on top of basic sprites to facilitate doing all of this in one call, so the vertex ends up a bit fat. The vertex has a 2D position (screen space), diffuse UV, color, ground UV, ground color multiplier and wind factor.
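The attribute list above might translate to a vertex layout and batching step along these lines; the field names and the exact sort key are my own assumptions:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// One vertex of the big sprite batch, matching the attributes listed above.
struct SpriteVertex {
    float px, py;       // 2D position, screen space
    float u, v;         // diffuse UV into the combined 4k sprite map
    uint32_t color;     // packed RGBA tint
    float gu, gv;       // ground UV into the combined level target
    float groundMul;    // how much ground color to pick up (grass: ~1)
    float windFactor;   // how much this vertex waves in the wind
};

// One gathered item to be sorted; sortY is whatever depth key the game
// uses (I assume something like the sprite's baseline on screen).
struct SpriteItem {
    float sortY;
    int id;
};

// Farther items first so nearer sprites draw over them; after sorting,
// everything is appended into one vertex buffer for a single draw call.
void sortBackToFront(std::vector<SpriteItem>& items) {
    std::stable_sort(items.begin(), items.end(),
        [](const SpriteItem& a, const SpriteItem& b) {
            return a.sortY < b.sortY;
        });
}
```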
The sprites/Spine meshes/particles themselves are generally lit on the CPU side by finding the closest point lights and calculating the color from those.
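A sketch of that CPU-side lighting, assuming a simple linear falloff per light (the actual attenuation curve isn't something I've described, so treat it as a placeholder):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct PointLight { float x, y, radius, intensity; };

// Sum the contributions of nearby point lights on top of an ambient term;
// the result would tint the sprite's vertex color.
float lightAt(float px, float py, float ambient,
              const std::vector<PointLight>& lights) {
    float total = ambient;
    for (const auto& l : lights) {
        float dx = px - l.x, dy = py - l.y;
        float d = std::sqrt(dx * dx + dy * dy);
        if (d < l.radius)
            total += l.intensity * (1.0f - d / l.radius);  // linear falloff
    }
    return std::min(total, 1.0f);
}
```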
The ground UV and ground color multiplier are used to color things based on the ground lighting. This is what gives the grass its color: the grass texture is white and I multiply it with whatever is sampled from the combined ground render target, so the color always matches nicely. Some particle effects also use this; for example, some near-ground dust uses a bit of it to get a color that blends better with the ground.
The wind factor tells how much to "wave" the vertex based on some ad hoc sin/cos "wind" values. For example, grass gets large values so it waves a lot in the wind, trees get a bit so they sway too, and buildings and units get 0. I also use the ground UV as a "seed" for the wind so it's positionally stable. The .gif shows me trying some exaggerated wind parameters to highlight the sprites that use it.
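The wind displacement could look something like this per vertex. The exact formula and constants are my guesses, but the key parts from the text are there: a global time-driven sin/cos wave, scaled by the per-vertex wind factor and phase-shifted by the ground UV so the motion is positionally stable:

```cpp
#include <cmath>

// Offset a vertex by the global "wind", scaled by its wind factor and
// phase-shifted by its ground UV. Constants are arbitrary, for illustration.
void applyWind(float& px, float& py, float windFactor,
               float groundU, float groundV, float time) {
    float phase = groundU * 12.9898f + groundV * 78.233f;  // positional seed
    px += windFactor * std::sin(time * 2.0f + phase);
    py += windFactor * 0.3f * std::cos(time * 1.7f + phase);
}
```

A wind factor of 0 (buildings, units) leaves the vertex untouched, so the same vertex shader path works for everything in the batch.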
Effects

Special effects are just rendered on top with no sorting: things like trails, lightning, flares, the swipe effects from the player's axe, etc. Some of these would benefit from sorting, but generally they are so large and complicated in 3D space that they would either still look wrong or be too complicated to sort for not much benefit. I also try to keep them fast, so there's no time to notice if something looks very "wrong".
Additive flare/bloom sprites for lights are rendered on top here too. I also had an occlusion test for flares at some point, but the effect was so unnoticeable that I removed all the code for it in the end (after discovering that it had been broken for a month without me noticing).
UI

Then we just render all the UI sprites on top. Fonts are basic bitmap fonts, generated with Glyph Designer. And then the frame is done.
I think that's it! I might have missed something, but nothing obvious. I'm always happy to answer questions, if there are any!