VFX of Sanctuary: Shattered Sun
Sanctuary VFX development has been an evolving process. It begins with brainstorming basic ideas and gathering references, continues through trial and error, and ends with refining the effects. While some effects remain at the testing stage, they open up new ways to achieve other effects.
A specific task has been to create distinct VFX that set each faction's style apart. Each faction has its own unique technology, from building its army to generating propulsion for its arsenal.

Factions
EDA, being the earth forces, have the standard technology everyone is familiar with, although for bigger units the tier of these weapons is cranked up a notch. EDA ground and air units rely on advanced tank and drone technology, so the references we explored come mostly from the modern military. From rocket launchers to railguns, EDA is an advanced earthly force built on conventional warfare methods. Most of its VFX comes from simulations of fire and smoke under a lot of pressure; EDA effects pack a punch of detailed explosions, sparks and unit debris flying everywhere.
On the other hand, the Chosen have mastered the liquid plasma technology that propels their most destructive forces. Burning liquid plasma can be delivered through rockets and artillery strikes, or shot by air units. In the same way, when their units die they crash and burn with persistent blue plasma boiling the environment. Their more organic designs are also translated into the VFX, curving and bending under various forces. Chosen VFX is centred around liquid simulations, as their physics carry more physical weight than classic explosions.
Meanwhile, the Guard have excelled at harnessing the forces of celestial objects such as the sun and its magnetic properties. Their technology can destroy enemy units as if unleashing the burning fuel of stars packed into bombs and rockets. They have also developed strong lasers that deal massive damage to enemy strongholds. Guard VFX ranges from fiery blasts to molten metals bending in magnetic fields; the magnetic, burning surface of the sun, incendiary modern ammunition and laser weapons have been the main inspirations.

Trial and Error for Various VFX
Certain VFX can be streamlined and applied to most effects, such as explosions, smoke and smaller particle effects like trails and sparks. To implement more creative ideas, however, an asset sometimes needs several complex features at once. While that is always fun and instructive, some of these experiments result in useful assets while others have to be discarded as ideas that partially work but are unsuitable for games. Even then, they give us insight into what works and what doesn't for future reference.
Cliffside meshes are a very specific type of skinned mesh asset that we use to decorate the environment. It is very common to hit the limitations of the conventional terrain system in game engines. In that case, a good approach is to leave as little detail in that part of the terrain as possible and kitbash more detailed assets on top of it to fill the gaps. Cliffside meshes are a capable solution: a high-poly mesh with animated materials that bring the terrain and game world to life can be fitted onto various uneven surfaces using skinned bones. They can also connect the terrain and ground units seamlessly, as if they were an extension of the terrain, without overloading the terrain system. After shaping the assets, we bake each skinned mesh in its final shape and position, eliminating the skinning so it lives on the map as a static mesh. That way the workload is split into separate assets, each of which can be removed or optimized based on its use case.
Terrain decals, on the other hand, can change the appearance of the terrain as well as units with minimal GPU workload. They are very useful for pushing past the limits of terrain layers, which assign a different material to each layer, and their material complexity restrictions. We also use these decals with additional shader tricks like parallax offset to fake extra depth without modifying any existing geometry. This lets artists play with details on the spot without having to rebake anything.
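The parallax offset trick can be sketched in a few lines. This is a minimal, hypothetical helper (not the project's actual shader code): the decal UV is shifted along the tangent-space view direction, proportional to a heightmap sample, which fakes depth without touching geometry.

```python
def parallax_offset_uv(uv, view_dir_ts, height, scale=0.05):
    """Shift a decal UV along the tangent-space view direction.

    uv:          (u, v) decal coordinates
    view_dir_ts: normalized (x, y, z) view direction in tangent space
    height:      heightmap sample in [0, 1] at uv
    scale:       artist-tuned depth strength (hypothetical default)
    """
    vx, vy, vz = view_dir_ts
    # Standard parallax mapping: the offset grows with the sampled
    # height and with how grazing the view angle is (divide by z).
    offset = height * scale
    return (uv[0] + vx / vz * offset, uv[1] + vy / vz * offset)
```

Viewed straight from above the offset vanishes, so top-down gameplay pays almost nothing for the effect; only angled views shift the texels.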
The strategic overlay map is also created using decals. Using the information the decal shader already provides eliminates the need to bake topographical maps from the terrain height, and the overlay adapts to any changes in terrain geometry. By calculating the world position and assigning technical colour-map shading to it, we can create simplified war-map visuals.
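The idea of deriving war-map shading from world position can be illustrated with a toy function. This is an assumption-laden sketch, not the game's shader: it quantizes the world-space height (which a decal shader can reconstruct per pixel) into contour bands and maps each band to a flat shade, giving the simplified topographic look described above.

```python
def overlay_colour(world_height, band_size=10.0):
    """Map a world-space height to a flat contour-band colour (RGB, 0-1).

    Heights are quantized into bands of `band_size` world units; each
    band gets a progressively lighter grey, like a technical war map.
    Band width and shading ramp are illustrative values.
    """
    band = int(world_height // band_size)
    shade = min(0.2 + 0.15 * band, 1.0)  # clamp so high peaks stay white
    return (shade, shade, shade)
```

Because the colour is computed from world position every frame, reshaping the terrain updates the overlay for free, with no rebaking step.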

Environmental VFX
Various environmental VFX decorate the background as players battle on the map, from huge broken pieces of gigantic rings to split-open landscapes revealing the underlying structure of the tech-driven planet. While some of these are still at the testing stage, they call for the general background-prop solution most games adopt: detailed enough, but not too heavy for the engine. So it's usually a good idea to bake most background assets to 2D cards, or to keep low-poly versions with plenty of packed and baked detail that never needs to be seen up close. This can be done by baking everything into the skybox, creating layered matte plates to give a sense of depth, or simply placing simplified meshes that fill the horizon.
In our case, there is no limit on how far the camera can move across the playable area. The game can be played top-down but also viewed from side angles that look out to the horizon. This creates another challenge: the background details have to be stitched in where they merge with the playable area of the map. There are various solutions, but they all require a degree of tweaking and baking to fit existing map assets, and they don't always work with the kitbashing method. That is why allowing time for trial and error pays off in finding the optimal solution.

The Specific Challenges of RTS VFX
It is not uncommon to run VFX without worrying too much about extremes, but when thousands of units are all shooting and exploding at the same time, on top of idle environmental VFX, it is very easy to choke graphics performance without specialized optimization.
A major optimization we have for the VFX is culling the number of particles visible on screen, so that no matter how dense the battle gets, the rendering of special effects never exceeds its quota. A script collects all the instanced particles on the entire map and renders only a capped number of them near the camera. That way, during development we have a good understanding of how the VFX will impact performance in its most intensive scenarios. It also makes it easy to scale the quality options by simply changing the particle cap.
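A budget-based cull like the one described can be sketched as follows. This is a simplified stand-in for the actual script (names and data layout are assumptions): emitters are sorted by distance to the camera and kept until the global particle quota is spent; anything that would overflow the quota is skipped for that frame.

```python
def cull_to_budget(emitters, camera_pos, budget):
    """Keep the nearest emitters until the particle budget is spent.

    emitters:   list of (position, particle_count) tuples
    camera_pos: (x, y, z) camera position
    budget:     max particles allowed on screen this frame
    Returns the list of emitters that are allowed to render.
    """
    def dist2(p):  # squared distance is enough for sorting
        return sum((a - b) ** 2 for a, b in zip(p, camera_pos))

    kept, used = [], 0
    for pos, count in sorted(emitters, key=lambda e: dist2(e[0])):
        if used + count > budget:
            continue  # this emitter doesn't fit; a smaller one still might
        kept.append((pos, count))
        used += count
    return kept
```

Scaling a quality preset then becomes a one-number change: the `budget` argument is the capped particle setting mentioned above.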
Another big issue is overdraw, where multiple transparent particles are rendered on top of each other, which graphics cards really don't enjoy. While mesh renderers have native culling solutions that skip occluded models, the transparent nature of particles makes this a harder problem to tackle. One advantage this and many other RTS games share is that smaller units are cheap to produce and usually create smaller particles, while bigger units are expensive and scarce; yet those big units can create large, long-lasting particles whose overdraw becomes unavoidable.
So a balance is needed: packing as much information as possible into a single quad while also using special meshes that can fake volumetric effects from specific angles. It's also important to understand which types of effects work and which don't. For example, additive-blended effects like sparks, plasma and fire adapt well to quads viewed from many angles thanks to their wispy structures, but a large plume of smoke looks odd from a diagonal angle right away. In that case, smoke always has to face the camera, while fire and sparks can use 3D meshes that don't need to orient towards the camera and can fake enough 3D detail for a more volumetric effect. Mixing these and testing various ideas is good practice for mitigating the limitations of conventional VFX.
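The camera-facing behaviour smoke needs boils down to building a quad orientation from the camera position. This is a generic billboard-basis sketch (the engine normally does this internally; the helper below is purely illustrative):

```python
import math

def billboard_basis(particle_pos, camera_pos, world_up=(0.0, 0.0, 1.0)):
    """Build the right/up/forward axes of a camera-facing quad.

    The quad's normal points at the camera, which is exactly the
    orientation a smoke billboard needs so it never shows its edge.
    """
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def norm(v):
        length = math.sqrt(sum(x * x for x in v))
        return tuple(x / length for x in v)

    forward = norm(sub(camera_pos, particle_pos))  # quad normal, towards camera
    right = norm(cross(world_up, forward))         # quad's local X axis
    up = cross(forward, right)                     # quad's local Y axis
    return right, up, forward
```

Fire and spark meshes skip this step entirely: they keep a fixed world orientation and rely on their wispy, additive look to hide the missing rotation.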

Dissecting the VFX to Get the Most Out of It
It is vital to pack as much information as possible into a single effect to fully exploit each rendered quad. Newer methods like six-way lighting can fake the lighting of a transparent object without any volumetric data. While this creates a much more convincing look for transparent particles, it doubles the number of textures each effect requires and increases shader complexity by sampling the texture several times. There are also more conventional solutions, such as baking the effects with generic lighting and using vertex-lit shaders to fake lighting. These produce far less convincing results but are much lighter on the graphics workload.
In our case, it's a mixed solution: direct lighting, ambient lighting, emission and alpha are all baked into a single texture, on the assumption that the lighting scenario will be similar in most cases. The light generally comes from above, with a slight deviation based on the map's skybox and sunlight direction, so the particle effects are baked with light coming from above in a generic sense. The advantage is that if a map doesn't have much sunlight, the material properties can be adjusted to remove the direct sunlight channel, leaving only ambient lighting on a VFX such as a smoke plume. Or, for a sunset scenario with strong orange sunlight and a blue hazy atmosphere, these can easily be set across all particles as a global setting, since they all pack their lighting information in the same format.
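The channel-packed compositing described above can be written out as a small reference function. This is a sketch under the stated assumptions (one texture channel per lighting term; parameter names are invented, not the project's material API): per-map sun and ambient colours are applied globally to every particle.

```python
def shade_particle(packed, sun_colour, ambient_colour, sun_strength=1.0):
    """Composite one channel-packed particle texel.

    packed:         (direct, ambient, emission, alpha), each 0-1,
                    one baked texture channel per term
    sun_colour:     per-map sunlight RGB (global setting)
    ambient_colour: per-map ambient RGB (global setting)
    sun_strength:   set to 0.0 to remove the direct sunlight channel,
                    e.g. on a map without much sunlight
    """
    direct, ambient, emission, alpha = packed
    rgb = tuple(
        direct * sun_strength * s + ambient * a + emission
        for s, a in zip(sun_colour, ambient_colour)
    )
    return rgb, alpha
```

A sunset map, for example, would push an orange `sun_colour` and a blue `ambient_colour` through this one path, retinting every particle at once.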
Other methods involve giving each particle plenty of variation when spawning a pack of them. An explosion VFX can spawn five explosion particles to create a more volumetric look as the camera rotates around these camera-facing quads, and because the lighting information is packed into different channels, they can all appear different enough to break the copy-pasted look. The same explosion can be brighter or darker, have more fire and fuel, or have lower opacity. Beyond lighting, varying rotation, lifetime and scale, as far as the engine allows, adds further uniqueness to each spawned effect. These variations are a simple yet very effective way of adding life to the VFX without any extra resources.
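Per-particle variation of this kind is typically just seeded randomness within artist-chosen ranges. A minimal sketch, with illustrative ranges (the actual tuning values are not from the source):

```python
import random

def spawn_variation(seed, base_lifetime=1.0, base_scale=1.0):
    """Derive per-particle variation from a spawn seed.

    Seeding makes the variation deterministic per particle, so a burst
    of five identical explosion quads still reads as five distinct
    explosions. All ranges below are illustrative.
    """
    rng = random.Random(seed)
    return {
        "brightness": rng.uniform(0.7, 1.3),    # sometimes brighter, sometimes darker
        "rotation_deg": rng.uniform(0.0, 360.0),
        "lifetime": base_lifetime * rng.uniform(0.8, 1.2),
        "scale": base_scale * rng.uniform(0.85, 1.15),
    }
```

Because everything derives from one seed, the same explosion replays identically, which also helps when debugging a specific effect.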

Using Simulations to Create and Bake VFX
Thanks to the wonders of modern rendering, we can create many fluid simulations, such as smoke, fire and liquid, as well as things like crumbling buildings with all the complex physics involved. The game development challenge is transferring these into a real-time game, where simulating anything at the same scale in real time is impossible. A specific example is rendering a single looping smoke particle. While it is now relatively easy to bake a smoke sim into a flipbook, there are important factors to consider. If the shape of the smoke is too distinctive, it will stick out as the same particle in every effect; if it's too spherical, it loses its identity as an organic smoke effect. It can take a while to find the right organic smoke render while battling the unpredictability of the simulated forces, but it is worth going through the phases until it finally feels right.
Another challenge comes from flipbooks. The texture flipbook is possibly the most widely used solution today in any modern video game that needs simulations baked onto a single texture. To prevent the animation from jumping from one simulated timestep to the next, solutions like frame blending and motion vectors warp each frame towards the next as much as possible, creating a smooth transition. But there is a trade-off between how much detail you can get per frame and how many simulation frames you can fit. So the common challenge is finding the sweet spot between how many frames an effect's simulation should have and how much detail each frame should carry. Sometimes you need to push further, sacrificing detail in favour of frame count. For this reason we also added a feature called the microdetailer, where a small additional vector map warps each individual frame to sharpen or morph it based on its use case. Particles like flame and plasma benefit heavily from this technique due to their spline-like nature, and even smoke can use microdetailing to appear puffier or gain a tendril look when required.
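The simplest form of the frame blending described here is a linear crossfade between the current and next flipbook frame. The sketch below uses scalar stand-ins for texture samples and omits the motion-vector warp (which would shift each sample before blending); it is illustrative, not the project's shader:

```python
def sample_flipbook(frames, t):
    """Blend between two flipbook frames for a looping animation.

    frames: list of frame values (stand-ins for texture samples)
    t:      normalized animation time in [0, 1)
    Returns the crossfade between the current frame and the next,
    wrapping at the end so the loop is seamless.
    """
    n = len(frames)
    f = t * n
    i = int(f) % n          # current frame index
    frac = f - int(f)       # how far we are towards the next frame
    a, b = frames[i], frames[(i + 1) % n]
    return a * (1.0 - frac) + b * frac
```

With few frames and heavy blending the motion stays smooth but ghosts; with many frames each one gets fewer texels. That is exactly the detail-versus-frame-count sweet spot the paragraph above describes.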
VFX work, like every other area of game development, can be a back-and-forth process. While certain streamlined tasks easily make their way into the final product, a less common challenge with several possible solutions takes extra time and effort to solve well. And while that is an enjoyable challenge for VFX artists, it also allows creative ideas to be tested and applied as working solutions. So it is important to welcome challenges and push yourself to explore new ideas as well as mixing them with conventional ones.
