Stencil Portal in Unity — VFX Breakdown
A short breakdown of several VFX tricks used in my latest scene: setting up a stencil portal, dealing with render order issues, mixing baked and procedural animation, stylized ghosts, volumetric fog with lighting, flames, and the texturing of the scene.
As usual, you can find the code on my GitHub, and the meshes used here on my Sketchfab:
Mesh, Cat+Animation: https://skfb.ly/oqywC
Mesh Altar and Surround: https://skfb.ly/oqywF
Modeling and Texturing
Since the post is not about modeling, texturing and animation, I will only gloss over these quickly. The textures are all hand painted into one big texture. This includes the lighting of the moon, the shadow of the cat, as well as the lighting of the candles. Nothing fancy here; the painting was done in Blender with the standard brush.
For the baked animation (the cat's body movement, tail movement and ear twitching, the bats flying, and the swinging of the spider web), I made very simple rigs in Blender. I am not using IKs; everything is keyframed by eyeballing the desired motion.
The modeling was based on a series of concepts I drew, which served as the starting point.
Stencil Portal
The portal was the main part of the VFX. There are several ways to achieve something like this. One way would be to render the scene using a different camera, match the transformation matrix of both cameras to overlap the worlds, and use the render texture of the camera as the texture of the portal mesh (which is in this case the middle section of the parchment). This technique is nicely covered by Sebastian Lague here.
While the above approach gives you a lot of freedom, it is more complicated to set up and has a heavier performance cost than the stencil buffer. Ronja covers the stencil setup quite well, so have a look at that post if you don’t know anything about the topic.
A very short summary: you can think of the process this way. In one pass, you render a mask on the screen using your portal mesh, dictating where on screen your portal world should be drawn. For example, where your portal mesh is will be white, everywhere else black. Then, when you are drawing your portal world, you only draw where this mask is white. And when you are drawing the world outside the portal, you don’t draw where the mask is white.
For this, you don’t even need the stencil buffer. You can draw this mask in a separate pass to a separate render target before you draw your scene, and bind it to your scene’s shaders during rendering. This way you can even do things like blurring the mask to get soft borders. So why stencil?
There are several issues with this. There is a cost associated with switching render targets during your render loop. Since you first have to render the mask to a different texture, you will be switching your render target one more time per frame than necessary. On top of that, you need to keep an extra texture the size of your screen in memory. There is also the overhead of invoking the fragment shader for pixels that are masked out: even if at the very beginning of your fragment shader you check whether the fragment should be drawn and discard it if not, the fragment shader still had to be executed.
Stencil buffers get around some of these issues. The stencil values are stored together with your depth buffer, so they are more memory efficient than a whole 32-bit screen-sized texture. On top of that, since the stencil buffer is integrated into your frame buffer, you don’t have to switch render targets mid-loop. The stencil check is, however, performed after the fragment shader is executed, so it doesn’t save you the bandwidth and computation spent on unneeded fragment shader invocations; with very heavy fragment shaders, it might even perform worse than the discard-plus-mask-texture combination. The main reason I decided to use stencil here, however, was its ease of setup.
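As a minimal sketch of the stencil states in ShaderLab (the reference value and pass layout here are illustrative, not my exact shaders):

```shaderlab
// Pass on the portal mesh: write 1 into the stencil buffer,
// draw no color and no depth.
ColorMask 0
ZWrite Off
Stencil
{
    Ref 1
    Comp Always
    Pass Replace
}

// On the portal world's materials: only draw where the stencil is 1.
Stencil
{
    Ref 1
    Comp Equal
}
```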
Stencil Render Order Issues
We simply divide our scene in two: everything inside the portal, and everything outside it. Then we render the inside where the stencil mask has a certain value, and the outside where that isn’t the case. That leaves us with an issue: any object which is part of the outside world and sits between the portal and the camera also gets masked. This is undesired behavior, since anything in front of the portal in the outside world should still be rendered.
There are different ways of solving this. Some involve a more complicated pre-pass where, as you populate the stencil buffer, you carve out the sections in front of the portal using a combination of stencil and depth operations. However, I went with a very simple solution.
My render loop looks like this:
- Render the portal mesh with Stencil value 1 (depth and fragment write off)
- Render the portal world’s opaque shaders where the stencil buffer equals 1
- Render the portal world’s transparent shaders where the stencil buffer equals 1
- Render the portal mesh again, but this time render depth (fragment still turned off)
- Now render everything else opaque, no need to do stencil checks, since the value of the depth buffer stops you from rendering anything behind the portal’s surface
- Render the transparent objects, also no stencil checks
The above setup makes it so that you can render the scene below correctly.
You don’t need to do anything on the C# side to get this render order. You can use the Queue tag in Unity within the shader to regulate it. Start with Geometry-5 for the first step, and for each subsequent step decrease the offset by one (Geometry-4, Geometry-3, and so on).
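Mapping the render loop steps above onto queue offsets, as a sketch:

```shaderlab
// Step 1, portal mesh stencil write:
Tags { "Queue" = "Geometry-5" }
// Step 2, portal world opaque:        "Queue" = "Geometry-4"
// Step 3, portal world transparent:   "Queue" = "Geometry-3"
// Step 4, portal mesh depth write:    "Queue" = "Geometry-2"
// Step 5, outside world opaque:       "Queue" = "Geometry-1"
// Step 6, outside world transparent:  "Queue" = "Geometry"
```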
For the ghost, I modeled a simple mesh in Blender. In Unity, it has an additive shader which blends it with the background. To make the eyes and mouth remain black, I marked those vertices with vertex color. You could also delete the faces, but I wanted the possibility of rendering them faintly if I needed to.
For their movement, I move them up in the vertex shader based on time. As they move up, they get deformed by a 3D simplex noise which is procedurally calculated from the vertex position (including the upward motion). This causes the deformation and the jellyfish motion you see.
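A rough sketch of the idea in the vertex shader. The property names are made up, and snoise stands in for whatever 3D simplex noise implementation you use:

```hlsl
float4 vert(float4 vertex : POSITION) : SV_POSITION
{
    float3 pos = vertex.xyz;
    pos.y += _Time.y * _RiseSpeed;          // upward drift over time
    // Sample the noise with the already-shifted position, so the
    // deformation pattern travels along with the upward motion.
    float n = snoise(pos * _NoiseScale);
    pos.xz += n * _NoiseAmount;             // jellyfish-like wobble
    return UnityObjectToClipPos(float4(pos, 1.0));
}
```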
A ghost’s lifetime has a certain period. At the end of the period I fade it out, then in C# reorient the object with a new local direction. This way I don’t have to create new ghosts; they keep appearing forever, each time in a new direction.
The candle flames are very similar. They also have an additive shader and are also deformed using simple sine waves over time and space. The deformation happens on a base mesh which already has the shape of a flame. All the candle flames are drawn in one draw call, so instead of object space (they share the pivot), I use the UV space of the candles to drive the noise and deformation. Using the UV space, I reduce the amount of distortion closer to the base of the flame. The texture of the candles is also hand painted.
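In UV space that looks roughly like this, again a sketch with made-up property names, assuming uv.y is 0 at the base of the flame and 1 at the tip:

```hlsl
// Sine waves over time and UV space, attenuated toward the base so
// the flame stays anchored to the wick.
float wave = sin(_Time.y * _WaveSpeed + v.uv.y * _WaveFrequency);
float3 pos = v.vertex.xyz;
pos.xz += wave * _WaveAmplitude * v.uv.y;
```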
Mixing Procedural and Baked Animation
You might notice that as you move around the scene, the cat follows you by rotating its head. This rotation is done in the vertex shader and is procedural. Meanwhile, the baked subtle animation of its body and ear movement is still happening.
To do this, I am again not using an IK rig. The rotation is done using a look-at transformation matrix, which I cover in depth in my Look At Transformation Matrix in Vertex Shader post. What I do want to cover here is how to mix the procedural and baked animations together.
The baked animation is applied through the bones and weights on the CPU side (or on the GPU, when the skinning is done in compute shaders). For each vertex, Unity blends between the matrices of the bones associated with that vertex, based on weights that have either been automatically calculated beforehand in a 3D software or hand painted. By the time we get to rendering the mesh, that deformation is already done. So we can simply plug in our look-at matrix after the local-to-world matrix of the object.
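Since the skinned vertex already arrives deformed, mixing in the procedural rotation is one extra matrix in the transformation chain. As a sketch (_LookAtMatrix is an assumed property, built per frame on the C# side and uploaded to the shader):

```hlsl
// Baked skinning has already happened by this point; the look-at
// matrix is applied on top, after the object's local-to-world matrix.
float4 worldPos = mul(_LookAtMatrix, mul(unity_ObjectToWorld, v.vertex));
o.vertex = mul(UNITY_MATRIX_VP, worldPos);
```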
The fog that comes out of the portal world is rendered using a stack of transparent layers. I talk about this in more depth in my Interactive Volumetric Fog With Fluid Dynamics and Arbitrary Boundaries post. I created the layers in Blender. The height of each layer is encoded in the vertex color, and the UVs are laid out so that we can use them to fade out the fog as it gets closer to the borders of the mesh.
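A sketch of how those attributes might be read in the fragment shader. The names are illustrative, and I assume here the UVs run from 0 to 1 across the layer and the red vertex color channel holds the layer height:

```hlsl
// Fade toward the mesh borders using the UV layout, and thin the fog
// out on the higher layers using the height stored in vertex color.
float edgeFade   = smoothstep(0.0, _EdgeFade, min(i.uv.x, 1.0 - i.uv.x))
                 * smoothstep(0.0, _EdgeFade, min(i.uv.y, 1.0 - i.uv.y));
float heightFade = 1.0 - i.color.r;
float alpha = fogNoise * edgeFade * heightFade;
```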
In order to render the fog correctly in the portal world, I split the mesh into two different draw calls: one for the fog in the inner part of the world, and one for the outside world. The synchronization between the two happens automatically, since the vertices on the edge share all vertex attributes.
The fog is calculated with the same noise the ghosts are deformed with. The noise is used to blend in the fog color and lighting. The scene on its own is unlit; however, as you might notice, the fog is affected by the lighting of the candles and the moon.
The lighting information is again painted by hand in advance in Blender. The lighting texture represents the ambient contribution for the diffuse/view-independent lighting (lighting from all directions).
This texture is spatially bound, meaning that as the noise pans through space, the lighting remains the same for any given area.
That was it for the breakdown. You can follow me on Twitter: IRCSS