Creating Quick Death Animations - Rapture Dev Log #1

Hello, my name is Eli Davis. I am the lead developer on Scry’s upcoming VR game Rapture. This will be the first of many development logs which aim to keep those interested up to date on the progress of the game. They will also provide brief tutorials and overviews of mechanics or features found in Rapture.

This past week, I updated the death animation for when a skull is sliced in half by the player. To make an enemy more satisfying to kill, it is essential to reward the player with flashy feedback upon successfully cutting the enemy. Visual feedback is important in video games for letting the player know whether or not what they are doing is working. Being creative with visual feedback becomes even more important as you transition to a VR platform, where traditional UI methods like Heads Up Displays (HUDs) become less common. With every object or detail brought into the game, the discussion around exactly how a player will interact with it evolves as other features are added. There have been three iterations of the death effect already, and once we get our project working with Unity 2018’s SRP, the death animation is sure to change again.

Death Animation Demo


It’s not uncommon for games to use lights on a character to indicate that it is currently alive. When a character dies, the light being emitted from it fades or goes away completely. One example of this is in Bioshock, where a Big Daddy’s helmet light turns off when it dies. Bioshock’s huge open world means that leaving a character’s corpse in one spot has a trivial effect on gameplay. However, this is not the situation in Rapture’s demo. The player is currently confined to the size of the play area set up in their home and is not allowed to teleport around the level’s map. That, coupled with the fact that a player could end up slicing ten to twenty skulls in a matter of seconds, requires that corpse cleanup be quick and not distract the player from other enemies in the scene.

We use coroutines, emission maps, and dissolve shaders both to indicate to the player that the enemy has died and to remove the enemy from the scene. Using coroutines to modify the property values of both the emission map and the dissolve shader over a one-second period creates a fluid animation for removing an enemy from the scene without cluttering up the MonoBehaviour Update loop. Using emission maps coupled with the bloom post-processing effect, we take what was once a brightly lit skull and fade it into a dull husk. Finally, we switch the shader being used to dissolve away the remaining pieces of the skull. Coroutines are pretty straightforward, and if you have never heard of them before, you can begin reading here. There are also plenty of tutorials for creating your own dissolve shader. The rest of this article will be dedicated to modifying shader properties.
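For readers who have never animated a material from a coroutine, here is a minimal sketch of the dissolve half of the effect. The SkullDeath class, the dissolveShader field, and the "_Cutoff" property name are assumptions for illustration only; substitute whatever your own dissolve shader exposes.

using System.Collections;
using UnityEngine;

// Minimal sketch: swap to a dissolve shader and animate its cutoff over one second.
public class SkullDeath : MonoBehaviour
{
    [SerializeField] private Shader dissolveShader; // assumed dissolve shader asset

    public void OnSliced()
    {
        StartCoroutine(DissolveAway(1f));
    }

    private IEnumerator DissolveAway(float duration)
    {
        Material mat = GetComponent<MeshRenderer>().material;
        mat.shader = dissolveShader; // switch to the dissolve shader

        float elapsed = 0f;
        while (elapsed < duration)
        {
            elapsed += Time.deltaTime;
            // Drive the cutoff from fully visible (0) to fully dissolved (1).
            mat.SetFloat("_Cutoff", Mathf.Clamp01(elapsed / duration));
            yield return null; // wait one frame between steps
        }

        Destroy(gameObject); // remove the corpse once it has fully dissolved
    }
}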

Emission Maps and High-Dynamic-Range (HDR)

material view


Unity’s standard shader has an emission map texture property for specifying which parts of your material will emit light (self-illumination), as well as a color property for indicating the color of the emitted light. Back to our Bioshock Big Daddy example: its emission map would contain circles matching the lights on its helmet. You may notice that when you aggro a Big Daddy, its helmet’s lights turn from yellow to red to indicate that it is in fight mode. This effect is easily achieved by simply changing the color property from yellow to red; there is no need to modify the UVs or the emission map itself.

The color values of the shader displayed in the editor, however, are not your normal RGB values, as they also include an input to specify the "intensity" of the color. This is an aspect of HDR, where a pixel that emits light can take on RGB values greater than the standard 0..1 range. A pixel that emits a lot of light might need that light to bleed over into neighboring pixels, and that effect is achieved through HDR and post-processing methods like bloom.
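As a rough illustration (the specific numbers here are arbitrary, not values from the project), an HDR color is simply a Color whose components are allowed to exceed 1:

// An HDR color: component values above 1 are allowed and feed the bloom pass.
Color hdrRed = new Color(4f, 0.2f, 0.2f);

// The same idea expressed as a base color scaled by an intensity factor.
Color scaledRed = Color.red * 4f;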

Setting Up Bloom

Unity has provided a Post Processing Stack that can be downloaded from the Asset Store. To start adding effects to your scene, right-click in your Project tab and go to Create > Post Processing Profile.

createpostprocfile

Once you select your profile, you should see a sizable number of effects at your disposal for changing the overall look of your level. Here we have enabled the bloom effect and left it at its default settings.

createpostprocfile 1

Once we add the Post Processing Behaviour script to the scene’s main camera and attach our newly created Post Processing Profile, the look of the skulls becomes drastically different.

Skulls using HDR without bloom


Skulls using HDR with bloom


You can see in the images above that, with a high intensity value (both of these images are using 2.6), pixels not associated with the object begin taking on the reddish hue defined in the material. The end result is an intense glowing effect.
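If you would rather wire this up from code instead of the Inspector, a rough sketch using the original Post Processing Stack’s API might look like the one below. The BloomSetup class and the serialized profile field are assumptions for illustration; treat the whole thing as a sketch of the setup described above rather than code from the project.

using UnityEngine;
using UnityEngine.PostProcessing; // Post Processing Stack (v1) from the Asset Store

public class BloomSetup : MonoBehaviour
{
    [SerializeField] private PostProcessingProfile profile; // the profile created above

    private void Start()
    {
        // Attach the behaviour to the main camera and point it at our profile,
        // the same as adding the component and assigning the field in the Inspector.
        var behaviour = Camera.main.gameObject.AddComponent<PostProcessingBehaviour>();
        behaviour.profile = profile;

        // Enable bloom on the profile (equivalent to ticking it in the editor).
        profile.bloom.enabled = true;
    }
}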

AN IMPORTANT SIDENOTE


You might notice that there is an Antialiasing setting in the post-processing stack:

createpostprocfile 2

If you are doing effects like bloom, you must go into Unity's Quality settings and disable anti-aliasing for all quality levels, because it no longer does anything. Anti-aliasing can happen at the hardware level on the graphics card, which is convenient until you are modifying pixel values after the image has left the card (which is how post-processing got its name). That means the hardware calculations are wasted, and it is why Unity has provided a software-based anti-aliasing solution in their stack. Unfortunately, because our target platform is virtual reality, we can't take advantage of the higher-quality "Temporal" techniques and must rely on simpler solutions. Temporal techniques use motion vectors, which take into account where the image is moving, to produce their high-quality final image. This concept breaks down in virtual reality, where the stack now has to deal with two cameras but was coded for the motion vectors of only one of them. The final output becomes a blurry mess as the blending is applied incorrectly to both the left and right eye.

Modifying Emission At Runtime

Programmatically modifying the shader values to fade out the glow might seem a little odd without explanation, so first, some code.

Material mat = gameObject.GetComponent<MeshRenderer>().material;
// x is the fade value between 0 and 1, y is an intensity scale; both are explained below.
mat.SetColor("_EmissionColor", Color.red * Mathf.LinearToGammaSpace(x) * y);

For those who have never modified material properties at runtime, it’s common notation for references to shader variables to be prefixed with an underscore, hence the "_EmissionColor". The part that might seem weird is the Linear To Gamma Space multiplication with the color red. This math takes into account that a monitor’s display of light values is actually non-linear. Nvidia documentation sheds light on this: "a pixel at 50 percent intensity emits less than a quarter of the light as a pixel at 100 percent intensity". Doing this calculation performs gamma correction, which is especially important for lighting.

The variable x is a value between 0 and 1 which indicates how much of our lighting has faded. However, when starting at a high color intensity, a sudden drop below 1 might not make sense, so the final value in this code snippet is scaled by the value y. For in-game purposes, my starting color intensity is 1.6 and my y value is 3.0, which achieves what I feel is an appropriate fade effect.
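Putting it together, a coroutine driving x over the one-second window could look something like the sketch below. The EmissionFade class and the way x is derived from elapsed time are my own framing of the snippet above rather than code from the project; here x runs from 1 down to 0 so the emission fades out, matching how the snippet uses it.

using System.Collections;
using UnityEngine;

// Sketch of driving the emission fade from a coroutine over one second.
public class EmissionFade : MonoBehaviour
{
    private const float Duration = 1f;       // the one-second fade described above
    private const float IntensityScale = 3f; // the "y" value from the article

    public IEnumerator FadeEmission()
    {
        Material mat = GetComponent<MeshRenderer>().material;

        float elapsed = 0f;
        while (elapsed < Duration)
        {
            elapsed += Time.deltaTime;

            // x runs from 1 (fully lit) down to 0 (fully faded) over the duration.
            float x = 1f - Mathf.Clamp01(elapsed / Duration);
            mat.SetColor("_EmissionColor",
                Color.red * Mathf.LinearToGammaSpace(x) * IntensityScale);

            yield return null; // wait a frame before the next step
        }
    }
}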

Conclusion

We have discussed techniques to procedurally animate changes to properties inside a character’s material to indicate they have died. The techniques applied to the color of the emission map can be applied to a dissolve shader’s alpha cutoff to have a fade-out effect. If you have any questions or would like me to go into more detail on any of the subjects, don’t be shy. Leave a comment or hit me up in Rapture’s Discord!

As a final note, please check out and follow Scry on Twitter or like us on Facebook. Every little bit of support helps.
