Adagio VR – Part 4: Animation & Sequencing

Welcome back! This is the final post in a four-part series about building a short, cinematic VR experience in Unreal Engine. Across the series, we've gone over best practices for creating scenes in real-time engines and shared some fun tips, tricks and timesavers that you can reuse down the road in your own projects. Let's jump back in one last time!

Creating a Level Sequence

The switch from pre-rendering cutscenes in games to rendering them in real time inside the engine was a monumental one. What took hours to render before is now a real-time given, and entire industries were shaken by that change. It also led to the inception of multiple sequencing tools inside game engines to ease the creation of in-engine cutscenes. It's a staple for any self-respecting game engine to have one now, from Unity's Timeline to Unreal's Sequencer.

Sequencing an Unreal scene starts with creating a Level Sequence asset from the Cinematics dropdown menu. This asset essentially acts as the container and player for your scene's cinematics: once placed in the level, it plays back whatever tracks are fed into it.
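For the curious, a Level Sequence can also be loaded and played from code rather than placed by hand; here's a minimal sketch, assuming you already hold a reference to the sequence asset:

```cpp
#include "LevelSequence.h"
#include "LevelSequenceActor.h"
#include "LevelSequencePlayer.h"

// Spawns a Level Sequence Actor in the world and starts playback immediately.
void PlayAdagioSequence(UWorld* World, ULevelSequence* Sequence)
{
    ALevelSequenceActor* SequenceActor = nullptr;
    ULevelSequencePlayer* Player = ULevelSequencePlayer::CreateLevelSequencePlayer(
        World, Sequence, FMovieSceneSequencePlaybackSettings(), SequenceActor);
    if (Player)
    {
        Player->Play();
    }
}
```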



Again, for anyone who has worked with video editing software before, things will be pretty familiar. Your sequence can have tracks containing various asset types, namely audio clips, animated characters and blueprints you'd like to edit at runtime.

In the above picture, you’ll notice that we dragged in our Sky Sphere object, and we can now edit and animate the sun height variable that we exposed earlier. This process is as easy as keying the variable across the sequence.
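As an aside, if you're building actors in C++ instead, the equivalent of exposing a Blueprint variable to cinematics is the Interp property specifier; a quick sketch (the variable mirrors our Blueprint's sun height):

```cpp
// Interp lets Sequencer create a track for this property and key it over time,
// just like the sun height variable exposed on our Sky Sphere Blueprint.
UPROPERTY(EditAnywhere, BlueprintReadWrite, Interp, Category = "Sky")
float SunHeight = 0.f;
```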

To help set the mood a little, we can also drag our music and ambient clips into the sequence. Similarly to how we animated the sun height variable, we can fade out the music track by keying its volume. It's good to know that while Unity takes in various audio file formats, Unreal only accepts WAV files.

This process is repeated once again for our other scene components, like our Skylight and our Exponential Height Fog. Again, we merely have to stick the asset in the sequence, add which parameters we'd like to animate and start keying.

With this basic keying process complete, we are now able to VR preview our sequence and witness the foggy dawn breaking over our rolling hills. We have three things left on our checklist:

  • Animating the ghost and her grave

  • Guiding the player through the scene

  • Enabling the viewer to control scene playback

Animating the Ghost and Her Grave

Animating our ghost is done in two parts, but isn't really more complicated than what we've already been doing. We simply have to place an instance of our Ghost Skeletal Mesh in the scene, adjust her over the grave mesh, then drag her into the sequencer. There, we can simply add her animation track and tweak her positioning.

If you recall the last post, where we added Material Parameter Collection assets to our ghost and grave materials, now's the time to use them. Material Parameter Collections essentially serve as parameter repositories that affect every material in a project that uses them; this is perfect for controlling the ethereal glow that emanates from both the grave and the ghost. Inside the Sequencer window, it's possible to create a Material Parameter Collection track and key the various parameters inside it, just like we've been doing.
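Those same collection parameters can also be driven from code at runtime. A minimal sketch using UKismetMaterialLibrary, where the collection reference and the "GlowIntensity" parameter name are hypothetical:

```cpp
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

// Updates a scalar in the collection; every material using it picks up the change.
void SetGhostGlow(UObject* WorldContext, UMaterialParameterCollection* GlowCollection,
                  float Intensity)
{
    UKismetMaterialLibrary::SetScalarParameterValue(
        WorldContext, GlowCollection, TEXT("GlowIntensity"), Intensity);
}
```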

Guiding the Player Through the Scene

While all the assets are now complete and animated inside the scene, the player still isn't able to get around properly. There are a few methods to implement movement in our scene, but we have to remain mindful of a few basic rules to ward off VR motion sickness:

  • We can’t move too fast

  • We can’t make extremely sharp turns

  • We can’t make unannounced changes to the camera position

With these constraints in mind, we've chosen to slowly dolly the viewer along a pre-set track throughout the scene. This is probably the best method for our experience, as the player's placement in the scene is a crucial element of our narrative. To create our path, we'll be using Unreal's Spline Component, similar to more traditional spline tools like those found in Maya.


Now that we've locked down our movement scheme, it's time to create our Player Blueprint. After right-clicking in your project and choosing to add a new Blueprint class, you can choose which class your new blueprint inherits from.

This sounds complicated, but it's essentially just a way to choose what "type" of blueprint you want. Blueprints (and Prefabs in Unity) are probably the most alien element of real-time engines for people coming from traditional DCCs, but they essentially function as collections of components held together by code. For our purposes, the Blueprint "Pawn" class is a nice baseline for player characters in Unreal applications and will thus serve as the player object for our experience.

Inside the Blueprint editor, we can add a myriad of components to shape how our final asset will function. We won't need much for our experience; for now, we'll start with a camera. After clicking Compile & Save, we can assign our new Player Blueprint as the player object for this level: inside the World Settings panel, simply pick BP_Player as the "Default Pawn Class". If you launch the experience and can no longer float around with the WASD keys, the switch was done correctly.
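For reference, a minimal C++ equivalent of that Pawn would look something like the sketch below (the class name is hypothetical): a Pawn whose only component is a camera.

```cpp
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "AdagioPlayerPawn.generated.h"

UCLASS()
class AAdagioPlayerPawn : public APawn
{
    GENERATED_BODY()

public:
    AAdagioPlayerPawn()
    {
        // The HMD drives head rotation, so all the pawn needs is a camera at its root.
        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        RootComponent = Camera;
    }

private:
    UPROPERTY(VisibleAnywhere)
    UCameraComponent* Camera;
};
```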

While you will surely be better served by Unreal's documentation if you're looking for in-depth help with Blueprints, we can still do as we did earlier and share our process and implementation of the Player Blueprint. As we wrote earlier, all we need is to be able to move the camera along a spline component and control this movement via the sequencer. For this to work, we will need to create a separate Blueprint that will serve as our tracks for the experience.

After creating our aptly named BP_Rails, we can simply add the spline component to it, Compile & Save, place an instance in the scene and begin laying out our tracks in the viewport. For anyone who's ever placed curves in 2D or 3D software, the process will feel familiar. Keeping the VR motion sickness rules in mind, we have to watch out for sudden turns or ramps in our tracks.

Our spline-work done, we jump into the Blueprint Event Graph and start creating the mechanics that will control our player's location. It's quite simple: the Blueprint finds the player when the scene launches, then keeps it locked to the rails according to the locationAlongSpline variable we animate in the sequencer. As with the rest, we can simply drag the actor into the sequencer to add it to our sequence and key locationAlongSpline to get the results we want; a sketch of the same logic follows below.
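Here's what that rails logic might look like in C++ (the AGhostRails class name is hypothetical; the real project implements this as Blueprint nodes):

```cpp
#include "GameFramework/Actor.h"
#include "Components/SplineComponent.h"
#include "Kismet/GameplayStatics.h"
#include "GhostRails.generated.h"

UCLASS()
class AGhostRails : public AActor
{
    GENERATED_BODY()

public:
    AGhostRails()
    {
        PrimaryActorTick.bCanEverTick = true;
        Spline = CreateDefaultSubobject<USplineComponent>(TEXT("Spline"));
        RootComponent = Spline;
    }

    // Keyed from the sequencer: 0 is the start of the track, 1 is the end.
    UPROPERTY(EditAnywhere, Interp, Category = "Rails")
    float LocationAlongSpline = 0.f;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Grab the pawn once at startup, as the Blueprint does on Event BeginPlay.
        PlayerPawn = UGameplayStatics::GetPlayerPawn(this, 0);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (PlayerPawn)
        {
            // Keep the player locked to the rails at the keyed position.
            const float Distance = LocationAlongSpline * Spline->GetSplineLength();
            PlayerPawn->SetActorLocation(Spline->GetLocationAtDistanceAlongSpline(
                Distance, ESplineCoordinateSpace::World));
        }
    }

private:
    UPROPERTY()
    APawn* PlayerPawn = nullptr;

    UPROPERTY(VisibleAnywhere)
    USplineComponent* Spline;
};
```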

Everything set up properly, we can press play and glide across the hills as dawn breaks and our sequence unfolds. Our experience is almost complete!

Enabling the Viewer to Control Scene Playback

Even in experiences like these, it's always good to give viewers at least a bare minimum of agency: the ability to start, pause and resume the experience as they wish. Considering that our entire experience unfolds at the whim of one level sequence, this sort of functionality becomes almost trivial to implement.

This event chain (available here) is split into two parts: if no key has been pressed beforehand, we start the sequence; if a key is then pressed again, the sequence is paused or resumed. We make use of DoOnce and FlipFlop, two quite convenient flow control Blueprint nodes that allow us to pack a lot of functionality into one event chain without having to break it up into many smaller parts.
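In C++ terms, the same chain boils down to a branch like the sketch below, assuming a controller class (the name is hypothetical) holding a ULevelSequencePlayer reference and a bStarted flag:

```cpp
#include "LevelSequencePlayer.h"

// Bound to an "any key" input event: the first press starts playback (DoOnce),
// and every press after that toggles between pause and resume (FlipFlop).
void AAdagioPlaybackController::OnAnyKeyPressed()
{
    if (!bStarted)
    {
        bStarted = true;
        SequencePlayer->Play();
    }
    else if (SequencePlayer->IsPlaying())
    {
        SequencePlayer->Pause();
    }
    else
    {
        SequencePlayer->Play();
    }
}
```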

It's then only a question of articulating these controls in the game view. We start by creating a small additive material that allows us to fade a texture in and out. We're using our usual suspects here, Texture Parameter and Scalar Parameter nodes, as this material will act as a master shader which we can tweak in-game.

With this material complete, we can quickly jump into creating our final blueprint for this project: the "screen" on which our "Press Any Key" prompt and our title will be projected inside the game. This small event graph uses the Timeline node to great extent, saving the time that would have been spent keying fade values in another DCC. With the event graph complete, we can fade multiple UI textures in and out on a plane inside the scene.
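A C++ sketch of that fade, driving the master material through a dynamic instance with an FTimeline (class, curve and parameter names are all hypothetical; the material is assumed to expose a scalar called "Opacity"):

```cpp
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Components/TimelineComponent.h"
#include "Curves/CurveFloat.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "PromptScreen.generated.h"

UCLASS()
class APromptScreen : public AActor
{
    GENERATED_BODY()

public:
    APromptScreen()
    {
        PrimaryActorTick.bCanEverTick = true;
        ScreenMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("ScreenMesh"));
        RootComponent = ScreenMesh;
    }

    // Kick off the fade, e.g. when the viewer presses a key.
    void FadeIn() { FadeTimeline.PlayFromStart(); }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // A dynamic instance lets us drive the master material's parameters at runtime.
        ScreenMaterial = ScreenMesh->CreateDynamicMaterialInstance(0);

        // Route the curve's value into our handler, as the Timeline node does.
        if (FadeCurve)
        {
            FOnTimelineFloat Progress;
            Progress.BindUFunction(this, FName("HandleFadeProgress"));
            FadeTimeline.AddInterpFloat(FadeCurve, Progress);
        }
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        FadeTimeline.TickTimeline(DeltaSeconds); // FTimeline needs manual ticking
    }

    UFUNCTION()
    void HandleFadeProgress(float Value)
    {
        ScreenMaterial->SetScalarParameterValue(TEXT("Opacity"), Value);
    }

private:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* ScreenMesh;

    UPROPERTY(EditAnywhere)
    UCurveFloat* FadeCurve = nullptr; // 0-to-1 fade curve asset

    UPROPERTY()
    UMaterialInstanceDynamic* ScreenMaterial = nullptr;

    FTimeline FadeTimeline;
};
```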

Back inside our BP_Rails, we can add our new blueprint as a variable so that all the moving parts are controlled inside one blueprint.
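In code, that variable is nothing more than a typed reference on the rails actor, assigned in the Details panel (names hypothetical, matching the sketches above):

```cpp
// A reference to the prompt screen so one actor drives both movement and UI fades.
UPROPERTY(EditAnywhere, Category = "Rails")
APromptScreen* PromptScreen = nullptr;
```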

This process complete, we can now simply place our new blueprint in the scene at our starting location and adjust its size as desired. Quite anticlimactically, we're done!

In the grand scheme of things, this is a pretty small scene; there isn't much going on aside from our ghost and the player movement. Still, it's a good example of how scenes are broken down in real-time engines. See you soon in another tutorial!
