Labs project - NOAH - a first person puzzle adventure game (Part 2 of 2)

Part 2 of our Noah blog focuses on the programming challenges we encountered during development. We’ve grouped the challenges into sections, each focusing on a particular theme.

From the beginning, we knew that we wanted to create a world without any loading screen interruptions between areas. We also knew that we had to split the world into multiple scenes, both to make working concurrently easier and to keep performance manageable. So, using Unity’s additive scene loading capabilities, we began investigating ways to achieve this.
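For anyone unfamiliar with the feature, the basic pattern looks something like this (a minimal sketch, not our actual code; the class and scene names are placeholders):

    using UnityEngine;
    using UnityEngine.SceneManagement;

    public class AreaLoader : MonoBehaviour
    {
        // Streams a puzzle area in the background. LoadSceneMode.Additive keeps
        // the scenes that are already loaded, so the hub stays active while the
        // new area loads in.
        public AsyncOperation LoadArea(string sceneName)
        {
            return SceneManager.LoadSceneAsync(sceneName, LoadSceneMode.Additive);
        }

        // Unloads an area the player has left, to keep memory usage down.
        public AsyncOperation UnloadArea(string sceneName)
        {
            return SceneManager.UnloadSceneAsync(sceneName);
        }
    }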

As the general layout of the world started to take shape, we realised we had a small issue. Rotating each section to fit behind the doors of the vault made further modifications to them difficult. To get around this, we set out to design a system that would let us connect each section to the vault without having to change its rotation.

Our first idea for tackling this problem was portals. Portals come in pairs, and looking through one would show you what was in view of the other. Once you walked into a portal you would be transported to the location displayed in it, allowing you to move between areas that weren’t physically connected.

After some testing, we concluded that this method had a few issues that we wouldn’t be able to solve in the timeframe, so we had to come up with something different. Luckily, the logic we used for the transportation part of the portal system could be reused, with minor modifications, in our second and final solution. We decided to call this the 'airlock' system.

Behind each door in the vault is a corridor, and at the start of each puzzle section is an identical corridor. As soon as the player enters one of these corridors, their position and orientation relative to the corridor are tracked and stored, and the area they are attempting to reach starts loading. Once the loading is complete, we use the stored local data to calculate the player’s world position in the newly loaded area and move them there, creating a seamless transition between the two, even if the areas don’t line up.
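The relocation step boils down to a change of coordinate frames. A rough sketch, assuming each corridor has a Transform marking its origin (the names here are illustrative, not from our codebase):

    using UnityEngine;

    public class AirlockTeleporter : MonoBehaviour
    {
        public Transform sourceCorridor;      // corridor the player walked into
        public Transform destinationCorridor; // identical corridor in the new area

        public void Relocate(Transform player)
        {
            // Store the player's position and facing relative to the source corridor...
            Vector3 localPosition = sourceCorridor.InverseTransformPoint(player.position);
            Quaternion localRotation = Quaternion.Inverse(sourceCorridor.rotation) * player.rotation;

            // ...then rebuild them in the destination corridor's frame. Because
            // the two corridors are identical, the player can't tell they moved.
            player.position = destinationCorridor.TransformPoint(localPosition);
            player.rotation = destinationCorridor.rotation * localRotation;
        }
    }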

Player Controller Customisations

We knew that we wouldn’t have time to write a completely custom first person player controller, so we decided to use the one found in Unity’s standard assets as a base. After stripping out the parts we didn’t need, like the jumping functions and references to other parts of the standard assets package, we began looking into the new features the game would require.

Firstly, we wanted moving platforms. These platforms would need to be able to move in almost any direction without limiting player movement. By default, the first person controller does not take the movement of the surface it is currently standing on into account. This means platforms would slide out from underneath the player when moving horizontally, or cause the player’s view to jitter noticeably when moving vertically.

To solve this, we needed to calculate the velocity of the platform and add it to the position of the player. Making sure this happens at the correct time during every frame is important: calculating the velocity at the wrong moment could give us an incorrect result, causing issues similar to those we had before the system was in place. After a few iterations, we found a timing that allowed for smooth movement in all the cases we tested.
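One workable timing is to apply the platform’s per-frame displacement after the platform itself has moved, for example in LateUpdate. The sketch below illustrates the idea rather than our exact solution; SetPlatform is a hypothetical hook called by whatever ground detection is in use:

    using UnityEngine;

    [RequireComponent(typeof(CharacterController))]
    public class MovingPlatformRider : MonoBehaviour
    {
        CharacterController controller;
        Transform currentPlatform;
        Vector3 lastPlatformPosition;

        void Awake()
        {
            controller = GetComponent<CharacterController>();
        }

        void LateUpdate()
        {
            if (currentPlatform == null)
                return;

            // The platform's displacement since last frame. Sampling it after
            // the platform has moved this frame is what avoids the sliding and
            // jittering described above.
            Vector3 delta = currentPlatform.position - lastPlatformPosition;
            controller.Move(delta);
            lastPlatformPosition = currentPlatform.position;
        }

        // Called by ground detection when the player steps on or off a platform.
        public void SetPlatform(Transform platform)
        {
            currentPlatform = platform;
            if (platform != null)
                lastPlatformPosition = platform.position;
        }
    }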

Once we had working platforms, we realised that the kinds of platforms we wanted would make it easy for the player to fall off mid-ride, which would stop them from progressing. After weighing up our options, we decided to create a system that would stop the player from falling off any ledge we didn’t intend them to, without manually modifying the environment’s collision data.

We did this by calculating the new position of the player every frame and checking whether this new position is within a certain range of the ground. If it is, the move is valid and the player is moved to that position. If not, we estimate the positions the player will pass through if they continue travelling at their current speed. If any of those positions are within range of the ground, we move to the first calculated position; otherwise we discard all the positions and the player does not move. These extra checks allow players to cross small gaps in the environment without issue.
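The check itself can be sketched in a few lines (a simplified illustration; the distances and step counts are made-up tuning values, not the ones we shipped):

    using UnityEngine;

    public class LedgeGuard : MonoBehaviour
    {
        public float groundCheckDistance = 1.5f; // how far below still counts as "in range of the ground"
        public int lookaheadSteps = 3;           // extra future positions to test
        public LayerMask groundLayers = ~0;

        // Returns true if moving by 'step' keeps the player over walkable ground,
        // either immediately or within a few steps (so small gaps can be crossed).
        public bool IsMoveValid(Vector3 position, Vector3 step)
        {
            for (int i = 1; i <= lookaheadSteps; i++)
            {
                Vector3 predicted = position + step * i;
                if (Physics.Raycast(predicted, Vector3.down, groundCheckDistance, groundLayers))
                    return true; // ground found: apply the first step as normal
            }
            return false; // no ground anywhere ahead: discard the move
        }
    }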

Editor Tools

With every Unity project we do, we create tools or code snippets that aid our workflow or help us get around issues we come up against. This project was no exception.

One of the first we made was an autosave system for the Unity Editor. By default, backups of all currently loaded scenes are stored whenever the play button is pressed. These backups aren’t well named (they are just given numbers), only one is stored per scene, they can be infrequent, they get deleted when a project is loaded and, if the editor crashes shortly after hitting play, they can be empty. That isn’t much use.

We decided to create a system, similar to the way Unreal’s editor works, which saves a backup of all open scenes at intervals set by the user. It also stores multiple backups per scene, the number of which is also set by the user, with the most recently created backup replacing the oldest once the user-set limit is reached. These backup files are stored outside of Unity’s asset folder, which let us replace scene files easily, even when Unity crashed (which it did at one point while we were away from our desks).
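The skeleton of such a tool is fairly small. A stripped-down sketch follows; the interval is hard-coded here where the real tool exposed it as a setting, backup pruning is omitted for brevity, and this version copies scene files as last saved on disk (capturing unsaved edits takes a little more work):

    using System.IO;
    using UnityEditor;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    [InitializeOnLoad]
    public static class SceneAutosave
    {
        const double IntervalSeconds = 300; // back up every five minutes
        static double nextSaveTime;

        static SceneAutosave()
        {
            nextSaveTime = EditorApplication.timeSinceStartup + IntervalSeconds;
            EditorApplication.update += OnEditorUpdate;
        }

        static void OnEditorUpdate()
        {
            if (EditorApplication.isPlaying || EditorApplication.timeSinceStartup < nextSaveTime)
                return;
            nextSaveTime = EditorApplication.timeSinceStartup + IntervalSeconds;

            // Store backups outside the Assets folder, timestamped so several
            // backups of the same scene can coexist.
            string backupDir = Path.Combine(Directory.GetParent(Application.dataPath).FullName, "SceneBackups");
            Directory.CreateDirectory(backupDir);

            for (int i = 0; i < SceneManager.sceneCount; i++)
            {
                var scene = SceneManager.GetSceneAt(i);
                if (string.IsNullOrEmpty(scene.path))
                    continue; // scene has never been saved
                string target = Path.Combine(backupDir, scene.name + "_" + System.DateTime.Now.ToString("yyyyMMdd_HHmmss") + ".unity");
                File.Copy(scene.path, target, true);
            }
        }
    }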

Lightmap baking is the process of calculating all the lighting information within a scene. It affects how every part of the environment reacts to the lighting present. Performing these calculations can be time consuming, taking anything from a few minutes to a few hours for a high quality result in a complex scene. The PC running these calculations is almost unusable during the process, using almost all the CPU resources available to it, which obviously slowed our workflow considerably when done during studio hours.

The best idea was to run this process at night, but Unity does not have a built-in way to perform the operation on multiple scenes sequentially, so we wrote one ourselves. The tool is a simple window with a button for each scene that will be present in the final game. Select the scenes you want, press a button to start the process, and leave it to do its work. We used this mostly for the final, high quality bakes, as they took the most time; during the day we used far lower settings so we could get a general idea of the look of an area while working on it.
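The chaining can be built on Unity’s bake-completed callback: open a scene, start an asynchronous bake, and when it finishes move on to the next. A bare-bones sketch of that loop, with the window and per-scene buttons omitted (recent Unity versions expose the callback as Lightmapping.bakeCompleted; older versions had a similar completed hook):

    using System.Collections.Generic;
    using UnityEditor;
    using UnityEditor.SceneManagement;

    public static class SequentialBaker
    {
        static Queue<string> scenesToBake;

        public static void BakeSequentially(IEnumerable<string> scenePaths)
        {
            scenesToBake = new Queue<string>(scenePaths);
            Lightmapping.bakeCompleted += BakeNext; // fires when each bake finishes
            BakeNext();
        }

        static void BakeNext()
        {
            if (scenesToBake.Count == 0)
            {
                Lightmapping.bakeCompleted -= BakeNext;
                return;
            }
            // Open the next scene, then kick off its bake without blocking the editor.
            EditorSceneManager.OpenScene(scenesToBake.Dequeue(), OpenSceneMode.Single);
            Lightmapping.BakeAsync();
        }
    }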

Graphics Programming

Shaders

Our aim for the demo was to push the visual quality as far as we could within the time frame we had. We knew from the outset that we wanted to play with light and shadow, and achieving this needed some advanced shader work and visual effects. We wanted a quick way to visualise the construction of each shader, so we needed a tool to assist us. After evaluating various visual shader editors on the Asset Store, we settled on Amplify, and we share some of our experiences with it below.

The pulse effect was a hugely important ability within the game, as it is the main form of interaction. We needed to illustrate to the player that the pulse is a form of energy that can be transferred to certain materials and surfaces. The pulse is also a radial effect, emitting from the body of the player; this was another important characteristic that we needed to capture in the visual effect design.

With the properties and behaviour of the pulse ability in mind, we set about creating a shader to illustrate this to the player. Given that the pulse is energy, light emission plays an important role, so we purposely kept the environment dark to accentuate the emission from the pulse effect. The final shader needed to display on all surfaces, so we ended up creating our own Standard Shader that could be used everywhere. This shader included everything our artist needed to achieve the surface look they wanted, whilst enabling the pulse effect to correctly emit across that surface. Our Standard Shader also took advantage of physically based shading (PBS).
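The shader graph itself is hard to reproduce in print, but the gameplay side of the effect can be sketched: a small script feeds the pulse origin and an expanding radius into global shader properties that every surface shader reads. The property names and speed value below are hypothetical:

    using UnityEngine;

    public class PulseDriver : MonoBehaviour
    {
        public float pulseSpeed = 10f; // world units per second (illustrative value)
        Vector3 origin;
        float radius = -1f;

        // Called when the player uses the pulse ability.
        public void EmitPulse()
        {
            origin = transform.position; // the pulse radiates from the player's body
            radius = 0f;
        }

        void Update()
        {
            if (radius < 0f)
                return;
            radius += pulseSpeed * Time.deltaTime;

            // Every surface shader samples these globals and emits light where
            // the expanding sphere intersects its geometry.
            Shader.SetGlobalVector("_PulseOrigin", origin);
            Shader.SetGlobalFloat("_PulseRadius", radius);
        }
    }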

Standard Shader for all surfaces

We also needed the pulse effect to emit onto any object and surface without our artist having to UV unwrap each of those surfaces; we just didn’t have the time for such a task. To achieve this we implemented triplanar shading, allowing the pulse effect to display on a surface no matter what its UVs were like or where it sat in world space. This mostly gave us the desired result without sacrificing too much time; however, with a little more time we could certainly improve the blending between the normals of the triplanar shading to better align the pulse pattern across each surface.
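For the curious, the heart of triplanar shading is a weighted blend of three planar projections, with the weights derived from the surface normal instead of UVs. Shown here on the CPU purely for clarity (in the game this maths lives in the shader):

    using UnityEngine;

    public static class TriplanarDemo
    {
        // How strongly the textures projected along X, Y and Z should each
        // contribute at a point with the given surface normal. A higher
        // sharpness narrows the blend region between projections.
        public static Vector3 BlendWeights(Vector3 normal, float sharpness = 4f)
        {
            Vector3 w = new Vector3(
                Mathf.Pow(Mathf.Abs(normal.x), sharpness),
                Mathf.Pow(Mathf.Abs(normal.y), sharpness),
                Mathf.Pow(Mathf.Abs(normal.z), sharpness));

            // Normalise so the three projections always sum to full strength.
            return w / (w.x + w.y + w.z);
        }
    }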

Game team hard at work
Preview of the pulse shader on the floor and surrounding surfaces

Authors: Caius Eugene & Liam Duncan