In addition to captivating gameplay, Housemarque is known for extreme particle effects. Titles such as Resogun, Alienation, Matterfall and Nex Machina all used proprietary VFX technology to bring colourful explosions to the screen in order to reward players for destroying enemies or completing levels. In Returnal, Housemarque switched from a top-down to a third-person camera, but also to a more grounded and darker art style than before. In this article, Risto Jankkila, Lead VFX Artist, and Sharman Jagadeesan, Senior Graphics Programmer, give us a closer look at how they used their VFX tech to make the alien planet of Atropos and its inhabitants come to life.
Below you will find the full breakdown video covering some of the showcase VFX features of Returnal. In addition to that, we'll go into a bit more detail on some of these features in this article.
The history of our VFX tech
We have been working on our proprietary VFX tech since Resogun (2013 PS4 launch title), where the first prototype of our current particle system was used on some of the showcase effects. After Resogun, the particle system got a graphical user interface and we started referring to it as the Next Gen Particle System (NGP). In 2014 we made the decision to produce all the particle effects for Alienation with NGP. After shipping Alienation, the system was used for Nex Machina and ported to Unreal Engine for Matterfall.
NGP is designed to be a GPU-only VFX authoring system with minimal CPU overhead. The focus is on good performance and flexibility. Particle authoring is done by VFX artists who write compute shader snippets that define particle behaviour and data. NGP takes care of memory allocation and most of the boilerplate code, while artists can focus on behaviour and visuals.
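NGP's real authoring snippets are compute shader code running on the GPU, but the division of labour can be sketched in a rough Python analogue (all names here are hypothetical, not NGP's actual API): the system owns allocation and the integration boilerplate, while the artist supplies only a behaviour function.

```python
import numpy as np

# Hypothetical analogue of the NGP split: the system handles storage and
# the common update loop, the artist authors only the behaviour snippet.

def update_particles(pos, vel, dt, behaviour):
    """System-side boilerplate: run the artist's behaviour, then integrate."""
    behaviour(pos, vel, dt)      # artist-authored update (a shader snippet in NGP)
    pos += vel * dt              # common integration step owned by the system

# Artist-authored "snippet": simple gravity plus mild air drag.
def gravity_with_drag(pos, vel, dt):
    vel[:, 1] -= 9.81 * dt       # pull particles down
    vel *= 0.99                  # damp velocity slightly

pos = np.zeros((4, 3))
vel = np.ones((4, 3))
update_particles(pos, vel, 0.1, gravity_with_drag)
```

The point of the split is that an artist never touches buffer allocation or dispatch: swapping `gravity_with_drag` for another behaviour is the whole authoring workflow.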
Nowadays NGP is not limited to particle effects. It can also be used for controlling voxel behaviour in volumes or for generating dynamic procedural geometry. We also have several modules that generate data to be used as an input for effects.
For example, we have our own fluid simulation module that can feed its simulation data to NGP. Another example is a module called the voxeliser, which can convert an animated mesh to voxels. That data can then be used for volumetric character effects. Other sources like textures, bone matrices and vertex buffers can also be used as inputs for particle effects.
The VFX magic behind enemy tentacles and bullet trails: node particles
From early on in Returnal’s development, it was clear that we wanted to do something special with the enemy creatures on Atropos. Game director Harry Krueger wanted them to resemble deep-sea creatures, with properties like bioluminescence and tentacles.
Our enemy team animators briefly experimented with building tentacles using traditional rigid body physics to simulate chains of bones attached to enemy skeletons. This approach seemed a little too limited, as the performance cost of several very long chains was too high, but also because we lacked means of expressing enemy states through physics simulation alone. VFX was then assigned the task of creating dynamic tentacles that could be attached to enemy meshes and skeletons.
Luckily, we already had a solution in mind. The team had been experimenting with particle vegetation for earlier projects, and a special particle type had been developed for branching vegetation such as trees. We named this particle type the “Node Particle” to reflect its properties and behaviour.
This particle type allows us to create one-directional, parent-child connections. A particle can be a parent to multiple children, but a particle can have only one parent. When reading the parent particle, the parent data is one frame old, i.e. not the data that is being written in the current frame. This makes it impossible to “follow” the parent strictly, and results in a side effect which makes the motion of the particles appear “organic”. This side effect is used extensively in particle effects in Returnal, and it was especially useful for things like tentacles.
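The one-frame lag can be illustrated with a minimal sketch (the chain layout and stiffness value are hypothetical): each child reads its parent's position from the previous frame's buffer, so a sudden move of the root propagates down the chain one frame per link, which is exactly what makes the motion look whip-like rather than rigid.

```python
import numpy as np

# Sketch of the double-buffered parent read: children only ever see the
# parent's position from LAST frame, never the one being written now.

def step(prev_pos, stiffness=0.5):
    new_pos = prev_pos.copy()
    for i in range(1, len(prev_pos)):
        # read the parent from the previous frame's buffer
        new_pos[i] += stiffness * (prev_pos[i - 1] - prev_pos[i])
    return new_pos

# Snap the root to x = 1 and hold it there: the lag makes each link
# catch up gradually instead of the whole chain jumping at once.
chain = np.zeros(5)
chain[0] = 1.0
for _ in range(3):
    chain = step(chain)
    chain[0] = 1.0               # root stays pinned
```

After three frames the chain reads roughly `[1.0, 0.875, 0.5, 0.125, 0.0]`: the disturbance is still travelling down the chain, which is the "organic" lag described above.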
But before we started working on the tentacle behaviour, we needed to decide how we would render them. First, we experimented with rendering flat strips of polygons. The quality was close to acceptable, but lacking in some areas like shadows. After a while, we settled on rendering the tentacles as cylindrical meshes built by NGP at runtime.
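Building such a tube at runtime can be sketched as follows (a simplified CPU-side version; NGP does this on the GPU, and a real implementation would orient each ring along the chain's tangent rather than a fixed plane): place a ring of vertices around each particle in the chain, then stitch adjacent rings into triangles.

```python
import math

# Hypothetical sketch: turn a particle chain into a cylindrical tube mesh.

def tube_vertices(chain, radius=0.1, sides=8):
    """One ring of `sides` vertices around each chain particle."""
    verts = []
    for (x, y, z) in chain:
        for s in range(sides):
            a = 2 * math.pi * s / sides
            # rings in the XY plane for simplicity; a real version would
            # align each ring with the local chain direction
            verts.append((x + radius * math.cos(a),
                          y + radius * math.sin(a),
                          z))
    return verts

def tube_indices(num_rings, sides=8):
    """Stitch consecutive rings together, two triangles per quad."""
    tris = []
    for r in range(num_rings - 1):
        for s in range(sides):
            a, b = r * sides + s, r * sides + (s + 1) % sides
            c, d = a + sides, b + sides
            tris += [(a, b, c), (b, d, c)]
    return tris

chain = [(0, 0, 0), (0, 0, 1), (0, 0, 2)]
verts = tube_vertices(chain)
tris = tube_indices(len(chain))
```

Because the tube is a closed mesh rather than a flat strip, it casts and receives shadows like any other geometry, which is what the flat polygon strips were lacking.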
After deciding on tube rendering, we could start focusing on the behaviour of the tentacles. Being able to control particle behaviour, we were no longer constrained to physics simulation, but could conveniently change the tentacle movement based on the state of the enemy. This made it easy for us to experiment with things like forcing the tentacles to move in a certain way when the enemy is preparing an attack. We iterated on the timings with the enemy team and designers to make sure that the tentacle behaviour would help telegraph enemy states together with animation and other VFX.
Node particles also came in handy for the numerous ribbons and trails we have in game. We wanted some of the homing bullets to have a long trail that would linger on screen for a while behind the bullet. Enemy melee attack trails also used node particles. Below you can see a video of node particles following their parent, creating a ribbon trail, followed by a homing bullet attack by Phrike.
One of our key principles at Housemarque regarding visual effects is to simulate as much as possible at runtime, using as little pre-baked data as possible. As we had used fluid simulations in previous titles like Alienation and Matterfall, it was clear to us from the beginning that we shouldn’t settle for pre-baked velocity fields in Returnal.
Instead, we use a real-time fluid simulation around the player to simulate air flow that affects the movement of particles, vegetation and other VFX elements. In addition to that simulation (which we refer to as the Global Fluid Simulation), we can have additional simulations attached to different actors in the game.
Any gameplay event can be scripted to add forces to the fluid simulation, causing nearby VFX elements to react. For example, these forces can be incorporated into enemy animations, so that when an enemy lands a jump attack, we add a radial impulse to the fluid simulation at that moment and at that location. This causes nearby particles like leaves or sparks to fly away from the impact point. In the video below you can see fluid impulses, triggered by enemy animations and player actions, affecting particle vegetation.
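A radial impulse of this kind can be sketched on a 2D velocity grid (the grid layout and falloff are hypothetical simplifications of the game's 3D fluid solver): every cell near the impact point gets a velocity kick pointing away from it, fading with distance.

```python
import numpy as np

# Sketch: add a radial impulse to a fluid velocity grid at an impact point.

def add_radial_impulse(vel, centre, strength, radius):
    """Push grid velocities away from `centre`, linear falloff to `radius`."""
    h, w = vel.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    dx, dy = xs - centre[0], ys - centre[1]
    dist = np.sqrt(dx * dx + dy * dy)
    falloff = np.clip(1.0 - dist / radius, 0.0, None)
    with np.errstate(invalid="ignore", divide="ignore"):
        nx = np.where(dist > 0, dx / dist, 0.0)   # outward direction
        ny = np.where(dist > 0, dy / dist, 0.0)
    vel[..., 0] += strength * falloff * nx
    vel[..., 1] += strength * falloff * ny

vel = np.zeros((16, 16, 2))          # (y, x, velocity-xy) grid, initially still
add_radial_impulse(vel, centre=(8, 8), strength=2.0, radius=6.0)
```

Particles and vegetation then simply sample this velocity field at their own position, which is why a single scripted impulse moves leaves, sparks and fog alike.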
While fluid velocities alone were enough for things like vegetation, in cases where one can see discrete point particles, the global fluid simulation’s velocity field felt lacking in detail. To get more detail, we chose to implement optional vorticity calculations for the fluid simulation, and in the particle update we added a noise field to particle velocity, proportional to the magnitude of the fluid velocity at the particle’s location. In-game, this approach was used for the Xeno-archive holograms and the player teleporting effect, which you can see below.
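The detail pass can be sketched like this (the noise source here is plain random numbers for brevity; the game samples a structured noise field alongside the vorticity computation): the added noise is scaled by the local fluid speed, so still air adds no jitter and fast-moving air adds a lot.

```python
import numpy as np

# Sketch: add speed-proportional noise to particle velocities.

rng = np.random.default_rng(0)

def add_velocity_noise(particle_vel, fluid_vel, amount=0.2):
    """Noise scaled by local fluid speed: no detail where the air is still."""
    speed = np.linalg.norm(fluid_vel, axis=1, keepdims=True)
    noise = rng.uniform(-1.0, 1.0, particle_vel.shape)
    return particle_vel + amount * speed * noise

still = add_velocity_noise(np.zeros((8, 3)), np.zeros((8, 3)))
moving = add_velocity_noise(np.zeros((8, 3)), np.full((8, 3), 2.0))
```

Scaling by fluid speed is what keeps the effect grounded: the extra detail only appears where the flow is already energetic, so it reads as turbulence rather than random jitter.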
Voxeliser and volumetric effects
One of the environment elements we wanted in the opening biome of Returnal (Overgrown Ruins) was thick, volumetric, graveyard-like fog. Due to the height variations in our levels, procedural placement of the fog turned out to be problematic. Instead, we decided to place the fog volumes manually. With a high number of volumes to be placed by the environment team, we wanted to make the process as effortless as possible.
The flexibility of our particle system allowed us to construct these volumes in NGP. Since particle data and behaviour can be fully customised, we can store a three-dimensional index for a set of particles and have them represent a volume. Volume bounds can be passed as constant data from the CPU to NGP. Alongside the three-dimensional index, we can store any other data per voxel as well. This gives us the possibility to store different states for each voxel inside a volume. In addition to storing voxel states, we can also change their update logic based on their position in the game world or within the volume.
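The indexing scheme can be sketched as follows (grid size and bounds are hypothetical stand-ins for the CPU-provided constants): each particle carries a flat index, derives its (i, j, k) voxel coordinate from it, and maps that coordinate into world space using the volume bounds, which is what lets its update logic depend on where the voxel sits.

```python
# Sketch: particles as voxels of a volume, addressed by a stored 3D index.

GRID = (4, 4, 4)                  # voxels per axis (hypothetical)
BOUNDS_MIN = (0.0, 0.0, 0.0)      # volume bounds, passed as CPU constants
BOUNDS_MAX = (8.0, 8.0, 8.0)

def voxel_coord(flat_index):
    """Unpack a flat particle index into an (i, j, k) voxel coordinate."""
    nx, ny, nz = GRID
    i = flat_index % nx
    j = (flat_index // nx) % ny
    k = flat_index // (nx * ny)
    return i, j, k

def voxel_centre(flat_index):
    """World-space centre of the voxel, derived from the volume bounds."""
    sizes = [(hi - lo) / n for lo, hi, n in zip(BOUNDS_MIN, BOUNDS_MAX, GRID)]
    return tuple(lo + (c + 0.5) * s
                 for c, lo, s in zip(voxel_coord(flat_index), BOUNDS_MIN, sizes))
```

With the world position in hand, a voxel's update snippet can check its distance to floors and walls, or its distance to the volume edge, and adjust its stored density accordingly.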
With voxels that are aware of their state and position, we could have them automatically emit additional density near surfaces like floors and walls, but also fade out smoothly near the edges of the volumes. This made the process of placing fog volumes a lot faster, as the fog adapted automatically to its surroundings. We could also sample the global fluid simulation at the voxel position, and have the fog moved around by things like wind, bullets and player actions. In the video below, you can see one of these NGP fog volumes placed into a level. The fog density is adaptively created only near surfaces, and advected by the fluid simulation in game.
Putting it all together to create that Phrike boss battle
For the Phrike boss encounter, we wanted to be able to emit volumetric fog from Phrike’s skeletal mesh. This posed a problem, since volumetric fog and skeletal meshes are built from different kinds of elements. A skeletal mesh is a set of vertices, animated points in 3D space that are ordered to form triangles that can be rendered. Vertices can be positioned arbitrarily to form all kinds of different shapes, from trees to humanoids. Volumetric fog, on the other hand, uses boxes called volumes. These volumes consist of smaller elements called voxels. In contrast to skeletal meshes, fog volumes in Unreal Engine are always shaped like boxes, and so are the voxels they are built from.
If we wanted to use Phrike’s vertices to emit volumetric fog, we would have to find out which voxel of the fog volume each vertex of the skeletal mesh falls in. This was trivial to solve, but the bigger problem was the fact that we could only handle one vertex per voxel. Having two vertices occupying the same voxel would lead to undefined behaviour and possibly crash the game. To make matters worse, it was far more likely to have multiple vertices inside a single voxel than just one.
The solution to this was a real-time voxeliser. The voxeliser takes a skeletal mesh as input and outputs a volume where each voxel that is inside the skeletal mesh is marked as occupied. This made the process of emitting fog simple, as we only had to check the voxeliser output to see whether a given voxel is marked as occupied or not. In the video below, you can see the output of the voxeliser using Phrike’s mesh as input.
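The reason occupancy sidesteps the one-vertex-per-voxel problem can be shown with a small sketch (this version only marks voxels containing a vertex; the real voxeliser also fills voxels inside the mesh surface): any number of vertices may land in the same voxel, but the output is just a boolean flag per voxel, so duplicates collapse harmlessly instead of colliding.

```python
import numpy as np

# Sketch: mark voxels touched by mesh vertices as occupied.

def voxelise_vertices(vertices, bounds_min, bounds_max, grid):
    occupied = np.zeros(grid, dtype=bool)
    size = (np.asarray(bounds_max) - np.asarray(bounds_min)) / np.asarray(grid)
    for v in vertices:
        idx = np.floor((np.asarray(v) - np.asarray(bounds_min)) / size).astype(int)
        idx = np.clip(idx, 0, np.asarray(grid) - 1)
        occupied[tuple(idx)] = True   # writing True twice is harmless
    return occupied

# Three vertices, two of which share a voxel:
verts = [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (3.9, 3.9, 3.9)]
occ = voxelise_vertices(verts, (0.0, 0.0, 0.0), (4.0, 4.0, 4.0), (4, 4, 4))
```

The fog emitter then only needs to read `occ` at its own voxel coordinate: occupied means emit density, empty means do nothing.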
With the ability to emit fog from Phrike’s mesh, we were able to blend the character better with the environment when she was moving. It also made Phrike’s special moves, like teleporting and spawning, easier to execute, since we could hide these transitions with the fog. Below you will find a comparison video of Phrike’s spawn sequence with volumetric fog and other effects, and without them.
This concludes our deep dive into the visual effects of Returnal. We hope that you enjoyed reading this, and we would love to share more of our tricks and techniques in the future.