
Making Dreamland.IRL

When we were asked, "Can we make the Dreamland artwork run in real time across a 15,000 pixel wide canvas?", the answer wasn't immediately yes. Whilst we have a great deal of experience in building Unreal environments and controlling them from external sources, this would be our first time creating an Unreal environment to run in real time for a live performance, and the first time we had pushed Unreal to render a canvas 15,000 pixels wide.


We have previously used Unreal in two ways: firstly to make rendered content, replacing traditional workflows such as Redshift; and secondly in virtual production, be that sports studios or more make-believe environments. Whilst the workflow is similar for a live music show, there is an added pressure in knowing the environment will be on screen continuously for over four hours with 6,000 people staring at it. With such a presence in the space, there was nowhere to hide if it suddenly dropped its frame rate or lagged.


Initially we set up a very simple draft scene with basic lighting to benchmark performance at the correct resolution. This was to find the right balance between high-quality global illumination settings and frame rate.
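For context, Unreal exposes global illumination quality through a scalability console variable, which is the kind of hook this sort of benchmarking leans on. The sketch below is illustrative rather than our actual test setup; it assumes UE5's sg.GlobalIlluminationQuality group and the on-screen stat unit timings, and the function name is ours.

```cpp
// Illustrative sketch: apply one GI scalability level (0 = low ... 3 = epic)
// and show per-frame timings, then run the benchmark camera path and note
// where frame time holds at the target output resolution.
#include "Kismet/KismetSystemLibrary.h"

void ApplyGIBenchmarkLevel(UObject* WorldContext, int32 QualityLevel)
{
    // On-screen frame/game/draw/GPU timings while testing.
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("stat unit"));

    // Step the global illumination scalability group to the level under test.
    UKismetSystemLibrary::ExecuteConsoleCommand(
        WorldContext,
        FString::Printf(TEXT("sg.GlobalIlluminationQuality %d"), QualityLevel));
}
```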



The first step in preparing the final environment was to prepare all of the 3D models to be used in real time. For rendered usage, more polygons are often better, giving super smooth results.

But in this instance we were looking for optimisation at every step, to allow us the maximum amount of processor headroom to spend on lighting and global illumination. Initially the 3D scan of Dave's head was made up of 282,000 polygons; whilst this would run in UE5, an unreasonable proportion of system resources would have been spent delivering this single object.

That cost would have been multiplied across every object in the scene. Working to reduce the triangle count of each model, we brought Dave's head scan down to 28,000 polygons. This involves finding the fine balance between the mesh looking smooth and being efficient in real time.
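As one illustration of that step, Unreal's editor can perform this kind of decimation itself through the reduction settings on a static mesh's LOD0. The editor-only sketch below is hedged: the function name is ours, the 10% figure simply mirrors the 282,000-to-28,000 reduction, and in practice the decimation can equally happen in a DCC tool.

```cpp
// Editor-only sketch: rebuild a static mesh at roughly 10% of its source
// triangles (282,000 -> ~28,000) using UE's built-in reduction settings.
#include "Engine/StaticMesh.h"

void ReduceMeshForRealtime(UStaticMesh* StaticMesh)
{
    if (!StaticMesh) return;

#if WITH_EDITOR
    // LOD0's source model carries the reduction settings applied at build time.
    FStaticMeshSourceModel& SourceModel = StaticMesh->GetSourceModel(0);
    SourceModel.ReductionSettings.PercentTriangles = 0.1f; // keep ~10%

    StaticMesh->PostEditChange();   // triggers a rebuild with the new settings
    StaticMesh->MarkPackageDirty(); // flag the asset for saving
#endif
}
```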



Beyond establishing the general layout in Cinema4D, the real work began in Unreal. The advancements in the new illumination system, Lumen, are what made the project work. Previously, to achieve global illumination, the lighting would need to be calculated and baked, meaning that it could not be changed once set.


Dynamic lighting within a scene could be created by combining static and dynamic lights, but all of the indirect bounce lighting remained unchanged. There were ways of cheating this, such as manually placing bounce lights at a lower intensity to give the effect of indirect bounce, but the results never felt realistic.


With Lumen, all lights can be 100% dynamic, with global illumination calculated in real time. This meant that every aspect of the virtual lights could be controlled from the lighting desk, and the neon signs featured in the scene dynamically contributed to the illumination of the world they were placed in. Light sources could strobe, change colour and move on cue, and the bounce from those sources would instantly be recalculated to fill in anything not directly lit.
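To give a rough feel for what that looks like in code, here is a minimal, illustrative sketch of such a light. The class name is ours, and in the real show the cues arrived from the lighting desk rather than direct function calls.

```cpp
// Illustrative sketch: a fully movable light whose intensity and colour are
// set on cue. With Lumen, the indirect bounce from every change is
// recalculated in real time; nothing here is baked.
#include "GameFramework/Actor.h"
#include "Components/PointLightComponent.h"
#include "CueLight.generated.h"

UCLASS()
class ACueLight : public AActor
{
    GENERATED_BODY()

public:
    ACueLight()
    {
        Light = CreateDefaultSubobject<UPointLightComponent>(TEXT("Light"));
        RootComponent = Light;
        Light->SetMobility(EComponentMobility::Movable); // fully dynamic
    }

    // Called on cue, e.g. when the desk fires a strobe or colour change.
    UFUNCTION(BlueprintCallable)
    void SetCue(float Intensity, FLinearColor Colour)
    {
        Light->SetIntensity(Intensity); // Lumen re-resolves the bounce
        Light->SetLightColor(Colour);   // indirect colour follows instantly
    }

private:
    UPROPERTY(VisibleAnywhere)
    UPointLightComponent* Light = nullptr;
};
```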

Utilising Lumen meant that we could recreate the same techniques used to create the original rendered artwork for the Dreamland album, without compromising on the flexibility and overall effect of the scene.

Not being restricted to baked lighting completely transformed how we approached creating this scene.


Comparison images: with Lumen dynamic lighting, and with standard dynamic lighting.

With over 110 elements in the scene exposed for control from the lighting desk, we had to set up the Blueprint structure in a way that streamlined the onsite setup process.

This meant, where possible, setting everything up to be controlled by a value between 0 and 1, remapping much more varied native ranges to respond to this limited input. The intention behind this was to make programming easier: Alex would not need to refer to an endless list of suggested minimum and maximum values, and everything across the project could remain consistent.
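A hedged sketch of that convention is below; the helper name and example ranges are illustrative rather than from the project.

```cpp
// Illustrative sketch of the 0-1 convention: every exposed parameter takes a
// normalised value from the desk and remaps it to its own native range, so
// the programmer never needs a list of per-parameter min/max values.
#include "Math/UnrealMathUtility.h"

float RemapFromDesk(float Normalised, float NativeMin, float NativeMax)
{
    // Clamp first so out-of-range desk data can never push a parameter
    // beyond its safe limits, then lerp into the parameter's real units.
    const float Clamped = FMath::Clamp(Normalised, 0.0f, 1.0f);
    return FMath::Lerp(NativeMin, NativeMax, Clamped);
}

// Example: the same 0-1 input drives very different native ranges.
// RemapFromDesk(0.5f, 0.0f, 20000.0f)  -> light intensity of 10000
// RemapFromDesk(0.5f, -180.0f, 180.0f) -> sign rotation of 0 degrees
```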


The Unreal project was worked on by multiple people at FRAY along with the team at 4Wall, so a simple, organised and well-labelled Blueprint layout was key to making the process as open and transparent as possible.




With the gig happening in Brooklyn and the FRAY team being in the UK, we used Plastic SCM to work on the project collaboratively. This meant that 4Wall could check out elements of the project in New York whilst we checked out and worked on other elements back in the UK. Plastic would then merge all of the changes as each person checked their assets back in.


