How many hours have we all spent waiting for things to render? Watching frames slowly count upwards. It is like watching digital paint dry. We decided to try something different. Over six parts, we are going to look at what happened when we decided to break up with After Effects and embrace a new real-time workflow for ‘Back To The Future The Musical.’
THE WAITING GAME
2D assets created in Notch.
We’ve been using After Effects since we started working with video. It has always been there. It is the Swiss army knife of the video designer. We never considered there to be an alternative and, to be honest, had not sought one until recently. We had our set of plugins, they did what they did, and the work started to look a little too much the same. We had become complacent.
We were too used to telling a director or fellow designer that it would be tomorrow before we could have that new idea rendered. We were very used to not being able to preview our work in real-time, despite After Effects' best efforts. We learned how to use Deadline, lashing a lot of computers together into render farms to speed things up, which helped a little, but it was fraught with annoying problems.
And this was our professional world: always watching digital paint dry while trying to make exciting things happen on stage. The piece of software at the core of our workflow was not up to the job. If it had been a media server, it would have been out the door and a new one found. But we never asked the same question of After Effects. It was the status quo. And everyone at FRAY was very bored with the status quo.
IN STEPS NOTCH
2.5D preset created in Notch. Used as a live block to endlessly generate in random sequences, as well as using system data to drive the current time, date, location, etc.
We have been increasingly using Notch for creating generative content within media servers such as Disguise but had not used it a great deal for making stand-alone video content. Our growing disillusionment with After Effects coincided with us speaking at a conference where Matt Swoboda of Notch was also presenting. His message was simple: Notch is a tool designed for live video and After Effects is not, so why use it? Initially, this sounded like heresy, but the more we thought about the idea, the more it made total sense. There had to be something better out there.
The thought of breaking up with After Effects, like ending any long-term relationship, was a scary one, but we knew it was time to move on. We had a large project in the shape of ‘Back to the Future the Musical’ on the horizon, on which we already knew we would be using some generative content, so why not go the whole way? We set ourselves a rule: we would start and finish everything in Notch. We would only resort to After Effects or Cinema4D renders if we hit a total roadblock.
Overhauling our tried and tested studio workflow was an exciting challenge. How do we structure projects? How do you do something in Notch that you do automatically in AE? How do you make your work accessible to others? Is it rendered or real-time? And many other questions.
RENDER OR REAL-TIME?
2.5D asset created in Notch. Run as a live block and entirely controlled through a single exposed parameter.
First and foremost, the rendered-or-real-time question needed answering, as the approach you take to making the content is very different. We decided on a few basic principles to guide us: if the video content needed to endlessly generate something, like a sky, it would be a block; variables that would change day by day would be a block; and moments where animation needed to seamlessly respond to events on stage not set by timecode would be a block. If the moment was locked to timecode or did not need to do anything responsive, it would be rendered.
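The rules above can be sketched as a simple decision function. This is purely illustrative; the names and flags are our own shorthand, not anything from the Notch or Disguise APIs:

```python
def playback_mode(endlessly_generative: bool,
                  changes_day_to_day: bool,
                  responds_to_stage: bool) -> str:
    """Classify a cue as a real-time Notch block or a pre-rendered clip,
    following the principles described above (illustrative only)."""
    if endlessly_generative or changes_day_to_day or responds_to_stage:
        return "block"   # run live in the media server
    return "render"      # locked to timecode / non-responsive: bake to video

# An endlessly generating sky:
print(playback_mode(True, False, False))   # block
# A sequence locked to timecode:
print(playback_mode(False, False, False))  # render
```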
Building a block for real-time playback in the media server throws up a lot of challenges from both the Notch and the media server perspective. We spent a lot of time looking at the performance tab, juggling between one large, complex node and several smaller, simpler nodes to find the correct balance of performance while maintaining a vibrant and exciting look. Furthermore, the generative content might be mixed with rendered content while the media server is processing automation data, DMX tables and any of the myriad of things they can do, all meaning even less power to deliver your blocks.
Once you have your blocks balanced for performance, the number of blocks in our Disguise project also needed careful consideration. More blocks mean longer load times for your project, and complex blocks add to this time. Ultimately, it is just a computer processing the block; Disguise or any other media server will crash at some point. You will have to restart, and if you have several servers, as we did, you could be looking at upwards of 15 minutes to reboot the whole system. The balance of what we were gaining on stage from running generative content versus the impact on our system needed constant consideration. It is easy to think generative content is a silver bullet and you will never need to wait for a computer again, but this is not the case. A computer can only be taken so far before it falls over.
To balance the impact on our system, we rendered a lot of content out of Notch. And we rendered it so quickly! Not always in real-time, but compared to packaging a project up, sending it to a render farm in South Korea, waiting for it to render, downloading it on a dodgy internet connection only to find a randomly flickering light and starting the process again, it was like moving at light speed. And even with everything in a project turned on, if it didn't play in 100% real-time, it was still doing far better than After Effects ever did. Even if we had to wait 15 minutes for a sequence to render, we were saving a ton of time. And once a 3D scene was rendered, it was ready for the stage; it didn't need a final pass through After Effects to polish it up. It was a small 3D revolution.
Left - Development still rendered using Cinema4D Physical Render, each frame took on average 265 seconds to render. Right - Refined production still rendered using Notch, each frame took on average 6 seconds to render. On the same machine, to render 600 frames Cinema4D would have taken 44 hours to complete the render, Notch took less than 1 hour to complete.
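The figures in the caption above work out as a straightforward back-of-the-envelope calculation:

```python
frames = 600
c4d_secs_per_frame = 265   # Cinema4D Physical Render average, from the caption
notch_secs_per_frame = 6   # Notch average, same machine

c4d_hours = frames * c4d_secs_per_frame / 3600
notch_hours = frames * notch_secs_per_frame / 3600

print(f"Cinema4D: {c4d_hours:.1f} hours")   # ~44 hours
print(f"Notch:    {notch_hours:.1f} hours") # ~1 hour
```

A roughly 44x speed-up per sequence, before even counting the time lost to transferring files to and from an external render farm.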
This is not to say adapting our workflow was flawless, with no moments of frustration. There was also a very steep learning curve for the studio: 40 years of collective Adobe experience and logic needed to be unlearned and new logic ingrained in our brains. For the 3D workflow, it was a marked improvement, but for 2D and 2.5D it was more challenging, as you are working in an environment designed for 3D. Trying to create 2D composited sequences could feel like trying to push the software to do something it didn’t want to do. You end up with workarounds and would occasionally find yourself wondering if it would just be easier to fire up After Effects.
Notch is also not great at editing a sequence together, in terms of laying out a series of pre-compositions on a timeline. The timeline is not as clean and intuitive as After Effects', but After Effects has had 20+ years to perfect this, so we could forgive the shortcomings. In the past few releases of Notch you can see the software progressing rapidly. What has actually changed in After Effects recently? Very little of note. The major advantage which outweighs all the frustrations? The near-instant render times. Hours of productivity regained: as you no longer had to wait an eternity to see the result, it was all time you could spend creatively, for the benefit of the show.
TIME TO MOVE ON
Changing our workflow on this project permanently turned a light bulb on for us. Yes, Notch is not perfect, but neither are the tools we are all used to. In the end, we landed at around 75-80% of the show being made within Notch. Lack of long-term experience and familiarity partly accounts for our falling short of 100%, but there are also some limitations we just couldn’t overcome. We are OK with this; it was an experiment that had very positive results. We achieved far more than we had hoped and genuinely felt liberated from the old way of working. We made a significant step towards a faster and more intuitive workflow, one that allows creativity to come to the forefront and not be held back by technology.
This is a brave new world of real-time workflow, and while it has come on leaps and bounds in recent times, it is far from a finished product. There is tremendous promise there, and as a creative community, we need to embrace these tools and help develop them together to make software truly fitting for the live-video community.
Director: John Rando
System Design: Ammonite - Jonathon Lyle
Choreographer: Chris Bailey
Programmer: Emily Malone
Set Design: Tim Hatley
Video Engineer: James Craxton
Lighting: Hugh Vanstone & Tim Lutkin
Principal Animators: Adam Young, Norvydas Genys
Video Design: Finn Ross
Animation: Henrique Ghersi
Sound Design: Gareth Owen
Animation development: Laura Perret
Video Assistant: Kira O’Brien
Video Technicians: Neil McDowell Smith, Matt Somerville
Video No1: Ollie Hancock
Video No2: Piers Illing