The footage shown in this reel has been selected from a few of my most recent projects. These sequences were created for 8K IMAX 15/70 film as well as IMAX 3D digital laser, dome and television releases.
I have included a text overlay to highlight any aspects of these shots that were completed by others. Unless stated below, the work involved in the creation of these sequences, from concept and scene development to compositing and post-production, is my own.
Between 2012 and 2018 I managed the development of CGI content for giant screen productions. This involved leading a team of animators through the creation of stereographic sequences for 8K IMAX 15/70 film, 3D digital laser and dome formats. This content has also been mastered for traditional theatres, as well as television and Netflix release. My primary focus during this period was delivering 20-25 minutes of rendered stereo content for each film. I was also responsible for CGI scheduling and documentation.
From 2004 to 2018 my base of operations was the Center for Astrophysics and Supercomputing. Here I was involved with the day-to-day management of 3 generations of render farm used across 3 feature-length IMAX productions and 9 short films.
My role extended to offline post-production supervisor for The Search for Life in Space and The Story of Earth, working on the stereoscopic post-processing of live-action content and the conforming of 3D and 2D cinema deliverables. This involved alignment, colour matching, shot clean-up and colour correction. My duties also included 3D tracking, VFX creation and compositing, sky extensions, object removal, stabilisation and final project conforming for deliverables. I have been heavily involved in developing and managing a pipeline for both traditional film and current-generation Laser formats.
A December Media film produced in association with Biopixel, Soundfirm, Film Victoria and Screen Queensland. Narrated by Eric Bana. Distributed by MacGillivray Freeman Films.
The Story of Earth is a documentary film for giant screen and IMAX theatres directed by Russell Scott and written by Russell Scott and Wain Fimeri. Narrated by Rachel Ward, the film centres on how contemporary geology has led to a new understanding of how life on Earth came to be.
The Search for Life in Space is a December Media film produced in association with Film Victoria Australia and Swinburne University of Technology. Narrated by Malcolm McDowell, the film is distributed by MacGillivray Freeman Films.
Narrated by Golden Globe winner Miranda Richardson, Hidden Universe is a December Cinema Productions film produced in association with Film Victoria, Swinburne University of Technology, and the European Southern Observatory (ESO). Executive produced by Emmy Award-winning producer Tony Wright, George Adams and James Vernon. Produced in association with MacGillivray Freeman Films. Produced by Stephen Amezdroz and written and directed by Russell Scott.
Telescope is a stereoscopic film detailing the evolution of astronomy. Narrated by Dr. Alan Duffy and produced by the Swinburne Centre for Astrophysics and Supercomputing.
Mars brings Martian data to life for the ACMI Star Voyager film exhibition. Produced by the Swinburne Centre for Astrophysics and Supercomputing.
At the conclusion of my first year of university I was fortunate to receive a placement in The Centre for Astrophysics and Supercomputing's Vacation Scholarship program, which exposed animation and filmmaking students to practical production experience. As I completed my studies, I continued to work as a Virtual Reality Animator on a range of projects. In 2010 I took on a management position directing several internal and external client projects, as well as working with students participating in various university and high-school work experience programs.
Bachelor's Degree in Multimedia (Major: Media Studies) - 4 years
(3D & 2D Digital Animation, Film Production, Motion Graphics, Creative Writing, Web Development, Media Theory)
Swinburne University of Technology
Recently Completed Training / Courses:
Applied Houdini - Rigids I - Fundamentals
Applied Houdini - Rigids II - Structure Destruction
Applied Houdini - Particles I - Fundamentals
Applied Houdini - Particles II - Velocity Fields
Applied Houdini - Volumes I - Fundamentals
Applied Houdini - Volumes II - Simulation
Applied Houdini - Volumes III - Rendering
Applied Houdini - Volumes IV - Interaction
Applied Houdini - Volumes V - Combustion Simulation
Applied Houdini - Volumes VI - Combustion Rendering
Houdini Fast Track Vol 1 - Fundamentals
Houdini Essential Training
Houdini Intermediate Ocean FX
Learning Maya After Knowing 3ds Max
Houdini has replaced most add-on plugins that extended my dynamics toolset in other packages. I have primarily focused on Houdini's pyroclastic simulation, particle advection, fields and rigid body destruction. As a generalist, I appreciate the power of parametric modelling as well as the extended direct modelling toolset incorporated into Houdini 16.5. I have primarily been rendering with Mantra, but utilize Redshift GPU rendering for a fast, iterative workflow when possible. I have ported several scripts that simplified the stereo pipeline over to Houdini as Python shelf tools and digital assets. I currently lean towards completing destructive direct modelling tasks in Maya.
For most complex direct modelling and UV unwrapping tasks I currently focus on using Maya. Whilst I maintain a preference for 3ds Max's modifier stack approach to certain tasks, the speed afforded by Maya's marking menu system, as well as recent updates to its modelling and UV tools, has proven valuable. I primarily use Redshift in Maya for rendering.
I have been using Max since 2003. It has been the primary 3D package used in the large scale productions I have been involved with between 2004 and 2017. I have worked with several pipeline developers and artists that have come from major Australian based VFX studios. As a result, many pipeline tools, scripts and practices have been incorporated into my workflow over the years. Most notable of these are Render Pass Manager (RPM), VRay, Phoenix FD and Forest Pack Pro. Max has also served as a great platform for developing my own MaxScript tools to automate many of the repetitive or complex tasks unique to stereoscopic film production.
I incorporate ZBrush into my workflow wherever possible as the sculpting experience for concept work or organic asset development is excellent. Houdini and Maya's re-topology tools pair well with my ZBrush workflow. ZBrush has been invaluable for nebula and landscape sculpting, but has also been utilized for creature design and basic polypainting.
Substance Painter has become a very useful tool for creating complex texture sets. The UDIM workflow has improved to a point that large screen digital assets can be textured to a high standard using Substance's 'geometry aware' set of filters. The addition of Sparse Virtual Textures to handle many high resolution UDIM channels has further encouraged me to favour Substance Painter's creative workflow.
Blackmagic Fusion (previously Eyeon Fusion) has been my compositor of choice since 2012. Although now incorporated into Davinci Resolve, the standalone Fusion product continues to serve me well as a fast and clean node based compositor. I have created several Python and Lua scripts to simplify or extend Fusion's native stereo workflow.
I was introduced to Davinci Resolve as a suitable non-linear editor to use for conforming giant screen stereo productions. Resolve has proven to have an excellent stereo workflow toolset across 3 IMAX productions. Its functionality as an editing, colour processing and delivery platform has proven very useful. I continue to use Resolve for delivering projects in many different formats including IMAX Laser, IMAX 15/70 film and DCI spec digital cinema. Across multiple large scale projects I have developed a strong practical understanding of managing colour for CGI and live action footage. This extends to current RED Helium 8K footage processed with an IPP2 or ACES workflow.
With access to large CPU-only render farms provided by the Centre for Astrophysics and Supercomputing, VRay proved itself as a great render package. Working as a freelance artist, I have found Redshift to be a great GPU accelerated alternative. Although I appreciate the native Houdini effects support offered by Mantra, I have found it to be a worthwhile exercise to adapt Houdini scenes for rendering with Redshift.
I transitioned from Mental Ray to VRay (1.4) in 2005. I continued to almost exclusively use VRay until 2018 (3.6). 8K 3D IMAX frames are 8192x6144px per eye at 24fps; as a result, I have become very familiar with the best workflow for optimizing large scenes for efficient render passes. VRay has been the primary render package installed across the render farms I have managed.
After a period using Fume FX and Krakatoa for space-based simulations (volumetric nebula, galaxies etc.) I transitioned to the overhauled version of Phoenix FD (v3.0) and was extremely impressed with the artist friendly workflow and much improved voxel and FLIP-based simulation engines. Phoenix quickly transitioned from a tool intended for a single shot, to being used in some way on almost every sequence. I provided feedback across nightly builds of Phoenix between 2014-2018 and saw several requested workflow enhancements implemented into the plugin.
Adobe Photoshop has been at the core of much of my graphics, painting, texture and storyboard work since 2003.
Adobe After Effects was my compositing package of choice pre-2012 and was used on the IMAX project Hidden Universe. Whilst I now prefer node-based compositing (especially for stereo work), I continue to use After Effects for developing motion graphics and 2D animation. I also find After Effects to be an efficient option for processing frames across non-GPU-equipped render clusters as a command line tool.
The first IMAX project I was involved with, Hidden Universe (2012), was conformed in Premiere Pro. Several short films prior to this were also edited and conformed using Premiere. Whilst I consider myself to be very familiar with the package, I find Davinci Resolve has replaced Premiere where GPU accelerated NLE is an option.
Whilst Muster was the render manager deployed across the render farm at The Center for Astrophysics and Supercomputing, I exclusively use Deadline across my personal render cluster. I have a functional understanding of how to set up and configure a render environment with submission tools for Autodesk, Blackmagic and Adobe products. I also have experience setting up custom sanity checks specific to my current pipeline.
I have managed 3 render farms over 7 years that used Muster as the primary dispatch and management system. Whilst I do not currently choose to use Muster, it did offer a deep level of management. Custom submission scripts helped to streamline a stereo rendering workflow not native to Muster.
I have experience using Vue X-Stream with 3ds Max and Vue as a standalone product. I currently prefer to use Forest Pack Pro in 3ds Max, Houdini Terrain Tools, or Maya MASH networks to develop large instanced landscapes. I consider Vue to be a useful tool for developing static terrain matte paintings and Ozone driven sky plates when HDRI is not suitable.
Productivity Scripts / Tools / Techniques
3ds Max Stereo Settings Calculator
Generates optimal stereo camera settings based on scene & camera settings.
This MaxScript uses your current render settings along with the current active camera view to generate stereo settings based on the desired percentage of screen space. Near and Far guides are generated that represent safe limits for the supplied target stereo strength.
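The tool itself is MaxScript, but the parallax geometry it relies on can be sketched in plain Python. This is an illustration only, assuming a simple off-axis (parallel-convergence) stereo model; the function names and parameters below are my own, not the script's:

```python
import math

def stereo_interaxial(hfov_deg, convergence, far, target_pct):
    """Return the interaxial separation that gives target_pct of the
    screen width as positive (background) parallax at the far guide.

    In an off-axis rig, screen parallax for a point at depth z is
    interaxial * (1 - convergence / z), expressed here as a fraction
    of the screen width at the convergence plane."""
    screen_width = 2.0 * convergence * math.tan(math.radians(hfov_deg) / 2.0)
    return target_pct * screen_width / (1.0 - convergence / far)

def parallax_pct(interaxial, hfov_deg, convergence, z):
    """Screen parallax at depth z as a fraction of screen width."""
    screen_width = 2.0 * convergence * math.tan(math.radians(hfov_deg) / 2.0)
    return interaxial * (1.0 - convergence / z) / screen_width
```

Solving for the interaxial at the far guide, then checking nearer depths against a negative-parallax limit, is what makes the Near and Far safety guides possible.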
Fusion Timestamp AutoBackup
Saves a timestamped copy of your .comp to an AUTOBACKUP folder.
This script will append your .comp name with the current date and time and then store it in your preferred AUTOBACKUP location. The scene is also saved as normal without incrementing your current version, so you can replace the default SAVE with this script to maintain a safe version history with each save.
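The actual tool is a Fusion script, but the backup-naming behaviour can be sketched in standalone Python. The function name and folder layout below are illustrative assumptions:

```python
import os
import shutil
from datetime import datetime

def backup_comp(comp_path, backup_dir="AUTOBACKUP"):
    """Copy comp_path into backup_dir with a timestamp appended,
    leaving the original file name and version untouched."""
    root, ext = os.path.splitext(os.path.basename(comp_path))
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    os.makedirs(backup_dir, exist_ok=True)
    dest = os.path.join(backup_dir, f"{root}_{stamp}{ext}")
    shutil.copy2(comp_path, dest)
    return dest
```

Because each save produces a uniquely timestamped copy, the backup folder accumulates a complete history without ever touching the working file's version number.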
Fusion Increment File Name & Sync Saver Version
Makes sure an incremented file version is synced with your current output version.
Similar to the Incremental Save, this script also syncs the version of any saver output paths. Useful if your workflow requires the version of a composition to be matched with the version of an export.
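The version-token handling at the heart of such a script can be sketched in plain Python. A minimal illustration, assuming a `vNNN` naming convention (the function names are my own):

```python
import re

_VERSION = re.compile(r"[vV](\d+)")

def bump_version(name):
    """Increment the first vNNN token in a name, preserving padding."""
    def repl(m):
        num = m.group(1)
        return m.group(0)[0] + str(int(num) + 1).zfill(len(num))
    return _VERSION.sub(repl, name, count=1)

def sync_saver_path(saver_path, comp_name):
    """Replace every vNNN token in a saver output path with the
    version found in the comp name, keeping renders matched."""
    m = _VERSION.search(comp_name)
    if not m:
        return saver_path
    return _VERSION.sub(m.group(0), saver_path)
```

Replacing every token in the saver path matters because output versions typically appear in both the folder and the file name.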
Fusion Toggle Combiners or Splitters
Enables or disables all splitters or combiners depending on their current state.
Combiner and Splitter nodes can be used to control the flow of stereo pairs through a composition. Being able to toggle these en masse allows you to develop a stereo composition using the left eye only and then quickly toggle the flow to process both eyes.
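The real script drives Fusion's scripting API; the toggle decision itself can be sketched standalone, with nodes modelled here as plain dicts (an illustrative assumption, in place of setting each tool's pass-through attribute):

```python
def toggle_stereo_nodes(nodes):
    """Toggle all Combiner/Splitter nodes together.

    If any matching node is currently active, pass all of them
    through (left-eye-only preview); otherwise activate them all
    (full stereo processing). Mixed states collapse to one state."""
    targets = [n for n in nodes if n["type"] in ("Combiner", "Splitter")]
    any_active = any(not n["pass_through"] for n in targets)
    for n in targets:
        n["pass_through"] = any_active
    return targets
```

Deciding from the aggregate state, rather than flipping each node individually, is what keeps a mixed composition from ending up half-enabled.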
Fusion Refresh Loaders
Reloads the source material for a selected loader.
As new renders come in, loaders often need to be refreshed to extend their in | out points. If durations don't change, frames may remain cached even when they no longer exist. Select the loader(s) and activate this script to fully refresh the input material.
Fusion Stereo Wiggle
Automatic animated switching of the A | B frame buffer channels.
Every program used for stereographic film production needs a stereo wiggle feature. Load the A frame buffer channel with the LEFT EYE and the B channel with the RIGHT EYE. This script uses the RAM-cached image data to show you an animated 'wiggle' of the stereo pair. A handy tool for visualizing depth in a 2D/mono development environment.
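The switching logic behind a wiggle is simple enough to sketch in a few lines of Python (an illustration; the period and function name are my own assumptions, not the script's):

```python
def wiggle_eye(frame, period=6):
    """Return which buffer to display on a given frame, alternating
    the A (left) and B (right) channels every `period` frames."""
    return "left" if (frame // period) % 2 == 0 else "right"
```

Driving the A|B buffer selection from the playback frame number is all it takes to turn cached stereo pairs into an animated depth preview.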