Posts filed under Film

Passengers

Several years ago, I created my own video playback software, which has been heavily used on many of the movies my colleagues and I have worked on. While that software is still quite capable, as someone who has always been interested in new technologies, I’m always looking for new ways to push the boundaries of what we can do. Although I’ve been closely monitoring the progression of the Unity 3D game engine since version one, it wasn’t until fairly recently that the feature set advanced to the point of being a viable platform upon which I could create new playback software. Between the plugins available on the Unity Asset Store and custom ones I’ve built myself, I’ve been excited to be able to create ways of bringing the interactivity of on-set graphics to a whole new level.


When I was asked to do Sony’s futuristic sci-fi movie, Passengers, it seemed like a great opportunity to put what I’d been doing in Unity through its paces. Great actors such as Chris Pratt can make nearly anything they do believable, even if all they’re interacting with is a green screen, but in my opinion, the more realistic and interactive the technology they’re performing with on set, the easier it is for them to stay fully immersed in their character’s world. That’s why it was a big deal to me to overcome the challenges we were presented with on Passengers’ cafeteria set, where we had to seamlessly tie the actions of a tablet floating on glass to the 4K TV directly behind it. Traditionally, a scene like this would have been triggered remotely, which requires choreographing the movements of the actor’s hands and the order in which they press the buttons on screen. It wouldn’t work well to have the actor pressing the left side of the screen while I’m triggering a button on the right.


To eliminate the need for any graphics puppeteering or choreography, I decided to use OSC (Open Sound Control) to send network commands from the tablet to control the computer playing the graphic on the TV screen. That way when Chris would interact with the tablet on set the graphics on the larger monitor would react automatically and, instead of having to remember any kind of choreography, allow him to focus on his character’s predicament of having minimal access to the food dispenser.
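To make that mechanism concrete: an OSC message is just a small binary blob, usually sent over UDP. The following is a minimal, illustrative encoder written from the OSC 1.0 specification, not the actual software used on set; the address and values are made-up examples.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message supporting int, float, and string arguments."""
    typetags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, int):
            typetags += "i"
            payload += struct.pack(">i", arg)  # 32-bit big-endian int
        elif isinstance(arg, float):
            typetags += "f"
            payload += struct.pack(">f", arg)  # 32-bit big-endian float
        elif isinstance(arg, str):
            typetags += "s"
            payload += _pad(arg.encode())      # padded UTF-8 string
        else:
            raise TypeError(f"unsupported OSC type: {type(arg)}")
    return _pad(address.encode()) + _pad(typetags.encode()) + payload

# On set, a tablet tap would then fire something like this at the playback
# machine (the address and port here are invented for illustration):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/dispenser/button", 3), ("10.0.0.20", 9000))
```

Because the message is a single small UDP packet, the TV graphics react within a frame or two of the tap, which is what makes the interaction feel like one continuous piece of technology.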

In the hibernation bay, there were twelve pods with four tablets per pod, plus backups, which meant more than fifty tablets that needed to display vital signs for whichever character was in the pod.

Since the extras were constantly shifting between pods, we had to have a way to quickly select the right name and information for that passenger. This was done by building a database of passenger information that could be accessed via a drop-down list on each tablet which let us reconfigure the room in just a few minutes.
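As a sketch of that idea (the records and field names here are invented for illustration, not the production data), the lookup behind each drop-down can be as simple as:

```python
import json

# Hypothetical passenger records; the field layout is an assumption.
PASSENGER_DB = json.loads("""
[
  {"name": "Aurora Lane", "pod": "H-07", "heart_rate": 52, "status": "HIBERNATING"},
  {"name": "Jim Preston", "pod": "H-12", "heart_rate": 55, "status": "HIBERNATING"}
]
""")

def dropdown_names(db):
    """Names offered in each tablet's drop-down, sorted for quick scanning."""
    return sorted(record["name"] for record in db)

def vitals_for(db, name):
    """The record a pod tablet displays once a name is selected."""
    return next(record for record in db if record["name"] == name)
```

With every tablet reading from the same database, moving an extra to a different pod is just a matter of picking a new name from the list.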

Since Passengers takes place in a high-tech spaceship, tablets were embedded in the walls throughout the corridors and rooms. Everything from door panels to elevator buttons and each room’s environmental controls were displayed on touch screens.

Because of how the set was constructed, many of the tablets were inaccessible once mounted in place. We’d start by loading the required content before the tablets were mounted, but we also needed a way to make modifications through the device itself if changes were necessary.

The beauty of using a game engine is that it renders graphics in real time. Whenever color, text, sizing, positioning or speed needed to be changed, it could be done quickly and remotely either by using a game controller or interactively on the tablet itself. This kept us from causing the kind of delays in the shooting schedule that would have resulted if we’d had to rip devices out of the walls every time a change was made.
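One way to picture that workflow (a simplified sketch; the parameter names and command format are illustrative, not the actual tool): a remote command arrives as text and is applied to the live render parameters, which the engine picks up on the next frame.

```python
# Live render parameters a tablet scene might expose; names are illustrative.
DEFAULTS = {"color": "#00FF88", "font_size": 24, "speed": 1.0, "x": 0, "y": 0}

def apply_tweak(params, command):
    """Apply a 'name value' text command, coercing the value to the
    parameter's existing type so a float stays a float and an int an int."""
    name, value = command.split(maxsplit=1)
    if name not in params:
        raise KeyError(f"unknown parameter: {name}")
    params[name] = type(params[name])(value)
    return params
```

Since the engine re-reads these values every frame, a change shows up immediately: no re-export, no re-mount, no delay to the shooting schedule.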

I learned a lot about Unity’s capabilities on this movie and I’m excited to continue exploring the boundaries of using game engines in ways they weren’t necessarily built for. This experience has allowed me to refine and expand upon what I thought was possible and I’m excited to use even more advanced versions of my playback software on future projects. That said, without amazing graphics to play back, the best software in the world still won’t get the job done. Chris Kieffer, Coplin LeBleu and Sal Palacios at Warner Bros. produce graphics that are second to none and I’m always grateful for the times I get to work alongside them. 

To read more, here is a link to my article in Local 695 Production Sound & Video Magazine.

Passengers photos courtesy of Columbia Pictures

Posted on January 7, 2017 and filed under Film, Video Playback.

Interstellar

When working on Interstellar, our team was brought in early, during pre-production, to work with the art department on designs for the screens in the various ships. We were tasked with creating a very utilitarian, functional, NASA-style design. After much research into actual space shuttle control screens, we were able to use the real-life examples as a starting point.

Graphic designers Chris Kieffer, Coplin LeBleu, and Sal Palacios did a great job finding a visual middle ground between what would be futuristic space travel to the audience yet antiquated to the crew of the Endurance.

Although I was also involved in the design process, wrangling all of the various content and transforming it into interactive media that could be controlled seamlessly whether it appeared on monitors, tablets, or laptop computers was my specialty.

This film called for me to develop special software to remotely control an iPad mini which was built into a prop featuring fake buttons that needed to respond to Anne Hathaway's interaction.

The TARS and CASE robots each featured two additional iPad minis that also needed to be controlled remotely via the same software. Although I have since created more flexible tools for controlling devices, this was the first time we were able to use other iOS products to remotely trigger the iPads across a wireless network.

The crew of Interstellar was among the most talented group of people I've ever had the pleasure of working with and I feel very fortunate to have been a part of making this film.

Posted on February 5, 2016 and filed under Film, Video Playback.

Man of Steel

The Daily Planet set for Man of Steel needed to look and feel like a lived-in, functional newspaper office. We needed an easy way for our team on set in Chicago to build multiple desktop computer screens for all the monitors on set.

Because of clearance issues, we couldn't use any actual existing operating system. Chris Kieffer designed a custom UI for our fake OS, and I built a custom application for the film that allowed us to create multiple desktop layouts. Using sets of PNG files, we could simply select a background and an icon set, add and position windows, and save a different layout on each on-set computer.
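The saved layouts can be pictured as something like this (a sketch; the real application's file format and field names aren't public, so these are assumptions):

```python
import json

def save_layout(path, background, icon_set, windows):
    """Write one computer's desktop layout: background PNG, icon set,
    and a list of positioned window PNGs."""
    layout = {"background": background, "icon_set": icon_set, "windows": windows}
    with open(path, "w") as f:
        json.dump(layout, f, indent=2)

def load_layout(path):
    """Read a layout back so the same desktop can be restored on set."""
    with open(path) as f:
        return json.load(f)
```

Keeping each monitor's desktop as a small data file means the set dressing for fifty screens becomes a folder of layouts rather than fifty hand-built machines.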

I was also on the production in the California unit as a playback operator for the green screen in the Tibetan tent scene. Since it was an old CRT TV, I needed to synchronize its refresh rate to the 24-frame film camera, or there would be a rolling bar in the shot, making it harder for the VFX team to composite in post.
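A rough way to quantify the problem (a simplified model, not the actual sync hardware's math): the roll bar drifts at the difference between the TV's refresh rate and the nearest whole multiple of the camera's frame rate, so the fix is to lock the refresh to an exact multiple of 24.

```python
def bar_drift_hz(refresh_hz, camera_fps):
    """How fast a roll bar sweeps through frame, in screens per second,
    under a simple beat-frequency model: the distance of the refresh
    rate from the nearest whole multiple of the camera rate."""
    ratio = refresh_hz / camera_fps
    return abs(ratio - round(ratio)) * camera_fps
```

At a genlocked 48 Hz or 72 Hz the drift is zero and the bar disappears; at a consumer TV's ~60 Hz against 24 fps, the bar races through the frame.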

Because of the enormous amount of post-production VFX that needed to happen on Man of Steel, I was asked to help out with the monitor replacement in the Daily Planet scenes.

(Before & after comparison images)

I used Nuke for the tracking and composites, and color matched in DaVinci Resolve for editorial screenings.

(Before & after comparison images)

Posted on October 1, 2015 and filed under Visual FX, Video Playback, Film.

The Incredible Burt Wonderstone

On The Incredible Burt Wonderstone, I got involved in post production as a compositor for the TV screen replacement shots.

(Before & after comparison images)

Using Nuke for the composites, I also needed to replace the channel number on some of the shots so it could change within the shot.

I also needed to rebuild the news content for 4:3 aspect, since the original content was widescreen 16:9.
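The reframe starts from a simple crop calculation; here is a sketch assuming a full-height center crop from 16:9 down to 4:3 (the actual rebuild also involved redesigning elements, not just cropping):

```python
def center_crop_4x3(width, height):
    """Full-height center crop of a 16:9 frame down to 4:3: returns the
    new width and the left offset of the crop window."""
    new_width = height * 4 // 3
    x_offset = (width - new_width) // 2
    return new_width, x_offset
```

A 1920x1080 plate becomes a 1440-pixel-wide 4:3 frame, discarding 240 pixels on each side, which is why graphics anchored near the edges had to be rebuilt rather than simply cropped.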

(Before & after comparison images)

In the smoke-covered stage shots, I ended up using Mocha Pro to track the shots; its planar tracker gave me better results. I then brought the tracking information into Nuke for the final composites.

Posted on May 10, 2015 and filed under Film, Visual FX.

Alex Cross

In the film Alex Cross, most of the playback on set was green screen, to be replaced in post production. I designed and animated the graphics for the interactive mobile device Alex Cross uses to disable the police department's security system.

I was also brought on to do the VFX replacement composites of the green screen computer monitors in post production.

Using tracking markers in the green screen playback files on set made tracking the shots easier.

(Before & after comparison images)

I moved to Nuke for compositing the screens on this film because of its speed and powerful compositing tools, and used DaVinci Resolve to color match the outputs for editorial's screenings.

(Before & after comparison images)

Posted on February 27, 2015 and filed under Film, Video Playback, Visual FX.