Passengers

Several years ago, I created my own video playback software, which has been used heavily on many of the movies my colleagues and I have worked on. While that software is still quite capable, as someone who has always been interested in new technologies, I’m always looking for ways to push the boundaries of what we can do. Although I’ve been closely monitoring the progression of the Unity game engine since version one, it wasn’t until fairly recently that its feature set advanced to the point of being a viable platform upon which to build new playback software. Between the plugins available on the Unity Asset Store and the custom ones I’ve built myself, I’ve been excited to bring the interactivity of on-set graphics to a whole new level.


When I was asked to work on Sony’s futuristic sci-fi movie, Passengers, it seemed like a great opportunity to put what I’d been doing in Unity through its paces. Great actors such as Chris Pratt can make nearly anything they do believable, even if all they’re interacting with is a green screen, but in my opinion, the more realistic and interactive we can make the technology they’re performing with on set, the easier it is for them to stay fully immersed in their character’s world. That’s why it was a big deal to me to overcome the challenges we were presented with on Passengers’ cafeteria set. We had to seamlessly tie the actions of a tablet floating on glass to the 4K TV directly behind it. Traditionally, a scene like this would have been triggered remotely, which requires choreographing the actor’s hand movements and the order in which they press the buttons on screen. It wouldn’t work well to have the actor pressing the left side of the screen while I’m triggering a button on the right.


To eliminate the need for any graphics puppeteering or choreography, I decided to use OSC (Open Sound Control) to send network commands from the tablet to the computer playing the graphics on the TV screen. That way, when Chris interacted with the tablet on set, the graphics on the larger monitor reacted automatically, allowing him to focus on his character’s predicament of having minimal access to the food dispenser instead of having to remember any kind of choreography.
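
As a rough sketch of how that kind of link can work (not the production setup itself), one device sends an OSC message over the network and the playback machine maps the OSC address to a handler. The example below uses the python-osc library; the "/cafeteria/order" address and port 9000 are placeholders I’ve made up for illustration.

```python
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

PLAYBACK_PORT = 9000  # placeholder port, not from the actual production


# --- Tablet side: fire a network cue when a button is touched ---
def send_order(playback_ip: str, item_id: int) -> None:
    client = SimpleUDPClient(playback_ip, PLAYBACK_PORT)
    client.send_message("/cafeteria/order", item_id)


# --- Playback side: the machine driving the 4K TV reacts automatically ---
def on_order(address: str, *args) -> None:
    # In practice this would cue the matching graphic; here we just log it.
    print(f"{address} received, cueing graphic for item {args[0]}")


if __name__ == "__main__":
    dispatcher = Dispatcher()
    dispatcher.map("/cafeteria/order", on_order)
    server = BlockingOSCUDPServer(("0.0.0.0", PLAYBACK_PORT), dispatcher)
    server.serve_forever()  # listen for cues coming from the tablet
```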

In the hibernation bay, there were twelve pods with four tablets per pod, plus backups, which meant more than fifty tablets that needed to display vital signs for whichever character was in each pod.

Since the extras were constantly shifting between pods, we needed a way to quickly select the right name and information for each passenger. This was done by building a database of passenger information that could be accessed via a drop-down list on each tablet, which let us reconfigure the room in just a few minutes.
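
As a minimal illustration of the idea (not the actual production database), the roster could be as simple as a small JSON document keyed by passenger name, with each tablet building its drop-down from the keys; the pod numbers, fields, and values below are placeholders.

```python
import json

# Hypothetical roster; the real production database held far more detail.
ROSTER_JSON = """
{
  "Jim Preston":  {"pod": "H-07", "heart_rate": 48, "status": "HIBERNATING"},
  "Aurora Lane":  {"pod": "H-12", "heart_rate": 52, "status": "HIBERNATING"}
}
"""


def load_roster(raw: str) -> dict:
    """Parse the roster so each tablet can populate its drop-down from the keys."""
    return json.loads(raw)


def vitals_for(roster: dict, passenger: str) -> dict:
    """Return the display data for whichever passenger was selected."""
    return roster[passenger]


if __name__ == "__main__":
    roster = load_roster(ROSTER_JSON)
    print(sorted(roster))                     # what the drop-down would list
    print(vitals_for(roster, "Jim Preston"))  # data shown on that pod's tablet
```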

Since Passengers takes place on a high-tech spaceship, tablets were embedded in the walls throughout the corridors and rooms. Everything from door panels to elevator buttons to each room’s environmental controls was displayed on touch screens.

Because of how the set was constructed, many of the tablets were inaccessible once mounted in place. We’d start by loading the required content before the tablets were mounted, but we also needed a way to make modifications through the device itself if changes were necessary.

The beauty of using a game engine is that it renders graphics in real time. Whenever color, text, sizing, positioning, or speed needed to be changed, the change could be made quickly and remotely, either with a game controller or interactively on the tablet itself. This kept us from causing the kind of delays in the shooting schedule that would have resulted if we’d had to rip devices out of the walls every time a change was made.

I learned a lot about Unity’s capabilities on this movie and I’m excited to continue exploring the boundaries of using game engines in ways they weren’t necessarily built for. This experience allowed me to refine and expand what I thought was possible, and I look forward to using even more advanced versions of my playback software on future projects. That said, without amazing graphics to play back, the best software in the world still won’t get the job done. Chris Kieffer, Coplin LeBleu, and Sal Palacios at Warner Bros. produce graphics that are second to none, and I’m always grateful for the times I get to work alongside them.

To read more, here is a link to my article in Local 695 Production Sound & Video Magazine.

Passengers photos courtesy of Columbia Pictures

Posted on January 7, 2017 and filed under Film, Video Playback.

Passengers Official Trailer

Here is the official trailer for Passengers, the feature I was working on in Georgia last year. It is always a pleasure working with Rick Whitfield (Video Playback Supervisor) on shows where we get the opportunity to push the envelope with technology. This was the first project where I was able to use the prototype of the new playback software I developed in the Unity game engine. The talented graphics team at Warner Bros. Production Sound & Video Services (Chris Kieffer, Coplin LeBleu, and Sal Palacios) created amazing content and built us a great library of elements that made it possible to produce graphics rendered on set in real time. Thanks again guys, you all rock!

Posted on September 20, 2016.

Interstellar

When working on Interstellar, our team was brought in early, during pre-production, to work with the art department on designs for the screens in the various ships. We were tasked with creating a very utilitarian, functional, NASA-style design. After much research into actual space shuttle control screens, we used those real-life examples as a starting point.

Graphic designers Chris Kieffer, Coplin LeBleu, and Sal Palacios did a great job finding a visual middle ground: screens that read as futuristic space travel to the audience yet feel antiquated to the crew of the Endurance.

Although I was also involved in the design process, my specialty was wrangling all of the various content and transforming it into interactive media that could be controlled seamlessly, whether it appeared on monitors, tablets, or laptop computers.

This film called for me to develop special software to remotely control an iPad mini built into a prop with fake buttons that needed to respond to Anne Hathaway’s interaction.

The TARS and CASE robots each featured two additional iPad minis that also needed to be controlled remotely via the same software. Although I have since created more flexible tools for controlling devices, this was the first time we were able to use other iOS products to remotely trigger the iPads across a wireless network.

The crew of Interstellar was among the most talented group of people I've ever had the pleasure of working with and I feel very fortunate to have been a part of making this film.

Posted on February 5, 2016 and filed under Film, Video Playback.

Welcome!

Welcome to my portfolio and blog! I’m currently deep in the trenches working on a feature film for Sony here at Pinewood Studios in Atlanta, GA. It’s been a busy five months, and I can’t say I’ll be sad to head home to L.A. soon, but I’ve had a thrilling time working on new ways of using the Unity game engine to program various interactive and triggerable displays. I’m looking forward to discussing a bit of that, as well as sharing some of the other knowledge, tips, and cool tech I’ve come across or developed in the entertainment industry.

In my 15+ years of on-set interactive computer graphics, I have had to come up with creative solutions to last-minute changes and requests, sometimes with as little as 15 minutes before a shot. If anything I’ve learned over the years, or any of the wacky outside-the-box ideas I post here in the coming months, helps anyone even a little, then this website was all worth it.

 

Posted on December 22, 2015.