Useful Unity Asset Store Plugins for Playback

One of the deciding factors in my choice to move to Unity for feature film playback work was the seemingly endless supply of plugins available in the Unity Asset Store. Any time I have a technical challenge on set, 9 times out of 10 I can find a tool on the Asset Store to solve it.

Here are three plugins from the Asset Store that stand out in my mind as extremely helpful for building interactive playback UI graphics:


Cinema Director, Cinema Suite Inc

Coming from a motion graphics animation background using tools like Adobe After Effects, I started looking for a timeline-based animation tool. Cinema Director ended up being my timeline tool of choice. I love that, besides animating standard values like position, rotation, and scale, you can also call functions at points along the timeline.

Food dispenser UI in the movie Passengers animated in Unity with Cinema Director

Because they have great video tutorials and documentation for their API, I was able to fairly quickly integrate custom code that allowed me to jump around the timeline in a non-linear fashion. I have yet to dive into their full Cinema Suite, but I can see where I could have use for Cinema Pro Cams and Cinema Themes in the near future.


Sprite Factory, Guavaman Enterprises

When jumping into a game engine like Unity for motion graphics work, one of the big hurdles is playing animated image sequences. Most playback graphics workflows involve rendering animations as video files or image sequences to be played interactively, so having to build sprite sheets and sprite animations is a foreign concept.

Sprite Factory used to create sprite animation in the pod screens in the movie Passengers

Sprite Factory gives you an easy interface for importing and converting PNG sequences into sprite animations that you can easily control.
 


Text Mesh Pro, Stephan Bouchard

So much of playback graphics is about helping to tell the story through text displayed on computer screens. Whether that's a hacker getting an Access Denied screen or a teenager getting a break-up text from her boyfriend, text is very important in playback, and its quality becomes critical when the director wants to shoot a close-up of the screen.

Using Text Mesh Pro in animated screen graphics for the movie Passengers

If you want sharp, scalable text that not only looks amazing but also lets you add textures, outlines, and drop shadows, then Text Mesh Pro is the best tool for the job. It also gives you control at the individual character level.

Glyph editor for adjusting individual glyphs of a font

I was once given a font to use in a graphic that had bad spacing on a couple of characters, and I was able to fix them right in Text Mesh Pro.

(Since the time of this post, TextMesh Pro has been acquired by Unity Technologies and is now included with Unity.)


These are just three of the many tools on the Unity Asset Store that are invaluable to the work I do. When comparing Unity to other engines, the Asset Store may be overlooked by some, yet I believe it to be one of Unity's biggest strengths. By uniting its user base into a cohesive community, Unity gives you access to an enormous amount and variety of talent that can accomplish just about anything you'd ever want to do, no matter what niche of the market your needs fall into.

Click here to read my blog post on my work using Unity on the set of the movie Passengers.

Posted on February 7, 2017 and filed under Development, Motion Graphics.

Passengers

Several years ago, I created my own video playback software, which has been heavily used on many of the movies my colleagues and I have worked on. While that software is still quite capable, as someone who has always been interested in new technologies, I'm always looking for new ways to push the boundaries of what we can do. Although I've been closely monitoring the progression of the Unity 3D game engine since version one, it wasn't until fairly recently that the feature set advanced to the point of being a viable platform upon which I could create new playback software. Between the plugins available on the Unity Asset Store and custom ones I've built myself, I've been excited to be able to create ways of bringing the interactivity of on-set graphics to a whole new level.

When I was asked to do Sony's futuristic sci-fi movie, Passengers, it seemed like a great opportunity to put what I'd been doing in Unity through its paces. While great actors, such as Chris Pratt, can make nearly anything they do believable even if all they're interacting with is a green screen, it is my opinion that the more realistic and interactive we can make the technology they're performing with on set, the easier it is for them to stay fully immersed in their character's world. That is why it was a big deal to me to overcome the challenges we were presented with on Passengers' cafeteria set: we had to seamlessly tie together the actions of a tablet floating on glass with the 4K TV directly behind it. Traditionally, a scene like this would have been triggered remotely, which requires choreographing the movements of the actor's hands and the order in which they press the buttons on screen. It wouldn't work well to have the actor pressing the left side of the screen if I'm triggering a button on the right.

To eliminate the need for any graphics puppeteering or choreography, I decided to use OSC (Open Sound Control) to send network commands from the tablet to the computer playing the graphic on the TV screen. That way, when Chris interacted with the tablet on set, the graphics on the larger monitor reacted automatically, and instead of having to remember any kind of choreography, he could focus on his character's predicament of having minimal access to the food dispenser.
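The core of this setup is easy to sketch: an OSC message is just a UDP packet containing a null-padded address string, a type-tag string, and big-endian arguments. Here is a minimal Python illustration of the idea; the address pattern, IP, and port are hypothetical, not the ones used on set:

```python
import socket
import struct

def osc_message(address, *args):
    """Build a minimal OSC packet: padded address, type tags, big-endian args."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    packet = pad(address.encode("ascii"))
    type_tags = "," + "".join("i" if isinstance(a, int) else "f" for a in args)
    packet += pad(type_tags.encode("ascii"))
    for a in args:
        packet += struct.pack(">i" if isinstance(a, int) else ">f", a)
    return packet

# The tablet would fire a message like this whenever a button is pressed;
# a listener on the TV's computer jumps its graphic to the matching state.
msg = osc_message("/dispenser/select", 3)
try:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(msg, ("192.168.1.50", 9000))  # hypothetical on-set address
except OSError:
    pass  # no network on this machine; on set this travels over the stage LAN
```

Because OSC rides on plain UDP, the tablet and the playback computer only need to share a network; neither side cares what software the other is running.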

In the hibernation bay, there were twelve pods with four tablets per pod, plus backups, which meant more than fifty tablets that needed to display vital signs for whichever character was in the pod.

Since the extras were constantly shifting between pods, we had to have a way to quickly select the right name and information for each passenger. This was done by building a database of passenger information, accessed via a drop-down list on each tablet, which let us reconfigure the room in just a few minutes.
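The idea behind that workflow can be sketched as a simple keyed record store; the field names and values below are invented for illustration, not the actual schema used on the show:

```python
# Hypothetical passenger records; on set, every tablet read from a shared store.
passengers = {
    "Aurora Lane": {"heart_rate": 52, "resp_rate": 8, "status": "HIBERNATION NOMINAL"},
    "Jim Preston": {"heart_rate": 55, "resp_rate": 9, "status": "HIBERNATION NOMINAL"},
}

pod_assignments = {}  # pod id -> currently assigned passenger name

def assign_pod(pod_id, name):
    """Called when a name is picked from a tablet's drop-down list."""
    if name not in passengers:
        raise KeyError(f"no record for {name!r}")
    pod_assignments[pod_id] = name
    return passengers[name]  # the vitals that pod's tablets should display

vitals = assign_pod(7, "Aurora Lane")
```

With the data separated from the display, swapping a passenger between pods is just a lookup, not a content change, which is what made reconfiguring the room so fast.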

Since Passengers takes place in a high-tech spaceship, tablets were embedded in the walls throughout the corridors and rooms. Everything from door panels to elevator buttons and each room’s environmental controls were displayed on touch screens.

Because of how the set was constructed, many of the tablets were inaccessible once mounted in place. We’d start by loading the required content before the tablets were mounted but we also needed to have a way to make modifications through the device itself if changes were necessary. 

The beauty of using a game engine is that it renders graphics in real time. Whenever color, text, sizing, positioning or speed needed to be changed, it could be done quickly and remotely either by using a game controller or interactively on the tablet itself. This kept us from causing the kind of delays in the shooting schedule that would have resulted if we’d had to rip devices out of the walls every time a change was made.

I learned a lot about Unity’s capabilities on this movie and I’m excited to continue exploring the boundaries of using game engines in ways they weren’t necessarily built for. This experience has allowed me to refine and expand upon what I thought was possible and I’m excited to use even more advanced versions of my playback software on future projects. That said, without amazing graphics to play back, the best software in the world still won’t get the job done. Chris Kieffer, Coplin LeBleu and Sal Palacios at Warner Bros. produce graphics that are second to none and I’m always grateful for the times I get to work alongside them. 

To read more, here is a link to my article in Local 695 Production Sound & Video Magazine.

Passengers photos courtesy of Columbia Pictures

Posted on January 7, 2017 and filed under Film, Video Playback.

Interstellar

When working on Interstellar, our team was brought in early on, in pre-production, to work with the art department on designs for the screens in the various ships. We were tasked with the goal of making a very utilitarian, functional, NASA style design. After much research into actual space shuttle control screens, we were able to use the real-life examples as a starting point.

Graphic designers Chris Kieffer, Coplin LeBleu, and Sal Palacios did a great job finding a visual middle ground between what would be futuristic space travel to the audience yet antiquated to the crew of the Endurance.

Although I was also involved in the design process, wrangling all of the various content and transforming it into interactive media that could be controlled seamlessly whether it appeared on monitors, tablets, or laptop computers was my specialty.

This film called for me to develop special software to remotely control an iPad mini which was built into a prop featuring fake buttons that needed to respond to Anne Hathaway's interaction.

Both the TARS and CASE robots each featured two additional iPad minis that also needed to be controlled remotely via this same software. Although I have since created more flexible tools for controlling devices, this was the first time we were able to utilize other iOS products to remotely trigger the iPads across a wireless network.

The crew of Interstellar was among the most talented group of people I've ever had the pleasure of working with and I feel very fortunate to have been a part of making this film.

Posted on February 5, 2016 and filed under Film, Video Playback.

The Avengers

Because of the scale of The Avengers, Rick Whitfield, Jim Sevin, Tim Gregoire, and I were brought on to engineer the hundreds of playback screens. Cantina Creative created the animated computer graphics that we needed to play back on the various sets throughout the film.

The bridge set alone had 130 monitors, so we needed a way to route any of the 30 computer feeds out to the monitors on set.

Using four Blackmagic Design 40x40 3G Videohubs, we could organize and control what was on any given monitor in the shot.

Our video playback booth built under the carrier bridge set

To speed up our ability to manipulate the layout of the graphics on set, and to quickly put green screens on any monitor for post VFX, it was clear I needed to write custom software to control the Videohub routers.

Since the routers accept Telnet commands over a wired network, I developed a router control application in Xojo (formerly Real Studio) that let us set up layouts for all the monitors on set at once and save them as presets. This allowed us to switch to a saved preset for a given scene with a single button click.
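The Videohub control protocol is plain text over TCP: a `VIDEO OUTPUT ROUTING:` header, zero-indexed `output input` pairs, and a blank line to end the block. A preset is then just a saved output-to-input map. Here is a minimal Python sketch of the approach (the routing numbers are made up, and the original tool was written in Xojo, not Python):

```python
import socket

# A preset maps router outputs (monitors) to router inputs (computer feeds).
bridge_wide_shot = {0: 12, 1: 12, 2: 7, 3: 29}

def routing_block(preset):
    """Serialize a preset as one VIDEO OUTPUT ROUTING command block."""
    lines = "".join(f"{out} {inp}\n" for out, inp in preset.items())
    return "VIDEO OUTPUT ROUTING:\n" + lines + "\n"  # blank line ends the block

def apply_preset(host, preset, port=9990):
    """Push every route in the preset to the router in a single block."""
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(routing_block(preset).encode("ascii"))

# One button click = one apply_preset() call per router on set.
```

Sending all routes in one block means the whole wall of monitors changes together rather than cascading one output at a time.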

As well as routing the computer screens, we also needed to control the timing of the playback graphics for some scenes. For example, in the scene where the mind-controlled Hawkeye shoots the computer-virus arrow into the bridge computer, I built a timed delay into our QuickTime playback software to create a computer-outage ripple effect.
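The ripple itself boils down to per-monitor start delays that grow with distance from the arrow's impact point. A small illustrative sketch of that idea; the monitor positions and propagation speed are invented, not measured from the set:

```python
def ripple_delays(monitor_positions, impact, speed=2.0):
    """Per-monitor outage delay in seconds: distance from impact / wave speed.

    monitor_positions: {name: (x, y)} in meters; impact: (x, y); speed: m/s.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return {name: dist(pos, impact) / speed
            for name, pos in monitor_positions.items()}

# Monitors nearest the struck console "fail" first; a wall four meters
# away goes dark two seconds later.
delays = ripple_delays({"helm": (0.0, 0.0), "port_wall": (4.0, 0.0)},
                       impact=(0.0, 0.0))
```

Each playback machine then simply holds its normal loop for its assigned delay before cutting to the failure graphic.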

Because the helmsman and map controls at the front of the bridge set were shapes that could not be actual live playback screens, it was decided that instead of using green on those surfaces, they should be designed as static backlit displays. The art department approached me about designing the graphics for these practical on-set pieces. My goal was to make them blend in and fit with the graphics designed by Cantina Creative that we would be playing back on set.

For the helm display, I used Adobe After Effects for the final print image, which gave me better control over the shape and curvature. It also made it easier for the animators in post to create the final replaced shot in the film from the composition I created for print.

We also helped out at the Comic-Con Avengers booth.

Setting up a sample of the bridge control screens from the set on the Avengers stage.

Posted on November 12, 2014 and filed under Film, Video Playback, Development.