CASE STUDY

Client: Deakin Motion.Lab

Need: Putting a face on Deakin Motion.Lab’s virtual production pipeline

Author: Andrew Hayward

Excerpt:

Virtual production is on the rise, and Melbourne’s Deakin Motion.Lab is at the forefront of that wave, translating live motion and facial capture into real-time CGI animation that can be molded and tweaked in the moment. Deakin’s disruptive work is revolutionizing the way that animation is produced, and Faceware Live technology helps add the needed nuance to bring it all together…

For some creators, there’s a real disconnect between what they see live on the capture stage and how it will look in the finished product. Deakin Motion.Lab, a research and development hub based at Melbourne’s Deakin University, recognized the issue and set to work creating Alchemy, its powerful virtual production pipeline.

“We started working in motion capture years and years ago, and we kept running into that problem: we’d have directors in that aren’t used to working in motion capture, so when they were directing a performer, they weren’t easily able to visualize what that would look like later on,” explains Peter Divers, Deakin Motion.Lab’s virtual production supervisor.

The virtual production pipeline instantly translates motion and facial capture data into on-screen characters, letting directors immediately see how a CGI human or creature will react—and adjust accordingly. If a movement doesn’t look right, they can alter it in a heartbeat. If a joke just doesn’t land, they can amend it. Virtual production eliminates the frustration of capturing something only to discover later that it doesn’t work the way you wanted it to.

Deakin’s Alchemy pipeline has gotten so good that many of their clients skip post-production entirely: they’re thrilled with the live, in-the-moment quality of the animation, and strike an ideal balance of quality and cost as a result. And part of that success is due to the help of Faceware Live, which delivers the emotion and believability needed to really sell the results.

Perfecting the Pipeline

In the early days of the Alchemy pipeline, before Faceware Live was introduced and implemented, Divers says that the lack of facial capture was an obvious deficiency.

“All of our characters had that dead, canned sort of face, because live facial—we couldn’t get it, and nobody was doing it at the time. That’s when we found out about Faceware, and it worked quite easily with them. They released the Live server, and we got on really early with that,” says Divers. “Using software like Faceware, we are able to really stretch the limits of what facial capture can be used for.”

“The software keeps getting better, and better, and better. The quality of the face animation we’re getting out of it now live on stage is fantastic,” he adds. “We’ve had a few companies that are looking to get completely away from post-production, and go straight to short-form YouTube content, right from the motion capture stage.”

Dr. Jordan Beth Vincent, research fellow at Deakin Motion.Lab, affirms that “high-quality work is expensive,” but that Faceware technology and their pipeline allow clients to reach a high bar of quality without investing in extensive post-production or clean-up, if they choose. “The benefits to companies, if we can skip post-production, are huge,” adds Divers. “That lowers the costs.”

Deakin streams data from Faceware Live and their OptiTrack optical motion capture setup into the Unity game engine, which displays the results in real time right there on the stage. They’ve found Faceware Live’s markerless facial capture technology incredibly easy to use, and note that it takes far less setup time than competing options. That is especially handy during a long, busy shoot, or when they’re working with new talent.
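To make that data flow concrete, the following is a minimal, hypothetical Python sketch of the receiving end of such a live stream. The wire format (JSON blendshape weights over UDP), the port number, and the apply_to_rig stand-in are all assumptions made for illustration; this is not Faceware Live’s actual protocol or Deakin’s Unity code.

```python
import json
import socket

# Hypothetical receiving end of a real-time facial animation stream.
# The wire format below (JSON blendshape weights over UDP, port 9000)
# is an assumption for illustration, not Faceware Live's actual protocol.

HOST, PORT = "0.0.0.0", 9000


def apply_to_rig(weights: dict[str, float]) -> None:
    """Stand-in for the engine-side step that drives a character's face
    rig each frame (in Deakin's case, this happens inside Unity)."""
    for shape, value in weights.items():
        print(f"{shape}: {value:.3f}")


def main() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    print(f"Listening for capture frames on {HOST}:{PORT}...")
    while True:
        packet, _addr = sock.recvfrom(65535)
        try:
            frame = json.loads(packet)
        except json.JSONDecodeError:
            continue  # drop malformed packets rather than stall the stream
        # Each frame is assumed to carry a dict of blendshape weights,
        # e.g. {"blendshapes": {"jawOpen": 0.42, "browRaise": 0.10}}
        if isinstance(frame, dict):
            apply_to_rig(frame.get("blendshapes", {}))


if __name__ == "__main__":
    main()
```

The point of the sketch is the latency model: each frame is applied as it arrives, so the director sees the character move in step with the performer instead of waiting on a render.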

“A lot of other software now, you have to not only set up an individual person, but have to do a whole session setup for the specific character you want to retarget, as well,” explains Divers. “We don’t always have the option to do prep work before the shoot day. In that case, Faceware is amazing. We put a facecam on somebody, and straight away, it calibrates to their face.”
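The per-character “session setup” Divers describes is retargeting: mapping the performer’s tracked facial channels onto a specific character’s rig controls. Here is a rough sketch of the idea, with invented channel names and scale factors; none of these reflect Faceware’s or Deakin’s actual data.

```python
# Hypothetical per-character retargeting table: each tracked capture
# channel maps to a rig control plus a scale factor, so the same
# performance can be exaggerated or damped per character.
RETARGET_MAP = {
    # capture channel -> (rig control, scale)
    "jawOpen":   ("creature_jaw_open", 1.2),  # exaggerate a cartoon jaw
    "browRaise": ("creature_brow_up",  0.8),
    "smileL":    ("creature_smile_l",  1.0),
    "smileR":    ("creature_smile_r",  1.0),
}


def retarget(capture_frame: dict[str, float]) -> dict[str, float]:
    """Convert one frame of captured weights into rig-control values,
    clamped to the usual 0..1 blendshape range."""
    rig_frame = {}
    for channel, weight in capture_frame.items():
        if channel in RETARGET_MAP:
            control, scale = RETARGET_MAP[channel]
            rig_frame[control] = round(max(0.0, min(1.0, weight * scale)), 4)
    return rig_frame


print(retarget({"jawOpen": 0.5, "smileL": 0.9, "unknown": 0.3}))
# -> {'creature_jaw_open': 0.6, 'creature_smile_l': 0.9}
```

In other tools, a table like this has to be rebuilt in a session setup for each new character; the prep work Divers says they can often skip is this kind of per-shoot configuration, since Faceware calibrates to a new performer’s face on the spot.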

Heroic Facial Capture

Deakin Motion.Lab recently put their pipeline to the test for Minibeast Heroes, a six-episode series for the Australian Broadcasting Corporation’s ABC Education program. The entire production lasted just a few months, with most of that time spent in pre-production ahead of a single week of shooting at their studio. Science journalist and presenter Carl Smith donned the mocap gear for the shoot, as his character shrinks down to explore scenarios featuring tiny bugs.

“We used Faceware in the live capture, and what it particularly allowed director Stefan Wernik [of Armchair Productions] to do was get a really great approximation of what the character was going to look like in the final scene,” Vincent recalls. “Of course, the facial capture was a key part of that. Just looking at the performer, but also getting a sense of how the character is going to react and whether that’s going to tell the right story.”

The benefits are immense. A virtual production pipeline gives creators more control over the process and the ability to make meaningful changes during the capture session, rather than trying to rework finished footage at the very end of a traditional production.

“There was a big problem where the creativity wasn’t in the right spot for kids’ animation. Directors on these cartoon shows needed to decide at script level, they would storyboard frame by frame, and then it would basically get outsourced—and they wouldn’t get another say on it until it was basically finished in the editing room,” Divers suggests. “This sort of pipeline really affected that in a way that the director sees it all on the stage. The director is making those choices, but also gets the option to play around with it a bit, direct the performer, and see how that’s going to change. We’ve seen stories dramatically change on the motion capture stage.”

In Vincent’s view, this kind of approach also creates a new kind of creative connection between the voice and motion capture performer, the director, and the character they’re shaping together in the moment. It’s just one of many significant benefits of virtual production.

“It takes the performers from just doing the ADR in a sound studio somewhere on the other side of the world to actually being a part of the collaborative process. It really becomes a collaboration: you’ve got the human performer who’s being captured, but you also have their avatar on screen and what’s happening with that, and then the director who is part of that,” says Vincent. “It becomes an ecosystem, and we think that this ecosystem approach for creating content is actually the most exciting way to get better stories told.”

Changing the Game

Deakin Motion.Lab’s talents are in hot demand right now. In the past, they’ve provided motion capture for video games, advertisements, and the feature film I, Frankenstein, and they now engage with industry on projects in other diverse areas, like health and engineering. Deakin’s goal is to push to new frontiers in the realm of research and development, and then work with industry partners to find applications for that technology—and then keep pushing further.

“We are really interested in how we can develop tools for making content in innovative ways, and we’re liaising directly with industry,” says Vincent. “Working with industry then gives us the confirmation that we’re moving in the right direction with what we’re creating, and also helps us understand what else is needed. Where else can we take our work?”

At SIGGRAPH 2018, the Deakin team observed a surge of interest in virtual production pipelines, and they feel invigorated that their pioneering work is catching on.

“It’s really exciting. We want to continue working with interesting industry partners who have stories to tell, and we can really provide a platform for those stories to be told in a way that’s faster, better, and cheaper,” says Vincent. “We’re developing our own content now, thinking about the kind of stories that we want to tell, and I think we’re just growing and growing.”

Deakin Motion.Lab recently unveiled The Adventures of Auntie Ada, a STEM-focused animated show concept about a young girl and her scientist aunt, and it’s yet another example of how their Alchemy pipeline can produce quality work with speedy turnarounds. For Deakin, it’s all about pushing the envelope of what’s possible with virtual production, and Faceware Live is a critical tool in their arsenal.

“For us, the use of facial capture in this virtual production work is really about disruption. We’re interested in seeing what can be done differently and better,” Vincent asserts. “We want to stay at the forefront of what’s possible in the live facial area. We want to push the boundaries and see how far we can take it, and for us, that’s where Faceware Live fits into our pipeline.”