Injustice 2’s Facial Animation Is So Good It’s Uncanny

By | News | No Comments

Article excerpted from – Injustice 2’s Facial Animation Is So Good It’s Uncanny

Authored by Mike Fahey

Injustice 2’s story mode features a strong script voiced by top voice talent, but the real stunner is the way the in-game faces move when acting out the dialogue. Can you make out what Harley Quinn is saying in the animated GIF above? Hint: she’s talking about Gorilla Grodd.

Each new snippet of video shown during Injustice 2’s relentless media barrage leading up to today’s launch was accompanied by people arguing over the characters’ faces. Some loved the more realistic look the sequel was going for. Others found it off-putting or weird. Comparing shots of Injustice: Gods Among Us’ Harley Quinn . . .

. . . to Harleen from Injustice 2 . . .

. . . I can see where some might be worried about the new direction. I was a little concerned myself, but those concerns quickly faded as I played, replaced with a deep admiration for what the animators accomplished.

Here’s a scene between Batman, Robin (Damian Wayne), and Wonder Woman from early in story mode.

See Robin’s smirk? The hint of sadness and concern in Wonder Woman’s eyes? It’s not just animation, it’s acting.

And it’s not just the cutscenes, either. Though the lighting change between scripted scenes and battle can be a little jarring, the same facial animations present in the cinematics carry over to gameplay. Check out this clip of Catwoman, taken from the customization screen.

Just as bad animation can severely cripple a video game’s ability to tell a compelling story, good animation draws the viewer into the narrative, adding emotion and nuance. Injustice 2 has very, very good animation.

Now about that hair . . .

Faceware Brings DC Heroes and Villains to Life in Injustice 2


NetherRealm Studios’ Injustice 2, the action-packed sequel to 2013’s Injustice: Gods Among Us, releases today.

The game is already earning accolades for its top-notch facial animation, created with Faceware’s award-winning facial motion capture hardware and software, including the Faceware ProHD Headcam System and the Analyzer 3.0 and Retargeter 5.0 software.

As gaming industry media outlet Kotaku’s Senior Editor notes:

Just as bad animation can severely cripple a video game’s ability to tell a compelling story, good animation draws the viewer into the narrative, adding emotion and nuance. Injustice 2 has very, very good animation.

Using Faceware’s hardware and software solutions, NetherRealm Studios was able to quickly capture high-quality facial motion capture data on set, then process that data in Analyzer 3.0 and Retargeter 5.0 to create the vaunted facial animation for Injustice 2.

VP of Business Development Peter Busch explains:

Our facial recognition technology is able to capture a great deal of nuance from the brows to the eyes, nose, cheeks, and mouth by tracking textures and features, rather than limiting the quality of the data by tracking points on the face, in order to create highly detailed output. This enables our software to handle anything from stylized to photo-real characters. Together with other features like timecode support, capture, trimming, pose libraries, and a technological core drawing from over 10,000 minutes of facial performance data and users in almost 50 countries, Analyzer and Retargeter are the most powerful tools available for facial animation. Our decades of facial motion capture research and experience have created a streamlined and flexible workflow – one that enables teams to scale and create high-quality, high-volume animation, all while staying within budget and maximizing artists’ creative control.
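
The “pose libraries” Busch mentions can be illustrated with a toy sketch. This is not Faceware’s actual algorithm – every name and number below is hypothetical – but it shows the general idea of retargeting: a tracked feature vector from the performance is blended against a small library of animator-authored poses to drive rig controls.

```python
# Toy retargeting sketch (illustrative only, not Faceware's method):
# blend a library of authored poses, weighting each pose by how close
# its canonical feature vector is to the tracked one.
from math import dist

# Pose library: feature vectors (e.g. brow height, mouth-corner offset,
# jaw open) paired with the rig-control values authored for each pose.
POSE_LIBRARY = {
    "neutral":  {"features": (0.0, 0.0, 0.0), "controls": {"smile": 0.0, "jaw": 0.0}},
    "smile":    {"features": (0.1, 0.8, 0.2), "controls": {"smile": 1.0, "jaw": 0.1}},
    "jaw_open": {"features": (0.0, 0.1, 0.9), "controls": {"smile": 0.0, "jaw": 1.0}},
}

def retarget(features, library=POSE_LIBRARY):
    """Blend the library poses' rig controls by inverse feature distance."""
    weights = {}
    for name, pose in library.items():
        d = dist(features, pose["features"])
        if d == 0.0:                       # exact match: use that pose directly
            return dict(pose["controls"])
        weights[name] = 1.0 / d
    total = sum(weights.values())
    controls = {}
    for name, pose in library.items():
        w = weights[name] / total
        for ctrl, value in pose["controls"].items():
            controls[ctrl] = controls.get(ctrl, 0.0) + w * value
    return controls

# A tracked frame near the "smile" pose mostly drives the smile control.
frame = retarget((0.1, 0.7, 0.2))
assert frame["smile"] > frame["jaw"]
```

In production the mapping is of course far richer (dense textures and features rather than three numbers, and animator refinement on top), but the library-of-poses structure is the part this sketch is meant to convey.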

For more information, visit:

Faceware Helps Barbie Reach 2 Million Subscribers and 23 Million Views


Barbie’s vlog hits new milestone

Mattel’s Barbie vlog, created by CounterPunch together with House of Moves and Faceware Technologies, reached a major milestone today: 2 million subscribers and 23 million views. The vlog hit these numbers in under two years, but even more impressive is that each animated episode is created in less than five days of production.

How, you might ask? Check out the Barbie vlog case study to find out how we helped CounterPunch and House of Moves bring Barbie to life with rapid production.

Faceware’s Peter Busch to Keynote Largest VR & AR Conference in Australia and New Zealand


Faceware VP of Business Development Peter Busch will deliver the keynote address at the Magnify Virtual Reality and Augmented Reality Business Summit, to be held on May 8, 2017.

Over 30 recognized VR and AR experts from around the world will discuss cutting-edge research and best practices. The event will bring together top industry professionals through panels, round-table discussions, interactive workshops, and demonstrations. Held annually, the summit will cover a variety of topics across finance, film, television, tourism, gaming, investment, and China.

For more information about the summit, please visit:

Faceware Technology Used in ABC’s New Sitcom Imaginary Mary


Faceware’s Creation Suite Pro Complete System Provides Realism for Mary’s Spunk!

ABC’s live-action/CGI hybrid stars Jenna Elfman as Alice, a fiercely independent career woman whose life is turned upside-down when she meets the love of her life, Ben, a divorced father with three kids, played by Stephen Schneider.

The real fun begins when Alice’s imaginary friend from her childhood returns as the sassy and sharp-tongued Mary. Zoic Studios created the animated character, voiced by Saturday Night Live alum Rachel Dratch, using Faceware’s Creation Suite Pro Complete System, which includes the Faceware ProHD Headcam, Analyzer 3.0 Studio Plus, and Retargeter 5.0 Studio Plus software.

Using this complete system, Zoic was able to quickly shoot completely marker-less facial motion capture and create the quality facial animation used for the live-action ABC network show.

VP of Business Development at Faceware, Peter Busch, notes that unlike other motion capture solutions, “rather than limiting the quality of the data by tracking points on the face, our technology tracks textures and features to create highly detailed output. This enables our software to handle anything from stylized to photo-real characters. Markers can also inherently cause many technical issues during performance capture – plus, if you think about what it takes to create convincing CG animation for the face, two key elements are the eyes and the quality of lip sync, and you can’t put markers on the eyes or on the inner lips. It’s great to be part of an ambitious project that harnesses these unique capabilities to create an interactive CG character for prime-time TV.”

For more information about the development of Mary, visit:

To watch the Pilot or catch up on Season 1 episodes, visit:

Faceware Adds Realism to True Survival Horror in Resident Evil 7


Resident Evil 7 exploded into the gaming world today, and Faceware is thrilled to be a part of this ground-breaking gameplay. The geniuses at CAPCOM combine fear, combat, exploration, and item management in a masterfully executed Isolated View experience, in a world created by the all-new RE Engine.

Check out more, if you dare!


Motion CaptVRe


By Jem Alexander

With recent improvements to hardware and software, plus an increasing emphasis on real-time, interactive performance capture, we could be on the precipice of a golden age for motion capture technology. Jem Alexander investigates recent progress in mo-cap technologies.

Motion capture, and the way it is used in game development, is improving rapidly. No longer used solely by animators to record and store an actor’s performance, the technology is expanding into new areas. Those working with it on a daily basis are excited to see where this might lead.

Technology’s inevitable march forward means that motion capture can now occur in realtime, with an actor’s movements being instantly reflected in a game. Not only does this benefit animators by streamlining their process, but it also opens doors to other applications, like virtual reality. The HTC Vive, Oculus Rift and PlayStation VR all take advantage of motion capture technology to allow players to interact with virtual worlds. Not as advanced as that used in an animation studio, perhaps, but with time this can only improve.
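
The realtime loop described above can be sketched in miniature. This is an illustration only – every class and field name below is hypothetical, and a real pipeline streams solved data over a network into a game engine – but it captures the shape of the idea: frames arrive from the capture system and are applied to the character on the same tick, rather than being recorded for offline animation.

```python
# Illustrative-only sketch of realtime mocap streaming: each incoming
# capture frame is applied to the character rig immediately, so the
# actor's movement is reflected in the game as it happens.
from dataclasses import dataclass, field

@dataclass
class CaptureFrame:
    timecode: float
    joint_rotations: dict      # joint name -> rotation in degrees (toy 1-DOF)

@dataclass
class CharacterRig:
    joints: dict = field(default_factory=dict)

    def apply(self, frame: CaptureFrame) -> None:
        # A real engine would retarget onto the character's skeleton;
        # here we simply copy the solved rotations onto the rig.
        self.joints.update(frame.joint_rotations)

def stream(frames, rig):
    """Drive the rig frame by frame, as a realtime pipeline would each tick."""
    for frame in frames:
        rig.apply(frame)
    return rig

rig = stream(
    [CaptureFrame(0.00, {"elbow_l": 10.0}),
     CaptureFrame(0.03, {"elbow_l": 25.0, "wrist_l": 5.0})],
    CharacterRig(),
)
assert rig.joints == {"elbow_l": 25.0, "wrist_l": 5.0}
```

The offline workflow differs only in where the frames go: instead of driving a live rig, they are recorded, cleaned, and handed to animators – which is why the same streaming capability also streamlines the traditional process.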

“The biggest advance in mo-cap, in my opinion, is linked to the VR push,” says Alexandre Pechev, CEO of motion capture middleware provider IKinema.

“With the advances in VR hardware, motion capture technologies have moved to our living rooms and offices. Mo-cap will inevitably become part of our everyday life.”

Motion capture studio Audiomotion’s managing director, Brian Mitchell, agrees. “The fact we can stream live data through to game engines has had a massive effect. Matched with VR, this means developers can really let loose with their creativity,” he says.

It seems that only recently have all the many individual steps in motion capture technology culminated in a great leap forward for the industry.

“I think the biggest development this year has been the jump in combined technologies and approaches in realtime full performance capture,” says Derek Potter, head of product management at Vicon Motion Systems. “There’s been steady progress over the past five years in the area of realtime full performance capture, however what we’ve seen previously is progress on single fronts.

“What we find really exciting about the past year is seeing different developments coming together. It feels like this is moving these types of captures from being investigative to being more fully realised and production ready.”

It’s not just the games industry that is enjoying these developments. Motion capture is the same wherever it is used and everyone is learning from one another. “Different industries use the same game engines for realtime rendering, the same mo-cap hardware and the same solving and retargeting technologies for producing the final product – animation in realtime,” says IKinema’s Pechev. “We are already at a point where the entertainment sector is sharing hardware, technologies, approaches and tools.”

Dr. Colin Urquhart, CEO of Dimensional Imaging, sees this as a good thing for everyone. Especially since some areas of the entertainment industry have been at it a little longer than us. “The use of helmet mounted camera systems for full performance capture was pioneered on movie projects such as Avatar and Planet of the Apes, but is now becoming widespread for use on video game projects,” he says. “People see how effective this technology is in movies and expect the same effect in a video game.”

Full performance capture like this, which simultaneously records body and facial movements, leads to much more realistic actions and expressions. Something that can really affect your immersion in a world or your feelings for a character. Peter Busch, VP of business development at Faceware Technologies, a motion capture company that focuses on facial animation, says that “characters in today’s games are already pushing the realism envelope when it comes to facial performances, but that will only increase with time. Look for more realistic facial movement and in particular, eye movement, in the games of tomorrow.”

“It’s one thing to watch an animated character in a game,” Busch continues. “It’s quite another to interact with one. Today, we’re able to interact with characters animated in realtime via live performances or kiosks at theme parks. It’s rudimentary, but it’s effective.

“Tomorrow, we’ll be able to interact with player-driven characters or AI-driven avatars, in game, in realtime. Imagine saying something to a character in a game and having them respond to you as they would in real life. This will change the face of games.”

Virtual reality will be a huge beneficiary of these improvements to facial animation, as developers scramble their way out of the uncanny valley. Classic NPCs feel significantly more like dead-eyed mannequins within a VR environment and improvements in this area could go a long way towards truer immersion and deeper connections with characters and worlds.

“The key to an engaging experience, especially in a new medium such as VR, is connecting users with a powerful story and characters by drawing upon the emotional, character-driven, and nuanced performance from the faces of the actors,” says Busch. “[With today’s technology] studios are able to capture the entire facial performance, including micro expressions, and display those subtleties in convincing ways.”

Vicon’s Derek Potter is in agreement and suggests that improvements in motion capture could mitigate the effect of the uncanny valley, if not remove it completely. “Nothing immerses like the eyes,” he says. “The one thing that each new generation of console has provided is more power. More power means better, more realistic visuals. Motion capture provides the ability to transpose ‘truer’ movements onto characters.

“The mind is a brilliant machine and one of the things that it does amazingly well is to let us know when something ‘doesn’t look quite right’. This lessens the immersiveness of the gaming experience. What excites us about the next generation of consoles is the increasing ability to render realistic graphics, combined with the motion capture industry’s progress in capturing finer and more realistic movements to let us all sink a little deeper into the game. I think this is amazing, both as someone working in motion capture and as a gamer.”

As advances continue to be made to motion capture technology and software, we’re seeing a drop in price for mo-cap solutions. “I think it’s easier for developers to incorporate motion capture into their projects than ever,” says Dimensional Imaging’s Urquhart.

“Mo-cap used to be a tool used exclusively by big studios or developers on big budget projects. This is not the case anymore. There are many tools on the market for capturing an individual’s face and body performance. Mo-cap tech doesn’t have to be a premium product.”

This accessibility helpfully comes at a time when even small indie developers might be looking into motion capture solutions, thanks to the recent release of consumer virtual reality devices. “VR controllers act as a realtime mo-cap source of data,” says IKinema’s Pechev. “Using this to see your virtual body reacting to your real movements changes completely the quality of immersion.”

“Using mo-cap to see your own avatar allows you to actually feel the environment as well as see it,” says Audiomotion’s Mitchell. “It really does turn your stomach when you see your own foot moving closer to the cliff edge.”

With motion capture now more powerful, accessible and interactive than ever, where does the industry go from here?

“The demand for mo-cap is definitely going to increase over the next few years and games projects will require better and better acting talent and direction to meet this demand,” says Dimensional Imaging’s Urquhart.

“Improving on the realism means increasing the fidelity of motion capture, particularly of facial motion capture. Using techniques such as surface capture instead of marker based or facial-feature based approaches can help with this.”

Faceware’s Peter Busch has plans in mind to solve the issues inherent in facial motion capture, particularly for VR. “The next frontier is creating even more realistic and immersive experiences across the board, including new experiences like realtime user-driven VR characters that are realistic in every aspect, right down to the user’s facial expressions,” he says.

“Current cameras are insufficient for capturing true social VR. There is no way to capture a VR user’s entire face while playing VR. Our interactive team is actively developing hardware and software solutions that we’re planning to launch this coming year.”

Vicon’s Derek Potter believes there’s still plenty of work for providers to do to make mo-cap the best it can be. “I think there are three ways that providers can help in continuing to push motion and performance capture forward,” he says. “[Companies] like Vicon need to continue to do what we’ve been doing steadily each year, which is to continue to improve the fundamentals of the motion data provided. The accuracy, reliability, overall quality and speed of the data are all vital.

“Secondly, as motion capture matures as a technology, we need to see it not only as its own entity but also as part of a bigger machine. We need to continue our efforts to integrate and provide more open access to the data provided by the system. The third thing we need to do is to listen to some bright lights in the field who have been pushing techniques forward recently. The technology needs to be pliable enough to fit the new and evolving needs that performance capture demands.”

Meanwhile, IKinema’s Alexandre Pechev won’t be happy until motion capture technology is perfected. “[I want] better and simpler (ideally non-invasive) types of motion capture combined with realtime solvers that automatically fix issues and deliver perfect animation,” he says, ending with some blue-sky thinking that in the past would have been a better fit for a sci-fi novel than a game development magazine.

“My dream is to see a mo-cap system that reads brainwaves to deliver the muscle movements of all bones. Maybe one day…”

Announcing New Faceware Hardware

By | News, Product Update | No Comments

New Mark 3.2 Camera for Faceware ProHD Headcam Systems

We have taken the enhancements made to the Mark 3 camera and refined them, making the camera stronger and easier to adjust, all with an updated design.

  • A new cage protects the camera and strengthens lateral pan adjustments. Steel receiving threads and a larger bolt prevent stripping out the camera top.
  • Now you can tighten down pan adjustments as hard as you want without fear of damaging the camera.
  • The bar clamp has been spaced to ensure you can clamp down on single arms and hoops securely for years to come.
  • And finally, the camera back arch has been redesigned to allow easier access to the camera controls while maintaining strength and improving the overall aesthetic.

The new Mark 3.2 camera is available for stand-alone purchase or can be purchased as part of any new Faceware ProHD Headcam System package.

New GoPro Headcam System for the Hero 5

We’re excited to announce the launch of the Faceware GoPro Headcam System for the GoPro Hero 5.

  • This GoPro Headcam System features custom housing for the larger GoPro Hero 5 camera.
  • An unparalleled, lightweight design provides maximum comfort during performance capture.

The new GoPro Headcam System for the Hero 5 is available for stand-alone purchase or can be purchased as part of any new Faceware Indie Headcam System package.

Our experienced team members can answer any questions you have about this new hardware and how it integrates with our complete facial motion capture solutions. Send us an email via our Contact Us page.

Faceware’s Technologies Resonate in Broadcast Beat’s Feature Article


Faceware is honored to be featured in November’s Broadcast Beat magazine. The article, “Two Birds, One Stone,” highlights the unparalleled success that Los Angeles-based sound studio Formosa Interactive has found with Faceware’s technologies powering its voice recording sessions.

Formosa Interactive needed a seamless facial and audio capture solution that didn’t inhibit the organic performances of the talent. With Faceware’s ProHD Headcam hardware, they found simultaneous voice and face capture to be a swift, simple, and adaptable process – one that enables their sound engineers and actors alike to do their jobs without any of the hassle.

Read the online article HERE to get an in-depth look at how integrating Faceware’s ProHD Headcam hardware into their pipeline allowed Formosa Interactive to deliver simultaneous facial and audio capture without breaking the rhythm of performance or production. As an added bonus, on page 41, Christopher Jones, Faceware performance capture supervisor, participates in a Q&A session describing the process of setting up Faceware’s tools in a voice booth. You don’t want to miss this!

About Broadcast Beat

Broadcast Beat Magazine covers technologies in the broadcast, motion picture and post production industries. This, the 2016 NAB NY Show Edition, covers broadcast, motion picture and post-production technologies being released at the 2016 NAB NY Show in New York. 


Faceware’s Suite of Technology Amps Up Call of Duty: Infinite Warfare


Gaming Community, Get Ready!

Today’s the day – Call of Duty: Infinite Warfare releases!

Facial Motion Capture Awesomeness

Faceware hardware and software were used for the facial motion capture and facial animation of new scenes and characters. How will this play into the storyline? Players will step into the shoes of Captain Reyes, a Tier 1 Special Operations pilot who helms the Retribution, one of Earth’s last remaining warships. Reyes’ story will see him lead the “remnants of coalition forces against a relentless enemy.” The campaign will take players across the solar system, spanning locations on Earth and beyond in a “grand scale war.”


Technology Lends Credibility

Faceware ProHD Headcam Systems and Analyzer and Retargeter software were used to mocap actor Kit Harington and MMA champ Conor McGregor. Want to learn more about the mocap behind Call of Duty? Here’s Brian Bloom talking about the intricate motion capture in a behind-the-scenes reveal.


Looking for more? Watch the official trailer here and let us know what you think!