In Welcome to Marwen, the upcoming Universal Pictures feature film from legendary writer-director and visual effects innovator Robert Zemeckis, live action blends with animation created by Atomic Fiction and powered by Faceware to tell the unique story of an artist who finds inspiration and courage in an unlikely place. Atomic Fiction and Framestore are producing the animation and visual effects under the direction of renowned Atomic Fiction Co-Founder and VFX Supervisor Kevin Baillie, responsible for the powerful visual effects behind two Star Trek movies, two installments of the Transformers franchise, and the award-winning effects of several of Robert Zemeckis’ films, including Flight, The Walk, and Allied.
Starring Hollywood A-listers Steve Carell, Leslie Mann, Diane Kruger, Janelle Monáe, and Eiza González, Welcome to Marwen tells the miraculous true story of one broken man’s fight as he discovers how artistic imagination can restore the human spirit. When a devastating attack shatters Mark Hogancamp and wipes away all of his memories, no one expects him to recover. Putting together pieces from his old life and his new one, Mark meticulously creates a wondrous town where he can heal and be heroic. As he builds an astonishing art installation — a testament to the most powerful women he knows — his fantasy world gives him the strength to triumph in the real one. Welcome to Marwen is scheduled to release this holiday season.
Faceware is excited to be a part of a new game release, A Way Out, with EA and Hazelight. Check out the trailer below and the informative Concept to Completion article from EA.
CONCEPT TO COMPLETION: CREATING LEO AND VINCENT
Writer and Director Josef Fares discusses creating A Way Out’s main characters, Leo and Vincent, mo-capping himself, in-house stunt coordination and developing a prison genre video game.
Hi Josef, can you tell us about your work on A Way Out?
It’s very simple really. I’m the Writer and Director of A Way Out, an idea that came about shortly after Brothers – A Tale of Two Sons, and something which I’ve been working on for about three years now.
The game is set in the early 1970’s, does this make character research and development easier?
The idea of setting the game in the seventies was mainly because that era suited the game’s storyline. Also, in that period there were no cell phones, and technology was not as advanced as it is today.
At the same time though, it was important not to make the game look too seventies, as that could take away from the actual gameplay. Players can see that the game is set in the seventies, but it’s never too obvious or in your face.
How much time, passion and energy does Hazelight put into character development?
What can I say? Many people know that I live, breathe and eat A Way Out. I really care about it and believe in it, because I feel we’re doing something very special with this game. It takes a lot of hard work, not only with character design but also with the development of the game itself.
I can’t work on a project if I don’t feel passionate enough about it.
Is developing a game for the prison genre more fun than other genres like Sci-Fi?
I think players in general want to feel free and be able to do what they want. With the prison genre, it takes away some of that freedom. It makes you feel like you’re locked up and contained in a dangerous, but exciting environment.
As for sci-fi, I guess you can go crazy with ideas without really being questioned by players too much, because games of that genre are typically set in the future.
Personally though, I’d say that it doesn’t really matter what the genre of a game is, as long as you feel passion for it in your heart. I can’t work on a project if I don’t feel passionate enough about it.
How many design iterations did Leo and Vincent go through?
There were a lot of design iterations for both Leo and Vincent, lots of reworking and trying to find different types of inspiration for them.
Looking back at how both characters first started out to where they are now, they’re very different. There were experiments with haircuts, with and without beards and a number of different looks which all formed part of our process.
In Leo’s case (after about a year of development), we decided to base his face on that of my brother, the actor Fares Fares. What I like about my brother’s face is that it’s very cinematic with striking characteristics, which is why we wanted to use him.
From a technical aspect, it was a case of considering the right types of animations to use, how the characters would express themselves, and also their individual movements like walk cycles.
How did you create a contrast between Leo and Vincent?
Well, Leo and Vincent have very different personalities. They react in their own way to certain situations, interact with the world around them differently, and have their own inclinations.
Leo is a very straight up guy with a short temper, while Vincent is more calm, cool and relaxed and thinks twice before he does something.
Clothing also gave us choices to help express their individual personalities, although the prison aspect of the game was limiting, because in that environment clothing is very uniform.
How does doing all of your own mo-capping and stunts aid production?
A lot actually as we’re playing the characters, so we get to do it exactly how we want it. I did all of the motion capture for Leo and our Studio Manager, Oskar Wolontis, did the same for Vincent.
Not only did we do all of the mo-cap, we did all of the game’s stunts as well. We didn’t have a stunt guy, so all the jumping, fighting and everything else players will see in-game was done completely by us. It was pretty tough going, but we just got on with it, even though we hurt ourselves a few times.
If you believe in something and are passionate about it, then you’re prepared to do everything for it. Even if it means long Walk and Run Cycle sessions for Motion Batch, where we captured every animation and movement possible, from crawling, to crouching, to sprinting, to jogging, for almost twenty hours.
I don’t know how many hours I personally put into wearing that mo-cap suit, but it’s a crazy amount of time – so I guess I’m a mo-cap expert now. There are even some crazy stunts that we did, that I can guarantee are not supposed to be carried out by non-professionals!
What was it like seeing players’ reactions to Leo and Vincent and the game during E3 2017?
It was really great, everybody got excited, but I didn’t expect the reactions and levels of attention we received – which was very nice of course! When you work so hard on something and finally see people getting hyped for it, it’s a great feeling.
What kind of connection do you want players to have with Leo and Vincent?
Ha ha ha! If you want to test your relationship with someone, you should play A Way Out!
Who would win a fight between Leo and Vincent?
I’m not sure actually…I mean, Leo knows how to fight but Vincent’s a pretty tough guy too…I think it could be quite equal.
Immersive VR Story Installation Created in Partnership with Faceware Technologies, Leap Motion, and HP
MARCH 7, 2018 – Meow Wolf, the Santa Fe-based art collective, is set to debut its immersive VR story installation “The Atrium” at the 2018 South by Southwest Film Festival. Created in partnership with Faceware Technologies, Leap Motion, and HP, this revolutionary digital interactive experience will be available to festivalgoers at the JW Marriott in downtown Austin on March 13-15th.
“The Atrium” marks Meow Wolf’s groundbreaking foray into the world of interactive digital storytelling. With this new, state-of-the-art VR experience, the collective channels the creativity inherent to its acclaimed, large-scale installations into a new form of storytelling, and offers participants a uniquely visceral VR experience in return.
“We’re moving away from thoughts and patterns that reinforce passive media trends. The tools are finally here to create worlds that know you’re there, and stories that can stare right back at us,” said “Atrium” director Brian Solomon.
By entering “The Atrium,” SXSW attendees have the opportunity to continue exploring the mysteries surrounding the Selig family by assuming the identity of a Charter agent downloaded into the body of the family’s cybernetic pet. Adrift in the multiverse, they’re tasked with investigating Morgan, daughter of the mysterious family. “The Atrium” weaves in and out of Morgan’s empathetic world in an escapism-fueled fantasy.
To create this groundbreaking VR story experience, Meow Wolf designed a roomscale haptic platform to create a realistic sense of locomotion and ground audiences in the virtual world by controlling the real space it occupies. Meow Wolf collaborated with Faceware Technologies, developers of Faceware Live, which gives studios like Meow Wolf the ability to produce live and interactive facial animation in realtime; Leap Motion, whose unprecedented hand tracking technology lets you reach into virtual and augmented reality to interact with new worlds; and innovative tech giant HP.
Together, they’ve created the ultimate haptic interactive story experience that leverages breakthroughs in technology with years of interactive design knowledge found within Meow Wolf’s Creative Studios. In designing “The Atrium,” Meow Wolf itself utilized a unique combination of narrative techniques developed around real-time engines, mixed reality, interactive scenography, installation art, music, and performance.
Written and directed by Brian Solomon, “The Atrium” was executive produced by Meow Wolf CEO Vince Kadlubek, and Nicolas Gonda, producer of Terrence Malick’s award-winning “The Tree of Life” and “Voyage of Time.”
SXSW attendees can experience “The Atrium” from March 13-15th at the JW Marriott (110 E 2nd St. Austin, TX 78701) from 11AM to 6PM. For more information, please visit: https://meowwolf.com/the-atrium/
About Meow Wolf Entertainment
Meow Wolf Entertainment is an experimental production studio within the collective. The immersive digital team explores the edge where technology meets storytelling. They blend a combination of narrative techniques developed around real-time engines, mixed reality, interactive scenography, installation art, music, and performance. They are a future-facing story collective aiming to redefine entertainment.
About Faceware Technologies
Faceware Technologies Inc. (FTI), established in 2012 after years as part of leading facial tracking and augmented reality company Image Metrics, is dedicated to meeting the needs of professional animators in the video game, film, television, and commercial industries. The company’s Faceware Facial Motion Capture product line has been utilized in the production of hundreds of video game titles, feature films, music videos, commercials, television shows, and stage plays, and is the leading facial animation solution provider for clients such as Double Negative, Digital Domain, Blur Studios, Activision-Blizzard, Rockstar Games, Microsoft, 2K Sports, Electronic Arts, Ubisoft, Sega, Sony, Bethesda, Motion Theory and Moving Picture Company. Faceware’s product line consists of the Faceware GoPro and Pro HD Headcam and Tripod Capture Systems; Faceware Analyzer, which allows clients to analyze and process their own performance videos; Faceware Retargeter, an Autodesk plugin which allows users to create facial motion capture data at a much faster rate than traditional methods; and Faceware Live, the real-time facial capture and animation solution.
About Leap Motion
Leap Motion’s mission is to build a natural connection between people and technology to unlock the potential of both. The company’s unprecedented hand tracking technology is a combined suite of software and hardware, allowing users to reach into virtual and augmented reality to interact with new worlds. Leap Motion was founded in 2010. The company is privately funded and headquartered in San Francisco. More information: www.leapmotion.com
HP Inc. creates technology that makes life better for everyone, everywhere. Through our portfolio of printers, PCs, mobile devices, solutions, and services, we engineer experiences that amaze. More information about HP Inc. is available at http://www.hp.com.
The Faceware powered Altered Carbon augmented reality experience for Netflix won big at the recent 2018 Lumiere Awards for Best Augmented Reality Experience. Created by MacInnes Scott and Here Be Dragons, the teams used Faceware’s Live realtime motion capture software to deliver the award-winning experience. The Lumiere Awards recognize outstanding international achievement in the creation of immersive storytelling using advanced visual technologies, including augmented reality, virtual reality, high dynamic range, stereo 3D, artificial intelligence, realtime rendering, and more.
Congratulations to all this year’s winners and see how the Altered Carbon experience is engaging with audiences on new levels of interactivity.
EA just released their Concept to Completion: Madden 18 Longshot behind-the-scenes piece, and we’re pretty thrilled to see our Faceware gear on these talented athletes and actors. Our longtime relationship with sports games has resulted in some spectacular releases, and the Madden 18 Longshot story is no exception! Here’s an excerpt of the piece – be sure to click through to read the full story HERE.
Madden 18 will have an immersive new story mode called Longshot, joining FIFA’s The Journey as single-player, cinematic experiences in EA SPORTS games.
We sit down with Longshot Mode creative director Mike Young and Producer Robin Cowie to get a behind-the-scenes look at how the mode came to life.
How long ago did the team start working on Longshot?
MIKE: Longshot as a concept is four years in the making. I worked with NFL Films to make a short concept video following a quarterback from the regional combine to draft day. This was before FIFA had its success with The Journey. I think the Madden leadership group saw potential when we delivered our opening playable cinematic in Madden 15.
What were some of the areas of focus for a producer on Longshot? How did you prioritize your time?
ROBIN: My primary job was removing blocks to enable the best possible creative experience. The guide for Longshot was to always tell the most cinematic story possible. We wanted to come as close as we could to a playable movie.
You produced The Blair Witch Project. How did producing Longshot compare to a movie?
ROBIN: There were many similarities—we worked with terrifically talented actors, like Mahershala Ali, and Mike was very prepared as a director. The biggest difference for me is the amount of creative control you have during the animation and digital camera parts of production. For most of a film, what is in front of camera accounts for 80 – 90% of the finished content. Only 20 – 30% have digital effects or digital manipulation. Creatively it gives you great freedom but it is a bit of a producing headache.
Price reductions, new services and bundled hardware/software packages headline the offerings
Los Angeles, Calif. – Feb 06, 2018 – Faceware Technologies, the leading provider of markerless 3D facial motion capture solutions, today announced brand new pricing for 2018. From price reductions on popular software licenses, to new services, to new hardware/software bundles, the new pricing offers something for indie animators and studios alike, and can be found on Faceware’s new pricing page. It is effective immediately.
“At the end of 2017, we conducted a complete audit of our pricing, which included a survey of our customers and resellers, as well as a review of all of our SKUs,” said Peter Busch, vice president of business development at Faceware. “As you’ll see, we have made some pretty big changes that I think our new and existing customers will love.”
2018 Pricing Update:
Analyzer 3.0 Studio Plus Price Reduction: Analyzer 3.0 Studio Plus software licenses have been reduced more than 20%. The new cost for a Single-user license including the first year of support is now $6,500.
Live 2.5 Server Price Reduction: Live 2.5 Server has been reduced almost 10%. The new cost for a Single-user license including the first year of support is now $3,000.
New Service: Facial pipeline stage auditing. Faceware will perform an on-site review and analysis of your facial pipeline. Review of proper lighting, acoustics, as well as cable routing, operator control, and proper storage and maintenance of our headcam systems will be covered.
First Year Support Now Included: First year of support for all software and Complete Systems is now included in the price.
New Bundles – Additional Licenses at No Additional Cost: All Pro Complete Systems now include multiple network licenses of each of our software products. These include:
Realtime Pro Complete:
Two (2) additional Live 2.5 licenses
Creation Suite Pro Complete:
Two (2) additional Analyzer 3.0 Studio Plus licenses
Four (4) additional Retargeter 5.0 Studio Plus licenses
Ultimate Pro Complete:
Three (3) additional Analyzer 3.0 Studio Plus licenses
Five (5) additional Retargeter 5.0 Studio Plus licenses
Two for One on Network Licenses: All Network licenses of Analyzer, Retargeter, and Live will include two seats/users. Existing owners of Server licenses can upgrade by simply updating their support subscription fee.
Original article published by fxguide and authored by Mike Seymour – January 28, 2018
The launch of Elton John’s Farewell Yellow Brick Road tour centered around a marquee event at Gotham Hall in New York City and was designed to coincide with the 60th Annual Grammy Awards. The event was also simultaneously transmitted to venues in Los Angeles and London. A VR live stream gave fans from around the world the best seats in the house and a rare opportunity to experience key moments in Elton’s 50-year career.
The event was significant as Elton John announced that this would be his final tour. He intends to stop travelling to spend more time with his family. To mark the event, Spinifex Group produced a landmark VR experience.
Spinifex Group designed a VR experience that would connect Elton’s past with his future. The immersive VR film takes the audience on a journey through Elton’s career. The production faithfully recreates some of Elton’s most iconic performances and memorable achievements. The team used cutting edge motion capture and facial visual effects techniques to enable the audience to experience these moments as if they were actually there.
“From our use of motion control, live action performance, and motion capture, to our choice of 3D packages, renderers, and our 95-GPU / 1.1-petaflop render farm, each technical aspect of this project was considered in the service of fulfilling a vision,” commented Executive Producer Alenka Obal. “First and foremost, our directors had an amazing vision for what this project might be. Our VFX supervisor and sequence leads worked from there to fit the tools to the tasks at hand, making sure that we could realize these visions effectively.”
Obal believes the team found that the stereo spherical VR production toolset for VFX is ever evolving. “Integrating live action and CG in stereo spherical VR is challenging to say the least. Embracing this challenge has deepened our knowledge and certainly piqued our interest for further exploration of VR in our craft.”
Elton’s Troubadour US Debut.
On August 25, 1970, Elton John made his U.S. debut in a legendary six-night sold-out run at West Hollywood’s Troubadour. The VR experience starts by re-creating this historic moment, even though no footage was shot at the time, leaving only some grainy still photographs to work from. The solution was to have a body double for the young Elton and do a digital face replacement.
Although the Spinifex in-house team comes from character animation/photo real rendering backgrounds, this was their first CG face replacement project.
After much consideration and testing, the team decided to film with the ZCam V1Pro, a professional-grade 360° camera system that utilizes nine 190° MFT fisheye lenses with iris control. “Our stitched spherical resolution output was a hefty 7K,” says Obal.
The production consisted of a one-day motion capture shoot in the UK and a two-day live action (VR) shoot in Los Angeles, with a five-day set-up/pre-light schedule.
As Elton John was deeply involved in the project, he agreed to provide original motion capture data that would allow the team to make sure the performance of the ‘digital Elton’ was true to the man himself.
All of the animation work and VR was done in-house at Spinifex, but the team partnered with various specialists, primarily for scanning and motion capture.
“Our partner House of Moves was responsible for the face and body capture seen in the referenced photos above. The body data went into Vicon Blade (as seen above in the top right and bottom left photos). MotionBuilder was used after the shoot to process all the data,” explained Obal.
The team partnered with 4D Max (in the United Kingdom) for the 3D scanning. The production used their rig called the Road Runner, a full-body, high-resolution mobile photogrammetry scanning system. It consists of 162 DSLR cameras, which provide the very high level of detail required for head and body scanning. The system was designed for creating photorealistic digital doubles. “We captured the full set of FACS expressions, which was essential for an expressive 3D animation rig,” explained Obal.
The Road Runner is a mobile full-body and high-resolution head photogrammetry scanning rig that travels as a fully self-contained system inside an American-style RV trailer. The mobile system allows 4DMax to set up in as little as two hours.
Motion capture partner House of Moves created the facial rig that drove the FACS blend shapes via custom controls. The in-house animation team then animated on top of this base rig to refine the capture data. The lead face animators were Kirk Cadrette and Shoghi Castel De Oro. “Our facial animation team fine-tuned the capture data and used hand animation in parts where we wanted to more closely match Elton’s performances at the Troubadour and Dodger Stadium,” adds Obal.
The Creative Lead was Ben Casey. “This isn’t just tech for tech’s sake; the digital assets we are capturing and creating will extend Elton’s magic for generations to come,” he explained. “There will be a clear line in history that will separate stories told inside the frame and the ones that are given dimension and unleashed into the world around us. The ability to capture true three-dimensional representations of a moment in time provides a previously unimaginable sense of presence. Without any of the 3D data that new techniques for image-making provide, we are eliminating our ability to take advantage of all emerging and future formats. Great songs endure the test of time and we believe bringing them to life in this way will enable people in 50 years to discover and experience the full impact of Elton’s music.”
The production pipeline consisted of several different packages and plugins, depending on the scene and the artist’s preference. Particle animation was done in C4D with the MoGraph module. Turbulence FD was used for volumetric fluid simulations and volumetric rendering was done in RedShift for C4D. Some of the additional particle work was done in Nuke as part of the composite.
The final renders were in V-Ray, and then composited in Nuke with CaraVR. The environment was rendered with RedShift and composited in part in Adobe After Effects.
Making of Video
Dodger Stadium 1975
Not only did the team need to create a digital Elton for the intimate Troubadour club, but also for Elton’s famous 1975 stadium spectaculars in LA. On October 25 and 26, 1975, Elton John performed two sold-out shows at Dodger Stadium. He played a 10-song opening set that featured several album tracks before returning in a sequined Dodger uniform for a hit-laden second set that heavily highlighted his No. 1 album of the time, Captain Fantastic and the Brown Dirt Cowboy. His concerts, outrageous costumes and stage antics were already well known, and this concert pushed him to rock legend status. The team decided to reproduce this historic moment in front of a vast 55,000-person crowd for Elton’s Saturday Night’s Alright for Fighting.
Recreating such an iconic moment was not easy, and the team did extensive research and storyboarding to recreate the epic stadium shots.
The stage sequence was a combination of live action greenscreen, face replacement and CG. (Above: a temp test / post viz look at the face replacement).
The final output of the project was delivered at 30fps and 5120 x 5120 resolution; the total piece is six minutes in duration. When the 70-year-old British entertainer revealed his plans at a gala New York event, he said he planned to “go out with a bang” with a global tour. He kicked off that tour with a mini-concert and a Spinifex Group virtual reality presentation about his career, signalling that while he was looking back, he still had one eye on the future.
We’re happy to celebrate our 6th year with all of you and would like to thank you for helping us continue to change the face of motion capture and animation! Going 6 years strong, discover what else turns 6 this year.
Oculus Rift
Initially launched as a Kickstarter campaign in August 2012, the Oculus Rift has since grown to become one of the premier virtual reality headsets and forever changed gaming.
Google Project Glass
Google Project Glass gave us a peek at the future of AR and ushered in the start of a new era of AR that has since given us worldwide sensations like Pokémon Go.
IBM 500-Mile Battery
Using nanotechnology, the lithium-air battery can propel an electric vehicle for 500 miles on a single charge. One of the biggest breakthroughs of the year it was announced, the technology is projected to be incorporated into consumer vehicle designs in 2020.
With a jaw-dropping 0-to-60 time of 2.5 seconds and a top speed of 250mph, the new Ferrari flagship was reborn.
SpaceX Dragon
While technically created in 2010, it was in 2012 that the SpaceX Dragon became the first commercial spacecraft to successfully rendezvous with the International Space Station, marking the beginning of commercial space exploration.
Faceware Technologies is pleased to be returning to the Montreal International Game Summit (MIGS) this December 11-13, 2017 in Montreal, Canada. The summit brings together hundreds of international experts in one of the largest hubs of video game development in the world to share knowledge with the local and international developers attending the show.
Join us this December at Faceware Technologies booth 406 as we dive into the latest in facial animation and facial motion capture technology, including the groundbreaking technology behind Star Citizen’s live player-to-player in-game character chat, in which facial movements are instantly detected and streamed onto the character’s face with stunning realism.
Headed to Montréal and want to join us? We have a limited number of FREE passes to the show to give away. Email us at email@example.com and we may have a free pass for you.
One of the biggest games of the year, the eagerly anticipated Star Wars Battlefront II releases today for PlayStation®4, Xbox One, and PC. Gamers will be brought face-to-face with some of the most iconic characters in the Star Wars galaxy. Heroes from all three eras of the Star Wars universe take part in this epic adventure with characters old and new.
Star Wars Battlefront II features an all-new cinematic story starring Commander Iden Versio in a thrilling narrative unfolding after Return of the Jedi. As Commander of an elite special operations team, Iden and the Inferno Squad complete the missions that no one else can. Fearless, the Inferno Squad fights to restore order in the galaxy and save the Empire in its most desperate hour.
Faceware is proud to have partnered with EA to bring the latest chapter in the Star Wars saga to life. With the latest installment in the Star Wars series, EA’s animation team raised the bar for facial animation, using Faceware’s ProHD Headcam System to enhance the performance capture of their live actors for the new cinematic story. Animators then applied the new level of detail to the characters’ performances using Faceware’s Analyzer 3.0 Studio Plus and Retargeter 5.0 Studio Plus software packages. The result is an impressive set of characters that create the brave new world of Star Wars Battlefront II. See more in a behind the scenes video of the making of Star Wars Battlefront II.
For over a decade, Faceware Tech has pioneered marker-less facial motion capture with our technology and expertise. Our software has been used on Oscar-winning visual effect films, award-winning AAA video games, commercials, music videos, and web series the world over.