Faceware’s Suite of Technology Instrumental in ‘L.O.R.D.’ Premiering Tomorrow


Intro from FTI – We’re thrilled to see this exciting film launch tomorrow. Faceware Technologies has been intimately involved in the creative endeavors on this production. From on-site expertise by our own subject-matter experts, to hardware, to software, Faceware Technologies is proud to be the facial motion capture technology selected for use on this epic journey.

“We believe the best facial animation comes from the combination of cutting-edge technology and an intuitive, artist-friendly workflow. By embracing artists’ passions and providing them with easy-to-use products, we have become the gold standard for facial motion capture and animation tools,” says Peter Busch, Faceware Tech VP of Business Development. “When the studio reached out to us, they already had the motion-capture technology pioneered by James Cameron’s Avatar lined up. What they needed was a facial motion capture solution that would integrate seamlessly. With our Creative Suite of products, we offered an end-to-end solution that worked at the speed of their production.”

Our congratulations to Le Vision Pictures and the creative team at Original Force on this monumental release!

Lionsgate to Release Chinese CGI Fantasy Film ‘L.O.R.D.’ in North America (Exclusive)

The Hollywood Reporter

by Patrick Brzeski

Directed by Chinese hitmaker Guo Jingming, the Le Vision Pictures film stars Fan Bingbing and Kris Wu and is presented entirely in motion capture CG.

Lionsgate is importing some cutting-edge Chinese young adult fantasy to North America.

The studio will release 31-year-old Chinese filmmaker Guo Jingming’s fantasy feature L.O.R.D., starring Fan Bingbing, in cinemas across the U.S. and Canada on Sept. 30. The movie will open in China on the same day.

“Lionsgate appreciated the film’s potential for YA audiences around the world,” said a representative from Le Vision Pictures, the movie’s producer, which is handling distribution in Greater China. “They felt it fit in well with their track record of success with films in this genre, such as the Hunger Games and Twilight franchises.”

Lionsgate is understood to be a minority investor in the film and will also bring it out in the U.K., Ireland, Australia and New Zealand on Sept. 29. The movie is also set for release in Singapore and Thailand on Oct. 13, followed by Myanmar on Nov. 4. Le Vision declined to share the number of screens on which L.O.R.D. would open in the various territories.

An acronym for “Legend of the Ravaging Dynasties,” L.O.R.D. is based on a best-selling series of YA novels written by Guo and released in 2010.

A writer-turned-director-turned-media mogul and celebrity, Guo is the creative force behind China’s Tiny Times film franchise, which chronicles the lives of aspiring, high-fashion-obsessed young women in Shanghai. Produced on small budgets and released at a rapid clip between 2013 and 2015, the four films in the Tiny Times franchise collectively earned $291 million in China alone. The commercial feat was all the more impressive given that Guo had no directing experience prior to working on the first movie in the series.

Along with Fan, the new film features a slew of China’s new young stars, such as Kris Wu, Yang Mi, Chen Xuedong and Wang Yuan.

“The foundation of this strategic partnership between Le Vision Pictures and Lionsgate is that we both want to help young Chinese producers and directors move forward into the global market, bringing the best original content to Generation Z (young people born after 1995),” said a rep for Le Vision.

L.O.R.D. also presents a technical first for the Chinese industry: although it features many of China’s most famous faces, the film was shot and produced entirely in CG using motion-capture technology pioneered by James Cameron’s Avatar.

The story is set in a world of mysterious sorcery split between four nations: Water, Wind, Earth and Fire. It follows the adventures of the young disciples of the seven top sorcerers who preside over and fight evil forces within this imaginary realm.

 

 

Faceware Helps Bungie Bring New Faces to Destiny: Rise of Iron


Los Angeles – Sept. 20, 2016 – Faceware Technologies, the leading innovator and most experienced provider of markerless 3D facial motion capture solutions, today announced that video game developer Bungie, Inc. has used Faceware’s ProHD Headcam System, along with its Analyzer 3.0 and Retargeter 5.0 Studio Plus software packages, to enhance the facial expressions and emotions of Lord Saladin and the new Iron Lords in Destiny: Rise of Iron.

Destiny: Rise of Iron is a major expansion for Bungie’s first-person shooter, Destiny. Releasing today for PlayStation®4 and Xbox One, Destiny: Rise of Iron features an all-new cinematic story campaign set within The Plaguelands, a brand new location on Earth. Under the command of Lord Saladin, players will face a new faction of Fallen Devils, the Splicers, while unraveling the mystery of the Iron Lords. Rise of Iron features new weapons, armor, and gear, as well as a new cooperative three-player Strike, a new mode and maps for the Crucible competitive multiplayer, and an all-new six-player cooperative Raid.

Destiny: Rise of Iron also brings new levels of photorealism to the game characters in the new story campaign and cutscenes. Faceware Technologies’ markerless motion capture system was previously used on Bungie’s Destiny. With this expansion, Bungie’s animation team upped the ante, using Faceware’s ProHD Headcam System to enhance the performance capture of their live actors. They then applied that new level of detail to the performance of the Iron Lords using Faceware’s Analyzer 3.0 Studio Plus and Retargeter 5.0 Studio Plus software packages. The result is a visually stunning set of characters that invest players ever more deeply in the world of Destiny.

“We’ve worked closely with Bungie over the years and they never cease to strive for the very best. That commitment shows in the characters they’ve created for Destiny: Rise of Iron,” said Pete Busch, VP of Product Development at Faceware Technologies. “All of us at Faceware are honored that Bungie has chosen our products yet again to bring their game characters to life.”

Faceware’s software products identify the movement of an actor’s face from video and apply that movement to a computer-generated character. Together with its head-mounted and stationary cameras, Faceware’s technology is being used successfully in award-winning movies like The Curious Case of Benjamin Button and The Walk, and top-grossing games like Grand Theft Auto III-V, NBA 2K10-2K16, Destiny, Batman: Arkham Knight, Call of Duty: Advanced Warfare and DOOM.

For more information, visit www.facewaretech.com or contact sales@facewaretech.com.

Information on Destiny: Rise of Iron can be found at Bungie’s official website: https://www.destinythegame.com/ca/en/rise-of-iron

###

About Faceware Technologies

Faceware Technologies Inc. (FTI), established in 2012 after years as part of leading facial tracking and augmented reality company Image Metrics, is dedicated to meeting the needs of professional animators in the video game, film, television, and commercial industries. The company’s Faceware Facial Motion Capture product line has been utilized in the production of hundreds of video game titles, feature films, music videos, commercials, television shows, and stage plays, and is the leading facial animation solution provider for clients such as Double Negative, Digital Domain, Blur Studios, Activision-Blizzard, Rockstar Games, Microsoft, 2K Sports, Electronic Arts, Ubisoft, Sega, Sony, Bethesda, Motion Theory and Moving Picture Company. Faceware’s product line consists of the Faceware GoPro and Pro HD Headcam and Tripod Capture Systems; Faceware Analyzer, which allows clients to analyze and process their own performance videos; Faceware Retargeter, an Autodesk plugin which allows users to create facial motion capture data at a much faster rate than traditional methods; and Faceware Live, the real-time facial capture and animation solution. www.facewaretech.com

About Bungie

Bungie was founded in 1991 with two goals: develop kick ass games that combine state-of-the-art technology with uncompromising art, captivating storytelling, and deep gameplay, and then to sell enough copies to fund our ongoing quest for World Domination. Over the past twenty-five years, Bungie created a bunch of fun games, including the Halo franchise, the Marathon Trilogy, and the first two Myth games. Our independent, employee-owned development studio is located in Bellevue, Washington, the base where we launched our most ambitious project to date: Destiny.

More information on Bungie can be found at www.bungie.net.

ANIMATION VERTIGO PARTNERS WITH WORLD-RENOWNED FACEWARE TECHNOLOGIES INC.



Industry Leading Motion Capture Firm, Animation Vertigo, Adds Powerhouse Faceware Tech To Growing Roster of Best-In-Class Vendors

LOS ANGELES – September 12, 2016 – Animation Vertigo, a U.S.-based outsource management company that provides high quality motion capture animation to entertainment industry leaders, is proud to announce its partnership with Faceware Technologies Inc., the leading innovator and most experienced provider of markerless 3D facial motion capture solutions. Faceware Tech joins a growing roster of Animation Vertigo’s hand-picked and vetted partners, further establishing the company’s stellar reputation as a leader in the motion capture industry with access to best-in-class partners and vendors.

“Faceware Tech is one of the most respected firms in the industry that has pioneered marker-less facial motion capture for games, films, commercials and movies,” said Marla Rausch, CEO of Animation Vertigo. “Through our strategic alignments, we are continuing to stay cutting edge in the industry and look forward to delivering unparalleled motion capture animation services to our clients with these partners by our side.”

Based out of Nevada, California and Texas, Faceware Tech is one of the leading providers of facial motion capture solutions in the industry. The company’s facial motion capture software and hardware have been used in hundreds of video games, commercials, music videos, feature films and stage play productions.

“Our decision to work with Marla and her team at Animation Vertigo was a natural fit,” said Peter Busch, VP of business development for Faceware Tech. “We have a number of exciting projects in the pipeline, and Animation Vertigo’s client roster is one of the best in the motion capture world. We look forward to leveraging this partnership and continuing to raise the bar for our clients.”

The motion capture industry is constantly evolving, and Animation Vertigo is at the forefront with new projects and innovations. To stay up to date with the latest on Animation Vertigo, please visit the company’s Facebook page or website.

About Animation Vertigo

Animation Vertigo is an outsource management company that provides high quality and reliable solutions for motion capture and animation needs. The company’s production center is located in Manila, Philippines, tapping into tremendous talent, resources and artists. Animation Vertigo has an unmatched reputation for delivering professional, experienced and timely motion capture solutions to its clients. Visit www.animationvertigo.com for more information.

About Faceware Technologies

Faceware Technologies Inc. (FTI), established in 2012 after years as part of leading facial tracking and augmented reality company Image Metrics, is dedicated to meeting the needs of professional animators in the video game, film, television, and commercial industries. The company’s Faceware Facial Motion Capture product line has been utilized in the production of hundreds of video game titles, feature films, music videos, commercials, television shows, and stage plays, and is the leading facial animation solution provider for clients such as Double Negative, Digital Domain, Blur Studios, Activision-Blizzard, Rockstar Games, Microsoft, 2K Sports, Electronic Arts, Ubisoft, Sega, Sony, Bethesda, Motion Theory and Moving Picture Company. Faceware’s product line consists of the Faceware GoPro and Pro HD Headcam and Tripod Capture Systems; Faceware Analyzer, which allows clients to analyze and process their own performance videos; Faceware Retargeter, an Autodesk plugin which allows users to create facial motion capture data at a much faster rate than traditional methods; and Faceware Live, the real-time facial capture and animation solution.

 

# # #

Faceware Presenting at External Development Summit (XDS) 2016


XDS 2016 – Vancouver, Canada

Faceware is honored to be presenting with Electronic Arts and Animation Vertigo at XDS 2016. The joint presentation will discuss EA’s facial animation pipeline, covering the evaluation and migration of the performance capture, animation, and outsourcing workflows related to the face pipeline.

Agenda Excerpt

11:30am-12:00pm | Tech Innovation Rapid Fire: EA & Faceware – Automating Facial Animation | Main Stage

Peter Busch (Faceware), Greg Wellwood (The Capture Lab), Marla Rausch (Animation Vertigo) & Sam Mynott (The Capture Lab)

The talk will serve to celebrate and examine the collaboration between EA and Faceware Technologies in establishing a high-end facial animation pipeline through demonstrations of real-world examples. Discussion points will be the identification of the pipeline and known production challenges, and the eventual migration of every facet involved in the pipeline, including motion capture, performance capture, rigging, batch automation, and the inclusion of one of their key third-party outsource studios, Animation Vertigo. The talk will illustrate how EA’s automation wrapper dramatically increases throughput to meet the demands of their many ongoing productions.

For more information, check out the complete agenda for XDS 2016.

Faceware Co-Sponsors We Are Code: VR Hackathon


We’re proud to be a sponsor of We Are Code: VR Hackathon together with companies like Oculus this Labor Day weekend!

Event Details

Register now for the Dhat Stone Academy/Priority We Are Code VR Hackathon! We are enlisting students in Compton to compete with students in Oakland. They will engage in a computer coding competition to create the best mobile apps, virtual reality games, and website ideas that solve a social or economic problem in their communities.

Film actor, dancer and singer Ben Vereen will be on-site in Compton, California on both competition days to entertain and give an inspirational talk.

There will be virtual reality technology workshops by NanoVR, and animation development workshops from Grammy Award-winning animator Leo Sullivan, producer of Fat Albert and the Cosby Kids.

At the We Are Code Virtual Reality Hackathon, students will also experience Exhibition Virtual Space, a futuristic glass structure with waterfalls, grassy hills, and lakes. Compton and Oakland students will communicate and interact through a fully immersive virtual reality experience.

We are thrilled to announce that Oculus will be sponsoring the prizes for the first-place winners with VR gear and other cool stuff!

Winners will receive cash prizes, T-shirts, Google VR Cardboard, gadgets, free coding classes, and a chance to turn your winning app idea into a business!

Join us!

 

WHERE
NASA Columbia Memorial Space Center – 12400 Columbia Way, Downey, CA 90242 – View Map

Faceware Roundtable Discussion: Visuals of the Future


Develop

By Matthew Jarvis

We’re fewer than three years into the latest console generation, but already another major graphical revolution is afoot as VR, mobile advancements and superpowered versions of the PS4 and Xbox One push fidelity forwards. Matthew Jarvis asks Epic, Unity, Crytek, Faceware and Geomerics what’s next?

Alongside Apple’s product range and the Sugababes line-up, computer graphics are one of the quickest-changing elements in modern culture. Games from even a few years ago can appear instantly dated after a new shader or lighting model is introduced, making it crucial for devs to be at the forefront of today’s visual technology.

Yet, as with all of development, taking full advantage of console, PC and mobile hardware comes at a cost – a price that continues to rise with the growing fidelity of in-game environments and latest methods being used to achieve them.

“Triple-A games with fantastic graphics are becoming more and more complex to develop and require increasingly larger and larger teams and longer development time,” observes Niklas Smedberg, technical director for platform partnerships at Unreal Engine firm Epic Games. “This means that it is more and more important to work efficiently and smartly in order to avoid having your team size or budget explode.

“Technologies that have stood out from the rest include various new anti-aliasing techniques like temporal AA, which has made a huge impact on visual quality.”

Unity’s Veselin Efremov, who directed the engine firm’s Adam tech demo, believes “the latest push is around adding layers of depth and evocative power to environments and virtual worlds”.

“There are things such as screen space reflections, and real-time global illumination,” he says. “It can all be done with a relatively low memory footprint.

“Meanwhile, physically-based rendering is at its core a new approach to the lighting pipeline, simulating the natural way light and real-world materials interact. PBR shines when you’re pursuing realism.”

It is important to work efficiently to avoid having team size or budget explode.

Supporting the latest methods of mastering specific optical elements are new underlying APIs that are propelling the entire sector forward.

“The expanding performance budget of each platform means that every generation developers can build larger worlds with more geometry, richer textures and more advanced effects,” says Chris Porthouse, GM of lighting specialist Geomerics. “The advent of thin APIs such as Metal and Vulkan is giving developers access to finer control of the performance potential of these devices.”

Although the inflating scale and ambition of titles has led to a need for greater investment in graphical technology, Dario Sancho-Pradel, lead programmer at CryEngine outlet Crytek, highlights the advancements in making tools more accessible and easier to deploy.

“Recent technologies are helping character and environment artists to produce more complex and interesting assets in a shorter amount of time,” he says. “Some examples that come to mind are procedural material generation, multi-channel texturing and photogrammetry.”

Peter Busch, VP of business development at Faceware, concurs that “content creation tools are getting easier to use, less expensive, and now support multiple languages”.

“That means that more people will have access to the hardware and software needed to create good facial performances,” he says. “That, in turn, means facial animation, whether pre-rendered or rendered in real time, will appear in more indie games.”

develop_smallmighty

SMALL BUT MIGHTY

For decades, it was a widespread belief that indie studios couldn’t hold a candle to the graphical might of triple-A powerhouses. The last few years have destroyed this myth, as the increasing power and falling price of development software have closed the gap, allowing devs including The Chinese Room and Frictional Games to produce outstandingly good-looking titles such as Everybody’s Gone To The Rapture and Soma.

“Development tools have advanced tremendously in the past several years,” says Efremov. “There’s really very little stopping a creator from bringing their vision to market; it takes less time and fewer resources than ever before.

“What really distinguishes big triple-A productions is content and an emphasis on realistic graphics. There are ways smaller studios can replicate triple-A quality, such as PBR-scanned texture set libraries, terrain creation tools, automated texturing tools and markerless motion capture solutions.”

Smedberg offers a reminder that a strong style can be as effective as pure graphical horsepower.

“Use something you feel comfortable with and that lets you be most productive,” he advises. “Choose an art style that fits you best and where you can excel. All developers should take the time to use graphics debuggers and profilers. The more you can learn about the tools and your rendering, the better your game will look.”

Porthouse agrees that “workflow efficiency is everything to smaller studios”.

“Any tool that can decrease the time it takes to perform a particular process saves money and leaves developers free to improve other areas of the game,” he explains.

Sancho-Pradel says: “Small studios need to be particularly pragmatic. If they use a third-party engine, they should be aware of the strengths and pitfalls and design their game around them. Using techniques such as photogrammetry, procedural material generation and multi-channel texturing is affordable even in low-budget productions and can significantly increase the quality of the assets if the art direction of the game is aligned with such techniques.”

Each year the challenge of delivering visibly better graphics increases and workflow efficiency is the key to success.

CONSOLE CALIBRE

As thread upon thread of ‘PC Master Race’ discussions attest, the trusty desktop has long been considered the flagbearer for bleeding-edge graphics, with the static hardware of consoles acting as a line in the sand that visuals can push only so far beyond. This could all be set to change, as Xbox’s newly announced Project Scorpio and the PS4 Neo look primed to blur the lines between console generations, in line with the continually evolving nature of PC.

“As hardware manufacturers produce new platforms, development teams will happily eat up the extra performance budget to generate higher fidelity visuals,” Porthouse predicts. “Yet step changes in hardware do not come annually and for every release a franchise’s studio needs to demonstrate graphical improvements even when new hardware is not available.”

John Elliot (pictured), technical director of Unity’s Spotlight team, suggests: “For large studios, there is a great opportunity to stand out by pushing the new hardware to the maximum. For smaller studios, more power provides opportunities in different areas. Many will find creative ways to drive interesting and new visuals.”

Among the headline features of the Xbox Scorpio are high dynamic range and the ability to output at 4K resolution. With many titles yet to achieve 60 frames per second at 1080p on console, will these new display options even be of interest to devs?

“This was the only option they had to move forward,” Smedberg proposes. “HDR requires new display hardware and 4K requires more performance. While more performance is always better, I am concerned about fragmentation – that consoles will end up like PCs, with tons of different performance and feature levels that developers have to figure out how to make our games play nice with.”

Elliot backs the introduction of HDR and 4K as “interesting features that absolutely should matter to devs”.

“While the jump to 4K is much less than that from SD to HD TV, it is still something that gamers will soon come to expect,” he forecasts. “You just have to look at how quickly 1080p became the standard for TV to know that there will be huge demand for 4K content.

“HDR really does provide an amazing boost in visual quality, but it is much harder to visualise what this means until you have actually seen it. I’m sure it will quickly become a standard requirement and, as with any new hardware, there is a great opportunity for the first games to support it to get noticed in the marketplace.”
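Elliot’s comparison of the two resolution jumps can be sanity-checked with simple pixel arithmetic (the figures below use the common SD, HD and consumer 4K broadcast standards):

```python
# Pixel counts for the common broadcast standards.
sd = 720 * 480        # NTSC-style SD
hd = 1920 * 1080      # "Full HD" 1080p
uhd = 3840 * 2160     # consumer "4K" UHD

print(f"SD -> HD: {hd / sd:.0f}x more pixels")    # 6x
print(f"HD -> 4K: {uhd / hd:.0f}x more pixels")   # 4x
```

So the HD-to-4K jump (4x the pixels) is indeed smaller than the SD-to-HD jump (6x), even though both are substantial increases in fill work.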

THE REALITY OF VR

As well as HDR and 4K, a major turning point with Scorpio and Neo is improved performance for VR hardware. With good reason: the nascent medium is reliant on flawless performance. It’s a challenge that remains so even with the almost limitless potential of PC, as devs work to balance realistic visuals with hitch-free delivery.

“Photorealism is not essential for VR, but if one of your goals is to immerse the player in a highly realistic world, then the game will have to use a highly optimised rendering pipeline in order to hit the framerate required by the VR platform while rendering in stereo high-detailed models, thousands of draw calls and complex shaders,” observes Sancho-Pradel. “DirectX12 and Vulkan are designed to give engineers much more control in terms of memory, command submission and multi-GPU support, which can translate into more optimal rendering pipelines.”

Busch echoes that absorbing players into a VR experience is vital for devs – and visuals play a major role in doing so. 

“Without believable, immersive experiences, games will face an uphill battle in VR,” he cautions. “In a fully immersive environment people are ‘in’ the content – which only means that the details, framerate and level of engagement are a daunting task. Mix that with a quagmire of VR hardware, and it is a difficult landscape to develop in – to say the least.”

Smedberg drives home the point that performance is virtual reality’s unavoidable challenge, and offers technical advice to devs looking to perfect their game’s smoothness.

“The performance requirement for VR is critically important,” he confirms. “If you miss your target framerate here and there in a regular game, you might have a few unhappy customers. If you miss your target framerate in a VR game, your customers may not just stop playing your game – they may stop playing VR altogether.

“With VR you have to design your game graphics with a lower visual fidelity in mind, because you have twice as many pixels to fill and need to have it all rendered in less than 10ms – instead of 16 or 33ms. Many tried and proven rendering techniques won’t work, like sprite particles, because in VR you can see that they’re just flat squares.

“Better technology can help; faster and more efficient rendering APIs could help your game run much faster on the CPU – you can make more drawcalls and spend more time on AI or physics. GPUs could take advantage of similarities in the shader between the two eye views and run faster on the GPU.”
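The millisecond figures Smedberg quotes are just the reciprocals of common refresh rates. A quick calculation shows where the budget pressure comes from (the per-eye panel size below assumes a Rift/Vive-class 90 Hz headset):

```python
# Per-frame time budget in milliseconds at a given refresh rate.
def frame_budget_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (30, 60, 90):
    print(f"{hz:>2} Hz -> {frame_budget_ms(hz):5.1f} ms per frame")
# 30 Hz -> ~33.3 ms, 60 Hz -> ~16.7 ms, 90 Hz -> ~11.1 ms
# (compositor and timewarp overhead eat into the 11.1 ms, hence "<10 ms").

# Stereo rendering also fills more raw pixels than a single 1080p view:
vr_pixels = 2 * 1080 * 1200   # both eyes, 1080x1200 per-eye panels
flat_pixels = 1920 * 1080
print(f"{vr_pixels / flat_pixels:.2f}x the raw pixels of 1080p")  # 1.25x
# Engines typically supersample well above panel resolution, which is
# where the "twice as many pixels" rule of thumb comes from.
```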

There is a great opportunity for the first games to support HDR to get noticed in the marketplace.

RESOLUTION REVOLUTION

With a new semi-generation of consoles around the corner, virtual reality continuing to redefine long-established development philosophy and technology always set to take the next major graphical step, what can developers expect to rock the visual world?

“Lighting continues to be one of the most effective, emotive and evocative tools in an artist’s bench,” suggests Efremov. “Real-time global illumination, physically-based materials, lights, and cameras/lenses will all carry us much closer to worlds and environments that truly replicate our own.


“Machine learning is such an exciting new area of research, already put to practical use in so many other industries. The possibilities in the future are countless; simulations for populating and simulating game worlds based on photographic reference, expanding animation performances, creating 3D assets, altering the visual style of a game, new levels of AI and so on.”

Porthouse echoes Efremov’s sentiment that lighting enhancements will be one of the key drivers helping to refine visual fidelity.

“As teams perfect the graphics of their increasingly large world games, lighting and weather effects will play an important part,” he says. “In two to three years’ time all exterior environments will be lit with a real-time day/night cycle with believable and consistent bounced lighting, and gamers will be able to seamlessly transition between indoor and outdoor scenes. Each year the challenge of delivering visibly better graphics increases and workflow efficiency is the key to success.”

Smedberg (pictured) highlights hardware evolution as a strong foundation for devs to expand their graphical ambitions.

“We may see PC graphics hardware that can run at double speed, using 16-bit floating point instead of 32-bit floating point,” he forecasts. “This doubling of performance comes in addition to any other performance increases, like more cores and higher clock frequencies.

“On the rendering technology side, I’m looking forward to using some forms of ray-tracing – not traditional full-scene ray-tracing, but as a partial tool for some specific rendering features.”
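The fp16 trade-off Smedberg describes is easy to demonstrate with Python’s standard library alone: the `struct` module can pack half-precision (`'e'`) and single-precision (`'f'`) floats, showing both the halved storage and the reduced precision.

```python
import struct

# Pack the same value as a 32-bit ('f') and a 16-bit ('e') float.
print(len(struct.pack('f', 1.0)))  # 4 bytes
print(len(struct.pack('e', 1.0)))  # 2 bytes -- half the memory traffic

# Round-trip through fp16 to see the precision loss: near 1.0 the
# format can only resolve steps of about 0.001, so small offsets vanish.
def to_fp16(x: float) -> float:
    return struct.unpack('e', struct.pack('e', x))[0]

print(to_fp16(1.0004))  # 1.0 -- the 0.0004 offset is below fp16 resolution
print(to_fp16(1.3))     # ~1.2998 -- the closest representable value
```

This is why half precision tends to be fine for colors and lighting terms but risky for world-space positions, where small errors are visible.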

Smedberg concludes by reiterating that while pure power is sure to drive visual punch onwards, its benefits are voided without strong design to match.

“Graphics technology is so capable and powerful today that developers have an incredible freedom to choose whatever art style they like,” he enthuses. “Creators can and should choose a style that they feel passionate about and that fits their gameplay and budget goals.”

How Hasbro Animated Mr. Monopoly’s Facebook Live Broadcast in Real Time



Facebook Live is proving to be a powerful tool for sharing moments in real time — but what happens when the user is actually a cartoon? Hasbro recently used a trio of technologies to allow an animated Mr. Monopoly (the game’s top-hat-wearing, mustachioed mascot) to share a live announcement, interact with commenters, and answer questions in real time.

Last week’s live video, which announced the Monopoly Ultimate Banking Game that replaces paper money with bank cards, used not one but several technologies to create the animation in real time. The video included facial expressions as well as full-body movements and personalized interaction with viewers.


“In the Monopoly Facebook Live broadcast, Mr. Monopoly was brought to life with the help of a combination of physical motion capture, facial motion capture, and real-time rendering,” Rebecca Hollander, the Monopoly brand’s director of marketing, told Digital Trends in an email. “What you see and hear from Mr. Monopoly in the video is completely live and the character also spoke directly to fans, answering questions and acknowledging their comments during the broadcast.”

Mr. Monopoly’s whole-body movements were created using a wearable motion capture system from X-Sens. The system captured the performer’s movements using a series of sensors, including arm and leg bands and a cap. As the actor moved around inside Hasbro’s Cake Mix Studios, the animated Mr. Monopoly moved around the cartoon stage.

A wearable created the larger motions, but what about the facial expressions? Faceware Live 2.0 was used to analyze a video of the announcer and translate the performance onto the animated character. The system quickly calibrates the person’s facial features, identifying where everything from the lips to the eyebrows is. It then continues to follow those facial markers, translating them to the animated character. Unlike the wearable, the facial expressions are captured with video and a computer system.
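The calibrate-track-retarget loop described above can be illustrated with a toy sketch. This is not Faceware’s API — the landmark names, pixel scales, and control names below are hypothetical — but it shows the essential idea: tracked 2D landmark positions are compared against a neutral calibration pose, and the deltas drive named rig controls.

```python
def retarget(landmarks: dict, calibration: dict) -> dict:
    """Map landmark deltas (in pixels) to 0..1 rig-control weights."""
    weights = {}
    # Jaw drop: vertical gap between the upper and lower lip landmarks,
    # measured relative to the same gap in the neutral calibration pose.
    mouth_gap = landmarks["lower_lip_y"] - landmarks["upper_lip_y"]
    neutral_gap = calibration["lower_lip_y"] - calibration["upper_lip_y"]
    weights["mouth_open"] = min(max((mouth_gap - neutral_gap) / 40.0, 0.0), 1.0)
    # Brow raise: how far the brow moved up from its neutral height
    # (image y grows downward, so "up" means a smaller y value).
    brow_lift = calibration["brow_y"] - landmarks["brow_y"]
    weights["brow_raise"] = min(max(brow_lift / 20.0, 0.0), 1.0)
    return weights

# One calibration pose and one tracked frame, in pixel coordinates.
calibration = {"upper_lip_y": 300, "lower_lip_y": 310, "brow_y": 200}
frame = {"upper_lip_y": 298, "lower_lip_y": 330, "brow_y": 192}

print(retarget(frame, calibration))
# mouth gap grew by 22 px -> mouth_open 0.55; brow rose 8 px -> brow_raise 0.4
```

A real system tracks dozens of landmarks per frame and solves for many more rig controls at once, but the per-frame structure — calibrate once, then map deltas to weights — is the same.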

While X-Sens and Faceware were used to capture the motion, the Unreal game engine helped put it all together, mixing the motion of Mr. Monopoly with the animated Scottie the dog in the “video crew.”

“A challenge with using animation with a live broadcast is that there are many layers of technology that have to work together to deliver a quality final product,” Hollander said. “There are also many moving parts that come together to make it work, for example, creating an engaging script and understanding how facial and body movements translate into an animated character on screen. It was also important for Mr. Monopoly to answer questions and engage in real time to give an authentic experience to our fans.”

Hollander said the 80-year-old brand is continually looking for ways to stay fresh and relevant — and Facebook’s live feature was a good way to do just that while engaging with fans in real time.

Along with the release of the new game with bank cards, the live announcement also included a new sweepstakes where participants can win up to $20,580 (the amount of cash inside a traditional Monopoly game).


Faceware Tech Adds to the Magic in Kingsglaive: Final Fantasy XV

By | News | No Comments

It’s always a thrill to see our technology used in productions. Check out the official trailer and intro clip of Kingsglaive: Final Fantasy XV from Sony Pictures Home Entertainment.

And the Official Teaser Trailer…

Computer Graphics World Selects Faceware to SIGGRAPH “Best of Show” List

By | News | No Comments

The excitement and hustle and bustle of SIGGRAPH 2016 is now behind us. After taking a bit of time to review all the information collected from the show, the staff is ready to name the winners of the CGW Silver Edge Awards, given to companies whose products and vision have the potential to greatly impact the industry.

This year, after much consideration, the following companies and their respective technologies have earned the designation of best of show at the 43rd annual SIGGRAPH conference and exhibition:

AMD’s Radeon Pro WX Series: Professional graphics cards that are optimized for open-source software and tuned for the demands of modern content creation. Also, AMD’s Radeon Pro SSG: A transformative solution that will offer greater memory capacity for real-time postproduction.

Autodesk’s Maya 2017: Packed with many new features, including the Arnold renderer which was recently acquired by Autodesk, as well as new tools for animators, updates to the motion graphics toolset, and more.

Faceware’s Faceware Interactive technology: This new interactive division is focusing on hardware and software that will enable characters to interact with people in real time (ideal for controlling characters in VR).

Maxon’s Cinema 4D R18: A big new release featuring enhancements to the MoGraph toolset, including the new Voronoi fracture system, object tracking within the motion tracker, and more.

Meta Company’s Meta 2: Augmented-reality headset that may give Microsoft’s HoloLens a run for its money.

Nvidia’s Quadro P6000: A VR graphics card said to be the fastest pro card available, harnessing 3,840 CUDA cores. Kudos also for making Mental Ray available directly from Nvidia, running on the GPU, and for its SDK updates, particularly the VRWorks 360 SDK.

Pixar’s RenderMan 21: Touted as the biggest RenderMan release in years, V21 gives users access to some of the same technology used on the studio’s latest feature film, including access to shaders and light types. Also, Pixar announced the open-source release of Universal Scene Description (USD), technology used for the interchange of complex 3D graphics data through various DCC tools.

The Foundry’s Cara VR: A VR plug-in toolset for Nuke. Also, the company should be applauded for shipping Katana to Windows as well as Linux.

WorldViz’s VizConnect: For enabling multi-platform use of VR across multiple systems, including the Oculus Rift and HTC Vive.

Special recognition: While not all of the game engine companies had a presence on the show floor, three in particular deserve special recognition for the work on their engines in advancing real-time graphics and interactivity: Epic Games, Unity Technologies, and Crytek.


3D Animation Workshop Offered by CGSociety

By | News | No Comments

Our friends over at CGSociety are offering an amazing animation course we wanted to share:

Character Facial Rigging for Production

Course Instructor – Wade Ryer

Wade Ryer has over 10 years of experience in animated feature films. His credits include How To Train Your Dragon 2, Rise of The Guardians, Kung Fu Panda 2, The Tale of Despereaux and The Ant Bully. He has worked for such studios as DreamWorks Animation, Method Studios, Framestore, Electronic Arts, DNA Productions and ReelFX Creative Studios.

Course Overview

Learn how to rig character faces and bring them to life with Wade Ryer.

Upon completion of this eight-week CGWorkshop, you will have a full understanding of how to create a functional, animator-friendly character face rig ready for its close-up on the big screen! Wade will give you guidance along the way and industry-level tips and tricks, not to mention personal feedback on your character’s facial rig.

CGSociety Animation Course

Note: The class will be conducted via forum replies, both for assignments and for student Q&As.