What Hollywood is learning from video game mo-cap

Xsens product manager Hein Beute discusses the relationship between the two mediums and their uses of motion capture

How has film technology benefitted video games? What has the games industry learned from Hollywood?
For years, the games industry has chased the film industry in terms of quality. The film industry was the first to cross the Uncanny Valley, developing sophisticated pipelines staffed by artists who paid close attention to rigging and retargeting. In the last couple of years, though, this has changed. Now technology is driven more by the games industry.

Real-time game technology is now used in previs and live production. A great example is the live GDC 2016 demo for Hellblade by Ninja Theory, created using Unreal Engine 4 and real-time mo-cap technologies from Cubic Motion and Xsens. The demo showed the potential for a creative director to direct a virtual scene live, with performers driving digital avatars in real time. It allowed creative decisions to be made live on set so teams could get the best performances possible, and it also showed that this level of sophistication can be achieved affordably.

Game engine makers are now more motivated to achieve film-quality realism and to incorporate tools that are useful to filmmakers and beneficial to the movie production workflow.

Epic and Ninja Theory have done a beautiful job of blurring the boundaries between film and gaming production technology. Their real-time animation performance presented at GDC was a stunning demonstration of what is possible using live motion capture and Unreal. They’ve developed this concept even further by demonstrating that a scene can be shot, captured, edited and rendered to film production quality in minutes instead of days or weeks. 

Access to real-time capture not only saves time and money, but it also preserves the integrity of the actor’s performance. It makes it much easier to capture the subtle nuances of timing, emotion and improvisation. And as the setup is portable and affordable, this integration makes it easy for small teams and indie filmmakers to produce films on par with major studios.

How are game engines like Unreal, Unity and CryEngine gearing themselves towards motion capture? What have been the biggest/most impactful improvements?
Since these engines have their origins in games, that was obviously their immediate focus. Over the years, however, the engine makers have recognised that there are a ton of applications outside the games industry, and so they have become more than game engines.

For instance, the game engines have made a concerted effort to incorporate live mo-cap data as a way to drive real-time avatars. Since the imagery the engines create is so realistic, the mo-cap-to-animation pipeline has to deliver that same level of realism. Retargeting tools and the integration of third-party tools such as IKinema bring a high-level finish to the job.

An early adopter was DJ Skrillex, who in 2011 became the first person to introduce a live motion capture act to his onstage performance using Unreal and Xsens technology. A few recent examples of real-time projects outside the games industry are the live-animated Monopoly broadcast and Björk’s live motion-captured, streamed press conference, which was a world first.

In the Monopoly project, Xsens and Faceware were used to capture the motion. The Unreal game engine then helped put it all together, mixing the motion of Mr. Monopoly with the animated Scottie the dog in the “video crew.” In Björk’s press conference, she wore an Xsens inertial motion capture suit in Reykjavik and appeared as a digital avatar live in front of an audience in London. MotionBuilder was used to apply the mo-cap data to a skeleton, and the bone transforms were then streamed into a character running in Unity to render and provide the final output.
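
To make the streaming step more concrete, here is a minimal, illustrative sketch of one way per-bone transforms can be pushed over a local network each frame. It is written in Python purely for illustration; the bone names, packet layout, port number and UDP transport are assumptions made for this example, not the actual MotionBuilder-to-Unity protocol used on the Björk project.

    import json
    import socket
    import time

    # Illustrative only: a hypothetical per-frame packet of bone transforms.
    # Each bone carries a position in metres and a rotation quaternion (x, y, z, w).
    EXAMPLE_FRAME = {
        "frame": 0,
        "bones": {
            "Hips":     {"pos": [0.0, 0.95, 0.0], "rot": [0.0, 0.0, 0.0, 1.0]},
            "Spine":    {"pos": [0.0, 1.10, 0.0], "rot": [0.0, 0.0, 0.0, 1.0]},
            "LeftHand": {"pos": [0.4, 1.00, 0.1], "rot": [0.0, 0.0, 0.0, 1.0]},
        },
    }

    def stream_frames(host="127.0.0.1", port=9763, fps=60, seconds=1.0):
        """Send one JSON-encoded packet of bone transforms per frame over UDP."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            frame = dict(EXAMPLE_FRAME)
            for i in range(int(fps * seconds)):
                frame["frame"] = i
                sock.sendto(json.dumps(frame).encode("utf-8"), (host, port))
                time.sleep(1.0 / fps)  # pace the stream at roughly the capture rate
        finally:
            sock.close()

    if __name__ == "__main__":
        stream_frames()

On the receiving end, an engine-side script would read each packet and copy the transforms onto the matching bones of the rigged character every frame, with the engine handling rendering and any retargeting for differences in proportions. Sending one frame per UDP datagram suits live performance: a late packet is simply dropped rather than replayed, so the avatar never lags behind the performer.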

Game engines are also gearing themselves towards VR, and adding motion capture takes the realism of a VR experience to a whole new level. When not only your sight but your whole body is immersed, you feel far more a part of the experience, as in the Toyota VR experience designed by the studio Antiloop. VR applications demand the same level of realism as film in order to overcome the Uncanny Valley.

How can motion capture firms work with these engine providers to help develop this? How can they contribute?
The focus at Xsens has always been on ease of use, so integrating with game engines has been relatively easy. We are in close contact with the major game engine makers, and are constantly looking to improve the live and offline integration.

Motion capture quality has improved constantly over the years, just like the game engines have. Xsens MVN has some unique characteristics that make it ideal for working with game engines, beginning with the ability to work anywhere. It can connect to a local wi-fi network and send data to a computer, so game developers don’t need a full studio space to work in. The mo-cap data is recorded, sent wirelessly and then entered into the game engine. We make it as easy as possible.

MVN’s first applications were also in previs and live production. As motion capture quality became better, it was used increasingly in game development and film production, as well as live entertainment. The developers at Xsens are always looking for ways to improve motion capture performance, and we believe that we are close to achieving consistent film production quality in real-time under any circumstance.

What are the next major milestones for game engines/motion capture if we want to catch up with Hollywood?
Alongside high-quality graphics and fluid motion, it is crucial for a tool to be easy to use. If we want to see our tools being used by creative people, they should not be bothered with technical details. It should just work, so they can focus on the creative process, like making creative decisions in real time. The next steps are to make sure the different tools integrate seamlessly. It’s important that live productions can be done by studios of all sizes.

How will this benefit games beyond just enabling better performances?
Sharing technology across the two industries will encourage the sharing of creative methods of storytelling, VFX pipelines and camera techniques, resulting in more engaging content.

As the technology improves, does the standard of acting/writing in video games need to rise? What’s the best way of accomplishing this?
As the technology improves and becomes easier to use, it will become more of a tool in the hands of creative people rather than demanding attention itself; it won’t be a distraction. The attention of the whole crew will be on the performance and the storylines. This can be seen in the Hellblade demo and in the movie Ted, where the mo-cap was done on stage so the actors could interact naturally and get the timing right, which is crucial for comedy.

Are improvements in mo-cap technology and game engines making the tech more accessible to smaller, low-budget studios?
Improvements are measured not just in higher quality but also in ease of use. Creating pipelines that follow a clear standard makes it easy for smaller studios to adopt the technology. Game engines are extremely competitively priced, and we offer special pricing programs for smaller studios, all of which makes the technology more accessible. We are currently working with smaller studios to come up with an Indie Program.

Smaller studios can now accomplish what larger studios are doing with less space, less equipment, and in a shorter time. Inertial motion capture technology affords smaller studios the luxury of capturing anywhere they want without having to rent out a stage.

Any further examples of how the technology is being used in other forms of entertainment?
A very exciting project that I would like to mention is a show that brings together the latest in technology and one of the oldest performing arts. William Shakespeare’s The Tempest will be performed by the Royal Shakespeare Company this November in Shakespeare’s hometown, Stratford-upon-Avon.

The play is filled with magic and wonder, and thanks to a unique partnership with Intel, the RSC and The Imaginarium, audiences will see The Tempest as Shakespeare himself imagined it, complete with magical beings capable of impossible actions. The performance uses today’s most advanced technology, including the Xsens MVN motion capture system, in a bold reimagining of Shakespeare’s magical play, creating an unforgettable theatrical experience.

Motion capture is moving beyond games and movies – or you could say it has circled back to the origin of performing arts.
