One of the animation industry’s most disruptive companies came to Edinburgh today to offer a glimpse into a future where player and game character bind together as one.
Image Metrics, an American-owned British animation firm, is building an in-house camera-based program that can recognise even the most nuanced human facial movements. The captured data is converted into animation parameters that drive an on-screen 3D character in real time.
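The pipeline described above — tracking facial movement from a camera feed and retargeting it onto a character rig — might be sketched roughly as follows. This is an illustrative guess at the general approach, not Image Metrics' actual method; the landmark names, rest pose, and mapping to blendshape weights are all invented for the example.

```python
# Illustrative sketch of camera-to-character facial retargeting.
# All landmark names and mappings here are hypothetical -- Image
# Metrics has not published how Live Driver works internally.

# Neutral-face landmark positions (normalised camera coordinates,
# with y increasing downward as in typical image coordinates).
REST_POSE = {
    "mouth_corner_l": (0.40, 0.60),
    "mouth_corner_r": (0.60, 0.60),
    "brow_l": (0.40, 0.30),
}

def retarget(tracked, rest=REST_POSE, gain=10.0):
    """Convert one frame of tracked landmark positions into
    blendshape weights in [0, 1] that a character rig could
    consume each frame."""
    weights = {}
    for name, (x, y) in tracked.items():
        rx, ry = rest[name]
        # Upward displacement from the rest pose drives the
        # matching blendshape: mouth corners lifting raises a
        # "smile"-style weight, for instance.
        delta = ry - y  # screen y grows downward, so up is positive
        weights[name] = max(0.0, min(1.0, delta * gain))
    return weights

# One frame in which the mouth corners have lifted slightly (a smile);
# the brow is unchanged, so its weight stays at zero.
frame = {
    "mouth_corner_l": (0.40, 0.55),
    "mouth_corner_r": (0.60, 0.55),
    "brow_l": (0.40, 0.30),
}
weights = retarget(frame)
```

In a live system this loop would run once per camera frame, with the tracker supplying `tracked` and the renderer consuming `weights`.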
Mike Rogers, director of research at the company, delighted delegates at the Edinburgh Interactive Festival with a demonstration of the tech.
Rogers spoke into his laptop's camera, its view displayed on a large projection screen. Then, after activating the new animation program, Rogers continued to talk, but the projector was no longer showing his face. Instead, it displayed a large cartoon panda.
As Rogers smiled, so did the panda. A raised eyebrow or a slight grimace was replicated in real time.
The cartoon panda said: “We’ve got very good at even the most nuanced expressions with game characters.”
It added: “But that’s not as far as we can go, in terms of building a more direct interaction between the game and its customers.”
The technology, internally dubbed ‘Live Driver’, is not yet a commercial product. The panda said Image Metrics would be making an announcement soon.
Suddenly, at the flick of a switch, the panda was no more. Rogers began addressing the crowd as a thick-necked mercenary who wouldn’t exactly stand out in a Gears of War game.
“Migrating facial animation technology to the consumer still has many challenges,” the grunt said.
“But today’s generation of peripherals have given us the biggest opportunity for natural interaction that we’ve ever had. Wii Remote and Move, for example, show that motion interaction has already become pretty established.”
The grunt said Image Metrics’ new technology “opens a whole new world of opportunities for application and game makers.”
The crowd of developers laughed and cheered. The grunt smiled back.