Interactive art installation combines digital puppetry, motion capture and live action
GUILDFORD, UK, 9 JULY 2015 – IKinema's LiveAction for Unreal Engine 4 is playing a surprise role at The Metropolitan Museum of Art in New York this month. The Met is staging an interactive digital performance installation to celebrate the restoration and return of the renowned Italian Renaissance sculpture Adam (ca. 1490–95) by Tullio Lombardo. Designed and directed by New Media Artist Reid Farrington, and commissioned by the live arts series Met Museum Presents, The Return blends digital animation with live performance and motion capture to tell the story of the sculpture's creation, travels and return to the gallery.
Farrington said: “My vision was to bring Adam to life in a believable and genuinely interactive way. By using a motion capture rig and IKinema LiveAction for Unreal Engine 4 to drive the animation in real time, I’ve been able to deliver the level of realism I wanted.”
Animation Design Consultant Athomas Goldberg of Lifelike & Believable designed and built the digital puppetry system, which enables visitors to interact in real time with ‘digital Adam’. Guests can speak directly to the digital character and pose questions, as well as visit the mocap theatre within the Museum for a behind-the-scenes experience.
The Return has been more than two years in development, and from the outset the team agreed to the fundamental principle of no pre-recorded material – everything is generated live to ensure each visitor's experience is unique and engaging. The result is two hours of material spanning 14 scenes with two characters – 'digital Adam' and a museum 'docent' who leads visitors through the performance. As the performance runs all day, during Museum hours, there are three pairs of performers who have been trained to drive the puppetry system when not performing, enabling them to control the pre-set lighting, audio and effects. The 16-camera OptiTrack system is hooked up to NaturalPoint's Motive software, which streams the mocap data to IKinema LiveAction for solving and retargeting into Unreal Engine 4.
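The retargeting step in that pipeline can be illustrated with a heavily simplified sketch. This is not the IKinema LiveAction API – the function name and the planar two-dimensional joint chain are assumptions made for demonstration only. The idea it shows is the core of retargeting: per-frame joint rotations captured from an actor are reapplied to a character rig whose bone lengths differ, so the pose transfers while the proportions stay true to the target character. Production solvers work in 3-D with full-body IK constraints; a 2-D chain keeps the example short.

```python
import math

def retarget_frame(source_rotations, target_bone_lengths):
    """Apply the actor's joint rotations (radians, each relative to
    its parent bone) along a chain that uses the target character's
    bone lengths, returning each joint's world-space (x, y) position.

    Illustrative only -- not the IKinema LiveAction API.
    """
    angle = 0.0
    x = y = 0.0
    positions = []
    for rotation, length in zip(source_rotations, target_bone_lengths):
        angle += rotation             # accumulate rotation down the chain
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        positions.append((x, y))
    return positions

# A straight chain (no joint bent) lays the target's bones end to end
# along the x-axis, regardless of the actor's own bone lengths.
print(retarget_frame([0.0, 0.0], [2.0, 3.0]))  # [(2.0, 0.0), (5.0, 0.0)]
```

In a live setting this function would run once per streamed mocap frame, which is why the ability to adapt the rig to each performer's proportions (as Goldberg describes below) matters: the rotations come from the actor, but the bone lengths always come from the character.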
Goldberg said: "We're using IKinema LiveAction to drive both the characters and the props. There are other full-body IK solutions out there, but nothing that gave me the flexibility and modularity to create a runtime rig exactly to my specifications, with the ability to easily adapt to each actor's unique proportions in a wide variety of rapidly changing environments and situations."
IKinema Chief Executive Alexandre Pechev said: “With the diversity of applications for motion capture these days we believe there will be many more new ways of using live quality solving in life-like rendering environments. This is one great example and I’m sure IKinema and Epic Games will continue to play a role in this highly creative field. Bringing Adam to life has been extraordinary and ground-breaking work and we’re delighted that LiveAction and Unreal Engine 4 have been able to deliver the level of realism required.”
With IKinema LiveAction, studios can achieve post-production quality solving and retargeting during live motion capture sessions. The rigs are exported from Maya or MotionBuilder and the set-ups are interchangeable between these environments for live and post-production work. Directors can see the final result in live scenarios directly on the production rigs. LiveAction is available for Unreal Engine 4 and Windows. For more information, contact support@IKinema.com.
The Return is part of the 2015-2016 season of Met Museum Presents and runs from July 11 to August 2, 2015.