At SIGGRAPH 2017 this week, VFX reporter Mike Seymour has been interviewing researchers, industry veterans and technology companies inside a virtual reality environment built with Epic's Unreal Engine, which renders a digital representation of Seymour at 90 frames per second.
To create the real-time VR animation, Seymour wears a bulky Technoprops stereo camera rig that tracks every facial movement he makes; the captured footage is converted into a digital performance using technology created by Cubic Motion.
Specifics about the new virtual reality demonstration include:
• MEETMIKE has about 440,000 triangles being rendered in real time, which means a VR stereo view must be rendered roughly every 9 milliseconds; about 75% of those triangles are used for the hair (see the back-of-envelope sketch after this list).
• Mike’s face rig uses about 80 joints, mostly for the movement of the hair and facial hair.
• For the face mesh itself, only about 10 joints are used, covering the jaw, eyes and tongue, in order to add more arc motion.
• These joints work in combination with around 750 blendshapes in the final version of the head mesh.
• The system uses complex traditional software design and three deep learning AI engines.
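For context, those numbers imply a very tight rendering budget. The short sketch below is a rough back-of-envelope calculation, not code from the project, using only the figures quoted in this article: at 90 frames per second a stereo frame has to be finished in roughly 11 ms, so the ~9 ms figure quoted above leaves only a small margin of headroom, and around 330,000 of the 440,000 triangles go to hair alone.

```python
# Back-of-envelope numbers for the MEETMIKE rendering budget,
# using only the figures quoted in this article (illustrative only).

FPS = 90                     # Unreal Engine VR target frame rate
TRIANGLES = 440_000          # approximate triangles in the MEETMIKE scene
HAIR_SHARE = 0.75            # roughly 75% of those triangles are hair

frame_budget_ms = 1000 / FPS                  # ~11.1 ms per stereo frame at 90 fps
hair_triangles = int(TRIANGLES * HAIR_SHARE)  # ~330,000 triangles just for hair

print(f"Frame budget at {FPS} fps: {frame_budget_ms:.1f} ms")
print(f"Triangles for hair: {hair_triangles:,} of {TRIANGLES:,}")
```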
Meet Mike uses the latest techniques in advanced motion capture to drive complex facial rigs, enabling detailed interaction in VR. This allows participants to meet and talk in VR and experience new levels of photorealistic interaction. The installation uses new advances in real-time rigs, skin shaders, facial capture, deep learning and real-time rendering in UE4.
For more details, read the Meet Mike paper via the link below.