New System Creates Real-time Performance Capture of Challenging Scenes

Thursday, August 11, 2016




Microsoft is developing new real-time 3D scanning capabilities that could one day let you attend a concert or sporting event live in full 3D, or even communicate in real time with remotely captured people using immersive augmented reality or virtual reality displays.


Researchers at Microsoft have created a system that could be the prototype for a next-generation Kinect camera. Called Fusion4D, the scanning system impressively reconstructs complex 3D scenes digitally, including scenes with more than one person or with animals, and it can even capture clothing being put on by the actor.

The researchers have detailed their work in a paper published online.

Fusion4D is the first real-time multi-view non-rigid reconstruction system for live performance capture, claim the researchers. "We have contributed a new pipeline for live multi-view performance capture, generating high-quality reconstructions in real-time, with several unique capabilities over prior work," they conclude.



Today, most cameras and 3D scanners used for motion capture, such as the Kinect sensor, still focus on static, non-moving scenes. This is due to limitations in computational power and the demands placed on the software that reconstructs the scene.

For more complex scenes, with moving cameras and many elements, the computer must solve for orders of magnitude more parameters in real time. This typically results in noisy or missing data, choppy motion, and digital artifacts in the output that are not representative of what is being captured in the real world.
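To give a rough sense of why "orders of magnitude more parameters" matters, the back-of-the-envelope Python sketch below compares the unknowns in the two settings: a rigid scan only needs a single 6-degree-of-freedom camera pose per frame, while a non-rigid scene parameterized with an embedded-deformation-style graph needs a small transform for every graph node. The node counts and per-node parameterization here are illustrative assumptions, not figures taken from the Fusion4D paper.

```python
# Back-of-the-envelope comparison of unknowns per frame:
# a rigid scan only needs a camera pose, while a non-rigid scene
# needs a deformation parameter set for the surface itself.
# Node counts below are illustrative assumptions, not paper figures.

RIGID_POSE_DOF = 6                # rotation (3) + translation (3)

def nonrigid_dof(num_graph_nodes: int, dof_per_node: int = 6) -> int:
    """Unknowns for an embedded-deformation-style graph:
    each node carries its own local rotation + translation."""
    return num_graph_nodes * dof_per_node

if __name__ == "__main__":
    for nodes in (500, 2000, 8000):          # plausible graph sizes
        dof = nonrigid_dof(nodes)
        print(f"{nodes:5d} nodes -> {dof:6d} unknowns "
              f"({dof // RIGID_POSE_DOF}x a rigid pose)")
```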



"Our reconstruction algorithm enables both incremental reconstruction, improving the surface estimation over time, as well as parameterizing the nonrigid scene motion."
Microsoft's research team also dealt with the case of changing scene topology, such as a person removing a jacket or scarf.

The implications of the research are vast. For instance, it could lead to new real-time experiences such as watching a remote concert or sporting event live in full 3D, or even communicating in real time with remotely captured people using immersive augmented reality or virtual reality displays.

The applications could also extend to robotics and machine vision.

With Microsoft's HoloLens system now reaching wider deployment, this last case could lead to some very interesting possibilities.


"As shown, our reconstruction algorithm enables both incremental reconstruction, improving the surface estimation over time, as well as parameterizing the nonrigid scene motion," write the authors."We also demonstrated how our approach robustly handles both large frame-to-frame motion and topology changes. This was achieved using a novel real-time solver, correspondence algorithm, and fusion method."

"We believe our work can enable new types of live performance capture experiences, such as broadcasting live events including sports and concerts in 3D, and also the ability to capture humans live and have them re-rendered in other geographic locations to enable high fidelity immersive telepresence."




SOURCE  Microsoft Research


By 33rd Square

