
Body Talk: A New Crowdshaping Technology Uses Words to Create Accurate 3D Body Models

A breakthrough in our shared understanding, perception, and description of human body shape brings new alternatives to 3D body scanning

ANAHEIM, CALIFORNIA -- JULY 26, 2016 -- Researchers from the Max Planck Institute for Intelligent Systems and the University of Texas at Dallas revealed new crowdshaping technology at SIGGRAPH 2016 that creates accurate 3D body models from 2D photos using crowdsourced linguistic descriptions of body shape. The Body Talk system takes a single photo and produces a 3D body shape that looks like the person and is accurate enough to size clothing. It does this with the help of 15 volunteers who rate the body shape in the photo using 30 words or fewer. The researchers believe this technology has applications in online shopping, gaming, virtual reality, and healthcare.
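The core idea can be pictured as a simple regression from crowd ratings to the parameters of a statistical body model. The sketch below is illustrative only, not the authors' implementation: it assumes the 15 volunteers' word ratings are averaged into a single consensus vector, which a pre-learned linear mapping (here filled with placeholder weights and data) projects onto a handful of 3D shape coefficients.

```python
import numpy as np

# Hypothetical setup: each of 15 raters scores the photo on a fixed
# vocabulary of ~30 shape words (e.g. "tall", "broad shoulders") on a
# 1-5 ordinal scale. Averaging over raters gives one consensus vector.
ratings = np.random.randint(1, 6, size=(15, 30))   # 15 raters x 30 words (placeholder data)
mean_ratings = ratings.mean(axis=0)                # crowd consensus, shape (30,)

# Assumed mapping: a linear model W, learned offline from bodies with
# known 3D shape, projects the averaged word ratings onto the first
# few coefficients of a statistical body model.
W = np.random.randn(10, 31) * 0.01                 # placeholder weights, incl. bias column
features = np.append(mean_ratings, 1.0)            # append bias term
shape_coeffs = W @ features                        # estimated 3D shape coefficients

print("Estimated shape coefficients:", np.round(shape_coeffs, 3))
```

In practice the weights would be fit on a training set pairing word ratings with scanned 3D bodies; the placeholder random weights here only show the data flow from crowd ratings to shape parameters.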


People

Michael Black, Managing Director
Stephan Streuber, Affiliated Researcher
Maria Alejandra Quiros-Ramirez, Alumni
Silvia Zuffi, Guest Scientist