
May 30, 2024

Data-driven model generates natural human motions for virtual avatars

Posted in category: robotics/AI

Humans innately perform a wide range of movements, which allows them to tackle the varied tasks of day-to-day life. Automatically reproducing these motions in virtual avatars and 3D animated human-like characters could be highly advantageous for many applications, ranging from metaverse spaces and digital entertainment to AI interfaces and robotics.

Researchers at the Max Planck Institute for Intelligent Systems and ETH Zurich recently developed WANDR, a new model that generates natural human motions for avatars. The model, to be presented in a paper at the Conference on Computer Vision and Pattern Recognition (CVPR 2024) in June, unifies different data sources under a single framework to produce more realistic motions in 3D humanoid characters. The paper is also posted to the arXiv preprint server.
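To make the idea of goal-driven motion generation concrete, the sketch below shows a tiny goal-conditioned model that rolls out a motion toward a 3D target. This is purely illustrative: the architecture, feature sizes, and rollout loop are assumptions for explanation, not the authors' WANDR implementation.

```python
# Illustrative sketch only (not WANDR): a minimal goal-conditioned motion model in PyTorch.
import torch
import torch.nn as nn

class GoalConditionedMotionModel(nn.Module):
    """Autoregressively predicts the next pose from the current pose and a 3D goal."""
    def __init__(self, pose_dim: int = 63, goal_dim: int = 3, hidden_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(pose_dim + goal_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, pose_dim),
        )

    def forward(self, pose: torch.Tensor, goal: torch.Tensor) -> torch.Tensor:
        # Predict a pose update conditioned on where the avatar should reach.
        delta = self.net(torch.cat([pose, goal], dim=-1))
        return pose + delta

# Roll out a short motion toward a hypothetical goal from a placeholder rest pose.
model = GoalConditionedMotionModel()
pose = torch.zeros(1, 63)                # e.g., flattened joint rotations
goal = torch.tensor([[0.5, 1.2, 0.3]])   # hypothetical 3D target location
frames = [pose]
for _ in range(30):
    pose = model(pose, goal)
    frames.append(pose)
motion = torch.stack(frames, dim=1)      # shape: (batch, time, pose_dim)
print(motion.shape)
```

In practice, such a model would be trained on motion-capture data so that the generated sequences both look natural and reach the specified goal; the blog post describes WANDR as combining different data sources to achieve this, but the details above are only a hedged approximation of that general approach.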

“At a high level, our research aims at figuring out what it takes to create virtual humans able to behave like us,” Markos Diomataris, first author of the paper, told Tech Xplore. “This essentially means learning to reason about the world, how to move in it, setting goals and trying to achieve them.”
