MMG: Manipulation-Aware Holistic Human Motion Generation from Sparse Tracking Signals

Xuehuai Shi, Renzhi Xiao, Yilun Sheng, Lili Wang, Jian Wu, Xiaobai Chen, Jieming Yin, Qingshan Liu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Generating realistic avatar motion from the sparse tracking signals provided by VR devices is essential for an immersive user experience. Human-object manipulation behaviors not only affect hand motion but also significantly impact body motion. However, existing motion generation methods for human-object interactions overlook the coordinated coupling between body and hand motions during manipulation. Due to the diversity and complexity of holistic motion (body and hand motions together) in the latent motion space, generating physically plausible and temporally consistent holistic motion in real time, under the joint constraints imposed by sparse tracking signals and manipulation content, is a major challenge in human motion generation. We propose the manipulation-aware holistic human motion generation method (MMG) to address this issue. In MMG, first, we construct a manipulation-aware holistic human motion generation framework that serially compresses the latent motion space distributions of the body and hands to generate realistic holistic human motion with object manipulation. Second, to strengthen the influence of object manipulation on holistic motion generation, MMG introduces a novel object manipulation representation that extracts effective manipulation features. Third, MMG is trained with an elaborate progressive manipulation-guided training algorithm that improves motion generation robustness and inference performance. Compared to state-of-the-art methods, MMG achieves up to a 39% improvement in generated holistic motion quality with a 3.55× speedup in generation performance. In manipulation-enabled scenes, MMG generates holistic motion in real time (≥ 24 fps). Compared to state-of-the-art methods, its perceived quality is significantly improved, and task performance on VR manipulation tasks requiring holistic motion is significantly higher. This paper's code is at https://github.com/XRZ-BUAA/MMG.

Original language: English
Title of host publication: Proceedings - 2025 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2025
Editors: Ulrich Eck, Gun Lee, Alexander Plopski, Missie Smith, Qi Sun, Markus Tatzgern
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 197-207
Number of pages: 11
ISBN (Electronic): 9798331587611
DOIs
State: Published - 2025
Event: 24th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2025 - Daejeon, Korea, Republic of
Duration: 8 Oct 2025 - 12 Oct 2025

Publication series

Name: Proceedings - 2025 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2025

Conference

Conference: 24th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2025
Country/Territory: Korea, Republic of
City: Daejeon
Period: 8/10/25 - 12/10/25

Keywords

  • Human Motion Generation
  • Manipulation Awareness
  • Real-time Holistic Motion
  • Virtual Reality
