TY - GEN
T1 - MMG
T2 - 24th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2025
AU - Shi, Xuehuai
AU - Xiao, Renzhi
AU - Sheng, Yilun
AU - Wang, Lili
AU - Wu, Jian
AU - Chen, Xiaobai
AU - Yin, Jieming
AU - Liu, Qingshan
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Generating realistic avatar motion via sparse tracking signals from VR devices is essential for an immersive user experience. Human-object manipulation behaviors not only affect hand motion but also significantly impact body motion. However, existing motion generation methods for human-object interactions overlook the coordinated coupling between body and hand motions during manipulation. Due to the diversity and complexity of holistic motion (body and hand motions simultaneously) in the latent motion space, generating physically plausible and temporally consistent holistic motion in real time, under the joint constraints imposed by sparse tracking signals and manipulation content, is a major challenge in human motion generation. We propose the manipulation-aware holistic human motion generation method (MMG) to address this issue. In MMG, first, we construct a manipulation-aware holistic human motion generation framework that serially compresses the latent motion space distributions of the body and hands to generate realistic holistic human motion with object manipulation enabled. Second, to enhance the impact of object manipulation on holistic motion generation, MMG introduces a novel object manipulation representation to extract effective manipulation features. Third, MMG is trained with an elaborate progressive manipulation-guided training algorithm to improve motion generation robustness and inference performance. Compared to state-of-the-art methods, MMG achieves up to a 39% improvement in generated holistic motion quality with a 3.55× speedup in generation performance. In manipulation-enabled scenes, MMG generates holistic motion in real time (≥ 24 fps). Compared to state-of-the-art methods, its perceived quality is significantly improved, and the task performance of VR manipulation tasks requiring holistic motion is also significantly improved. This paper's code is available at https://github.com/XRZ-BUAA/MMG.
KW - Human Motion Generation
KW - Manipulation Awareness
KW - Real-time Holistic Motion
KW - Virtual Reality
UR - https://www.scopus.com/pages/publications/105025057839
U2 - 10.1109/ISMAR67309.2025.00032
DO - 10.1109/ISMAR67309.2025.00032
M3 - Conference contribution
AN - SCOPUS:105025057839
T3 - Proceedings - 2025 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2025
SP - 197
EP - 207
BT - Proceedings - 2025 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2025
A2 - Eck, Ulrich
A2 - Lee, Gun
A2 - Plopski, Alexander
A2 - Smith, Missie
A2 - Sun, Qi
A2 - Tatzgern, Markus
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 8 October 2025 through 12 October 2025
ER -