

Motion2Motion: Cross-topology Motion Transfer with Sparse Correspondence

August 18, 2025
作者: Ling-Hao Chen, Yuhong Zhang, Zixin Yin, Zhiyang Dou, Xin Chen, Jingbo Wang, Taku Komura, Lei Zhang
cs.AI

Abstract

This work studies the challenge of transferring animations between characters whose skeletal topologies differ substantially. While decades of research have advanced motion retargeting, transferring motion across diverse topologies remains underexplored. The primary obstacle lies in the inherent topological inconsistency between source and target skeletons, which prevents establishing straightforward one-to-one bone correspondences. Moreover, the current lack of large-scale paired motion datasets spanning different topological structures severely constrains the development of data-driven approaches. To address these limitations, we introduce Motion2Motion, a novel, training-free framework. Simple yet effective, Motion2Motion works with only one or a few example motions on the target skeleton, by accessing a sparse set of bone correspondences between the source and target skeletons. Through comprehensive qualitative and quantitative evaluations, we demonstrate that Motion2Motion achieves efficient and reliable performance in both similar-skeleton and cross-species skeleton transfer scenarios. The practical utility of our approach is further evidenced by its successful integration into downstream applications and user interfaces, highlighting its potential for industrial use. Code and data are available at https://lhchen.top/Motion2Motion.
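To make the idea of a sparse bone correspondence concrete, here is a minimal, hypothetical sketch (not the authors' algorithm): given per-frame joint rotations on a source skeleton, an example motion on the target skeleton, and a sparse mapping between a few source and target joints, corresponded joints take their rotations from the source while all other target joints keep the example motion. The function name, array layout, and quaternion representation are illustrative assumptions.

```python
import numpy as np

def transfer_motion(src_motion, example_motion, correspondence):
    """Hypothetical sparse-correspondence transfer sketch.

    src_motion:      (T, J_src, 4) per-frame joint quaternions on the source skeleton
    example_motion:  (T, J_tgt, 4) per-frame joint quaternions on the target skeleton
    correspondence:  dict mapping source joint index -> target joint index
                     (sparse: only a few joints need to be matched)
    """
    out = example_motion.copy()
    for src_j, tgt_j in correspondence.items():
        # Corresponded target joints follow the source motion;
        # all remaining joints keep the example motion unchanged.
        out[:, tgt_j] = src_motion[:, src_j]
    return out
```

In a real system the copied rotations would need retargeting into the target skeleton's local frames and blending with the example motion; this sketch only shows how a sparse mapping sidesteps the need for full one-to-one bone correspondences.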