

ArtHOI: Taming Foundation Models for Monocular 4D Reconstruction of Hand-Articulated-Object Interactions

March 26, 2026
Authors: Zikai Wang, Zhilu Zhang, Yiqing Wang, Hui Li, Wangmeng Zuo
cs.AI

Abstract

Existing hand-object interaction (HOI) methods are largely limited to rigid objects, while 4D reconstruction methods for articulated objects generally require pre-scanning the object or even multi-view videos. Reconstructing 4D human-articulated-object interactions from a single monocular RGB video remains an unexplored but significant challenge. Fortunately, recent advances in foundation models present a new opportunity to address this highly ill-posed problem. To this end, we introduce ArtHOI, an optimization-based framework that integrates and refines priors from multiple foundation models. Our key contribution is a suite of novel methods designed to resolve the inherent inaccuracies and physical implausibility of these priors. In particular, we introduce an Adaptive Sampling Refinement (ASR) method that optimizes the object's metric scale and pose to ground its normalized mesh in world space. Furthermore, we propose a Multimodal Large Language Model (MLLM)-guided hand-object alignment method that uses contact-reasoning information as constraints in hand-object mesh composition optimization. To facilitate comprehensive evaluation, we also contribute two new datasets, ArtHOI-RGBD and ArtHOI-Wild. Extensive experiments validate the robustness and effectiveness of ArtHOI across diverse objects and interactions. Project: https://arthoi-reconstruction.github.io.