DROID-SLAM in the Wild
March 19, 2026
Authors: Moyang Li, Zihan Zhu, Marc Pollefeys, Daniel Barath
cs.AI
Abstract
We present a robust, real-time RGB SLAM system that handles dynamic environments by leveraging differentiable uncertainty-aware bundle adjustment. Traditional SLAM methods typically assume static scenes, leading to tracking failures in the presence of motion. Recent dynamic SLAM approaches attempt to address this challenge using predefined dynamic priors or uncertainty-aware mapping, but they remain limited when confronted with unknown dynamic objects or highly cluttered scenes where geometric mapping becomes unreliable. In contrast, our method estimates per-pixel uncertainty by exploiting multi-view visual feature inconsistency, enabling robust tracking and reconstruction even in real-world dynamic environments. The proposed system achieves state-of-the-art camera pose estimation and scene geometry reconstruction in cluttered dynamic scenarios while running in real time at around 10 FPS. Code and datasets are available at https://github.com/MoyangLi00/DROID-W.git.
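The core idea in the abstract, scoring each pixel by how inconsistent its visual features are across views and downweighting high-uncertainty pixels in bundle adjustment, can be sketched as follows. This is a minimal illustration under assumed shapes and a simple heteroscedastic weighting scheme, not the paper's actual implementation; `pixel_uncertainty` and `weighted_residual_cost` are hypothetical helper names.

```python
import numpy as np

def pixel_uncertainty(features):
    """features: (n_views, n_pixels, feat_dim) array of deep features sampled
    at corresponding pixels across views. Pixels on static geometry look
    similar from all views (low variance); pixels on moving objects do not.
    Returns a per-pixel uncertainty: mean feature variance across views."""
    return features.var(axis=0).mean(axis=-1)  # shape (n_pixels,)

def weighted_residual_cost(residuals, uncertainty, eps=1e-6):
    """Uncertainty-weighted least-squares cost, w_i = 1 / (sigma_i^2 + eps),
    so high-uncertainty (likely dynamic) pixels contribute little to the
    bundle-adjustment objective."""
    w = 1.0 / (uncertainty + eps)
    return float(np.sum(w * residuals**2))

rng = np.random.default_rng(0)
# Synthetic data: 4 views, 100 "static" pixels with consistent features,
# 20 "dynamic" pixels whose features disagree across views.
static = rng.normal(0.0, 0.01, size=(4, 100, 8))
dynamic = rng.normal(0.0, 1.0, size=(4, 20, 8))
feats = np.concatenate([static, dynamic], axis=1)

sigma2 = pixel_uncertainty(feats)
# Dynamic pixels show far higher multi-view feature variance than static ones.
assert sigma2[100:].mean() > 10 * sigma2[:100].mean()
```

The weighting is the standard inverse-variance choice: a residual with the same magnitude counts less in the optimization when its pixel's multi-view features disagree, which is how dynamic content is suppressed without any predefined dynamic-object prior.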