BEVCALIB: LiDAR-Camera Calibration via Geometry-Guided Bird's-Eye View Representations
June 3, 2025
作者: Weiduo Yuan, Jerry Li, Justin Yue, Divyank Shah, Konstantinos Karydis, Hang Qiu
cs.AI
Abstract
Accurate LiDAR-camera calibration is fundamental to fusing multi-modal perception in autonomous driving and robotic systems. Traditional calibration methods require extensive data collection in controlled environments and cannot compensate for transformation changes during vehicle/robot movement. In this paper, we propose the first model that uses bird's-eye view (BEV) features to perform LiDAR-camera calibration from raw data, termed BEVCALIB. To achieve this, we extract camera BEV features and LiDAR BEV features separately and fuse them into a shared BEV feature space. To fully utilize the geometric information in the BEV features, we introduce a novel feature selector that filters the most important features in the transformation decoder, which reduces memory consumption and enables efficient training. Extensive evaluations on KITTI, NuScenes, and our own dataset demonstrate that BEVCALIB establishes a new state of the art. Under various noise conditions, BEVCALIB outperforms the best baseline in the literature by an average of (47.08%, 82.32%) on the KITTI dataset and (78.17%, 68.29%) on the NuScenes dataset, in terms of (translation, rotation) error, respectively. In the open-source domain, it improves on the best reproducible baseline by an order of magnitude. Our code and demo results are available at https://cisl.ucr.edu/BEVCalib.
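
The abstract describes the pipeline only at a high level: camera and LiDAR features are lifted into a shared BEV space, a feature selector keeps the most informative BEV cells to cut decoder memory, and a transformation decoder regresses the extrinsic correction. The PyTorch sketch below illustrates one plausible reading of that pipeline; all module designs, tensor shapes, the top-k selection rule, and the 6-DoF output parameterization are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a BEVCALIB-style pipeline (fusion -> selection -> decoding).
# Shapes, layer choices, and the top-k rule are assumptions for illustration only.
import torch
import torch.nn as nn


class BEVFusion(nn.Module):
    """Fuse camera BEV and LiDAR BEV features into a shared BEV space."""

    def __init__(self, cam_channels: int, lidar_channels: int, fused_channels: int):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(cam_channels + lidar_channels, fused_channels, 3, padding=1),
            nn.BatchNorm2d(fused_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, cam_bev: torch.Tensor, lidar_bev: torch.Tensor) -> torch.Tensor:
        # Both inputs are (B, C, H, W) grids defined on the same BEV plane.
        return self.fuse(torch.cat([cam_bev, lidar_bev], dim=1))


class FeatureSelector(nn.Module):
    """Keep only the top-k BEV cells by a learned importance score,
    so the decoder attends to far fewer tokens (lower memory)."""

    def __init__(self, channels: int, k: int):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)
        self.k = k

    def forward(self, bev: torch.Tensor) -> torch.Tensor:
        b, c, h, w = bev.shape
        scores = self.score(bev).flatten(2)          # (B, 1, H*W)
        tokens = bev.flatten(2).transpose(1, 2)      # (B, H*W, C)
        idx = scores.squeeze(1).topk(self.k, dim=1).indices  # (B, k)
        return torch.gather(tokens, 1, idx.unsqueeze(-1).expand(-1, -1, c))


class TransformationDecoder(nn.Module):
    """Regress a 6-DoF correction (translation + rotation) from selected tokens."""

    def __init__(self, channels: int):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=channels, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(channels, 6)  # (tx, ty, tz, roll, pitch, yaw)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(tokens).mean(dim=1))


if __name__ == "__main__":
    cam_bev = torch.randn(2, 64, 128, 128)     # camera BEV features (assumed shape)
    lidar_bev = torch.randn(2, 128, 128, 128)  # LiDAR BEV features (assumed shape)
    fused = BEVFusion(64, 128, 256)(cam_bev, lidar_bev)
    tokens = FeatureSelector(256, k=512)(fused)
    delta = TransformationDecoder(256)(tokens)
    print(delta.shape)  # torch.Size([2, 6])
```

Under these assumptions, selecting only k BEV cells before the decoder bounds the attention cost by k rather than by the full H×W grid, which is consistent with the abstract's claim that the feature selector reduces memory consumption and enables efficient training.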