

NAVI: Category-Agnostic Image Collections with High-Quality 3D Shape and Pose Annotations

June 15, 2023
作者: Varun Jampani, Kevis-Kokitsi Maninis, Andreas Engelhardt, Arjun Karpur, Karen Truong, Kyle Sargent, Stefan Popov, André Araujo, Ricardo Martin-Brualla, Kaushal Patel, Daniel Vlasic, Vittorio Ferrari, Ameesh Makadia, Ce Liu, Yuanzhen Li, Howard Zhou
cs.AI

Abstract

Recent advances in neural reconstruction enable high-quality 3D object reconstruction from casually captured image collections. Current techniques mostly analyze their progress on relatively simple image collections where Structure-from-Motion (SfM) techniques can provide ground-truth (GT) camera poses. We note that SfM techniques tend to fail on in-the-wild image collections such as image search results with varying backgrounds and illuminations. To enable systematic research progress on 3D reconstruction from casual image captures, we propose NAVI: a new dataset of category-agnostic image collections of objects with high-quality 3D scans along with per-image 2D-3D alignments providing near-perfect GT camera parameters. These 2D-3D alignments allow us to extract accurate derivative annotations such as dense pixel correspondences, depth and segmentation maps. We demonstrate the use of NAVI image collections on different problem settings and show that NAVI enables more thorough evaluations that were not possible with existing datasets. We believe NAVI is beneficial for systematic research progress on 3D reconstruction and correspondence estimation. Project page: https://navidataset.github.io
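The abstract notes that near-perfect GT camera parameters plus 3D scans let the authors derive annotations such as depth and segmentation maps. A minimal sketch of the underlying idea, projecting a scanned object's 3D points through a known pinhole camera into a per-pixel z-buffer, is shown below. The function name, the point-based rendering, and the world-to-camera convention (`x_cam = R x + t`) are illustrative assumptions, not NAVI's actual annotation pipeline:

```python
import numpy as np

def render_depth_and_mask(points, K, R, t, height, width):
    """Project 3D points (N, 3) into an image using known camera parameters
    (K intrinsics, R/t world-to-camera extrinsics) and build a sparse depth
    map plus a foreground mask via a simple z-buffer."""
    cam = points @ R.T + t           # world -> camera coordinates
    z = cam[:, 2]
    valid = z > 1e-6                 # keep only points in front of the camera
    uvw = cam[valid] @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]    # perspective divide -> pixel coordinates
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    zv = z[valid]
    inside = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, zv = u[inside], v[inside], zv[inside]

    depth = np.full((height, width), np.inf)
    for ui, vi, zi in zip(u, v, zv):
        if zi < depth[vi, ui]:       # z-buffer: keep the nearest point per pixel
            depth[vi, ui] = zi
    mask = np.isfinite(depth)        # pixels hit by the object = segmentation
    depth[~mask] = 0.0
    return depth, mask
```

A real pipeline would rasterize mesh triangles rather than isolated points, but the principle is the same: with accurate 2D-3D alignment, depth and segmentation fall out of projection geometry for free.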