

FMGS: Foundation Model Embedded 3D Gaussian Splatting for Holistic 3D Scene Understanding

January 3, 2024
作者: Xingxing Zuo, Pouya Samangouei, Yunwen Zhou, Yan Di, Mingyang Li
cs.AI

Abstract

Precisely perceiving the geometric and semantic properties of real-world 3D objects is crucial for the continued evolution of augmented reality and robotic applications. To this end, we present FMGS, which incorporates vision-language embeddings of foundation models into 3D Gaussian Splatting (GS). The key contribution of this work is an efficient method to reconstruct and represent 3D vision-language models. This is achieved by distilling feature maps generated by image-based foundation models into those rendered from our 3D model. To ensure high-quality rendering and fast training, we introduce a novel scene representation that integrates the strengths of both GS and multi-resolution hash encodings (MHE). Our effective training procedure also introduces a pixel alignment loss that pulls the rendered features of the same semantic entity close together, following pixel-level semantic boundaries. Our results demonstrate remarkable multi-view semantic consistency, facilitating diverse downstream tasks: we beat state-of-the-art methods by 10.2 percent on open-vocabulary language-based object detection, while being 851 times faster at inference. This research explores the intersection of vision, language, and 3D scene representation, paving the way for enhanced scene understanding in uncontrolled real-world environments. We plan to release the code upon paper acceptance.
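The abstract describes two training signals: a distillation loss that matches features rendered from the 3D model to the foundation model's feature maps, and a pixel alignment loss that keeps features of the same semantic entity close. A minimal sketch of how such losses might look is given below, using plain numpy arrays; the function names, the L1 formulation, and the per-segment-centroid simplification of the alignment term are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def distillation_loss(rendered, target):
    """Mean L1 distance between a feature map rendered from the 3D model
    and the foundation-model feature map (both shaped H x W x D).
    The L1 choice is an assumption; the paper may use another metric."""
    return np.abs(rendered - target).mean()

def pixel_alignment_loss(rendered, seg_labels):
    """Hypothetical simplification of a pixel alignment loss: pull each
    pixel's feature toward the mean feature of its semantic segment,
    so features within one entity stay close across views.
    rendered: H x W x D feature map; seg_labels: H x W integer labels."""
    labels = np.unique(seg_labels)
    loss = 0.0
    for lbl in labels:
        feats = rendered[seg_labels == lbl]          # (N_pixels, D)
        centroid = feats.mean(axis=0, keepdims=True)  # per-segment mean
        loss += np.abs(feats - centroid).mean()
    return loss / len(labels)
```

Under this sketch, a feature map that is constant within each segment incurs zero alignment loss, while features straddling a semantic boundary are penalized, which is the behavior the abstract attributes to the pixel alignment term.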
December 15, 2024