
PLANING: A Loosely Coupled Triangle-Gaussian Framework for Streaming 3D Reconstruction

January 29, 2026
Authors: Changjian Jiang, Kerui Ren, Xudong Li, Kaiwen Song, Linning Xu, Tao Lu, Junting Dong, Yu Zhang, Bo Dai, Mulin Yu
cs.AI

Abstract

Streaming reconstruction from monocular image sequences remains challenging, as existing methods typically favor either high-quality rendering or accurate geometry, but rarely both. We present PLANING, an efficient on-the-fly reconstruction framework built on a hybrid representation that loosely couples explicit geometric primitives with neural Gaussians, enabling geometry and appearance to be modeled in a decoupled manner. This decoupling supports an online initialization and optimization strategy that separates geometry and appearance updates, yielding stable streaming reconstruction with substantially reduced structural redundancy. PLANING improves dense mesh Chamfer-L2 by 18.52% over PGSR, surpasses ARTDECO by 1.31 dB PSNR, and reconstructs ScanNetV2 scenes in under 100 seconds, over 5x faster than 2D Gaussian Splatting, while matching the quality of offline per-scene optimization. Beyond reconstruction quality, the structural clarity and computational efficiency of PLANING make it well suited for a broad range of downstream applications, such as enabling large-scale scene modeling and simulation-ready environments for embodied AI. Project page: https://city-super.github.io/PLANING/.
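The abstract describes a per-frame pipeline in which explicit geometric primitives and neural Gaussians are initialized and optimized in separate geometry and appearance passes. The snippet below is a minimal sketch of that decoupled streaming loop under stated assumptions only; the class names, frame format, and the toy in-place "refine" updates are illustrative placeholders, not PLANING's actual representation, losses, or API.

```python
# Minimal sketch (not the authors' code) of a decoupled streaming update loop:
# per incoming frame, geometry is initialized/refined first, then appearance is
# initialized from the geometry and refined separately. All names are assumptions.

import numpy as np


class TrianglePrimitives:
    """Explicit geometry: a growing set of (placeholder) triangle vertices."""

    def __init__(self):
        self.vertices = np.zeros((0, 3))

    def initialize_from_frame(self, depth_points):
        # Append newly observed surface points as placeholder vertices.
        self.vertices = np.vstack([self.vertices, depth_points])

    def refine(self, depth_points, step=0.1):
        # Toy geometry-only update: pull the most recent vertices toward the
        # current observation (stands in for a real geometric optimization).
        n = min(len(depth_points), len(self.vertices))
        if n:
            self.vertices[-n:] += step * (depth_points[:n] - self.vertices[-n:])


class GaussianAppearance:
    """Appearance attached to the geometry, optimized in a separate pass."""

    def __init__(self):
        self.colors = np.zeros((0, 3))

    def initialize_from_geometry(self, num_new):
        # One placeholder color per newly added geometric primitive.
        self.colors = np.vstack([self.colors, 0.5 * np.ones((num_new, 3))])

    def refine(self, observed_colors, step=0.2):
        # Toy appearance-only update toward the observed frame colors.
        n = min(len(observed_colors), len(self.colors))
        if n:
            self.colors[-n:] += step * (observed_colors[:n] - self.colors[-n:])


def stream_reconstruct(frames):
    geometry = TrianglePrimitives()
    appearance = GaussianAppearance()
    for depth_points, colors in frames:       # one monocular frame at a time
        geometry.initialize_from_frame(depth_points)             # geometry pass
        geometry.refine(depth_points)
        appearance.initialize_from_geometry(len(depth_points))   # appearance pass
        appearance.refine(colors)
    return geometry, appearance


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_frames = [(rng.normal(size=(64, 3)), rng.random((64, 3))) for _ in range(5)]
    geom, app = stream_reconstruct(fake_frames)
    print(geom.vertices.shape, app.colors.shape)
```

In a real system the geometry pass would fuse observations into triangle primitives and the appearance pass would optimize Gaussian parameters against a photometric loss; here both are reduced to trivial updates purely to show how the two passes are kept separate.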