PLANING: A Loosely Coupled Triangle-Gaussian Framework for Streaming 3D Reconstruction
January 29, 2026
Authors: Changjian Jiang, Kerui Ren, Xudong Li, Kaiwen Song, Linning Xu, Tao Lu, Junting Dong, Yu Zhang, Bo Dai, Mulin Yu
cs.AI
Abstract
Streaming reconstruction from monocular image sequences remains challenging, as existing methods typically favor either high-quality rendering or accurate geometry, but rarely both. We present PLANING, an efficient on-the-fly reconstruction framework built on a hybrid representation that loosely couples explicit geometric primitives with neural Gaussians, enabling geometry and appearance to be modeled in a decoupled manner. This decoupling supports an online initialization and optimization strategy that separates geometry and appearance updates, yielding stable streaming reconstruction with substantially reduced structural redundancy. PLANING improves dense mesh Chamfer-L2 by 18.52% over PGSR, surpasses ARTDECO by 1.31 dB PSNR, and reconstructs ScanNetV2 scenes in under 100 seconds, over 5x faster than 2D Gaussian Splatting, while matching the quality of offline per-scene optimization. Beyond reconstruction quality, the structural clarity and computational efficiency of PLANING make it well suited for a broad range of downstream applications, such as enabling large-scale scene modeling and simulation-ready environments for embodied AI. Project page: https://city-super.github.io/PLANING/.
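For reference, the Chamfer-L2 error cited above conventionally denotes the two-sided mean of squared nearest-neighbor distances between two sampled point sets $P$ and $Q$; the paper's exact sampling density and evaluation protocol are not specified in this abstract:

$$\mathrm{CD}_{L_2}(P, Q) = \frac{1}{|P|}\sum_{p \in P}\min_{q \in Q}\lVert p - q\rVert_2^2 \;+\; \frac{1}{|Q|}\sum_{q \in Q}\min_{p \in P}\lVert q - p\rVert_2^2$$

To make the decoupled design concrete, below is a minimal, purely illustrative sketch of a streaming loop that alternates structure-only and appearance-only updates. Every name here (`GeometryPrimitives`, `NeuralGaussians`, `stream_reconstruct`, the frame fields, and the update rules) is a hypothetical stand-in; the abstract does not specify PLANING's actual initialization or optimization procedures.

```python
# Illustrative sketch of a loosely coupled, streaming triangle-Gaussian loop.
# All classes and update rules are hypothetical stand-ins that mirror the
# abstract's description (geometry and appearance updated separately); this
# is NOT PLANING's actual implementation.

from dataclasses import dataclass, field

@dataclass
class GeometryPrimitives:
    """Explicit geometric primitives (e.g., triangles) anchoring the scene."""
    triangles: list = field(default_factory=list)

    def integrate(self, frame):
        # Hypothetical: add a coarse surface estimate from the new frame,
        # skipping regions already covered to limit structural redundancy.
        if frame["depth_hint"] not in self.triangles:
            self.triangles.append(frame["depth_hint"])

@dataclass
class NeuralGaussians:
    """Appearance model: Gaussians attached to, but not fused with, geometry."""
    gaussians: dict = field(default_factory=dict)

    def refine(self, frame, geometry):
        # Hypothetical: update appearance against the frame's image while
        # treating the geometry as fixed for this step (loose coupling).
        for tri in geometry.triangles:
            self.gaussians[tri] = frame["color_hint"]

def stream_reconstruct(frames):
    """Decoupled online loop: geometry update first, then appearance update."""
    geometry = GeometryPrimitives()
    appearance = NeuralGaussians()
    for frame in frames:
        geometry.integrate(frame)           # structure-only update
        appearance.refine(frame, geometry)  # appearance-only update
    return geometry, appearance

if __name__ == "__main__":
    # Fake stream: pairs of frames observing the same surface region.
    fake_frames = [{"depth_hint": i // 2, "color_hint": f"rgb{i}"} for i in range(6)]
    geo, app = stream_reconstruct(fake_frames)
    print(len(geo.triangles), "primitives;", len(app.gaussians), "Gaussian groups")
```

The point of the sketch is the control flow, not the update rules: because appearance refinement never mutates the geometric primitives, each frame's structural update stays cheap and stable while the Gaussians absorb view-dependent appearance.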