MeshSplatting: Differentiable Rendering with Opaque Meshes

December 7, 2025
Authors: Jan Held, Sanghyun Son, Renaud Vandeghen, Daniel Rebain, Matheus Gadelha, Yi Zhou, Anthony Cioppa, Ming C. Lin, Marc Van Droogenbroeck, Andrea Tagliasacchi
cs.AI

Abstract

Primitive-based splatting methods like 3D Gaussian Splatting have revolutionized novel view synthesis with real-time rendering. However, their point-based representations remain incompatible with the mesh-based pipelines that power AR/VR and game engines. We present MeshSplatting, a mesh-based reconstruction approach that jointly optimizes geometry and appearance through differentiable rendering. By enforcing connectivity via restricted Delaunay triangulation and refining surface consistency, MeshSplatting creates end-to-end smooth, visually high-quality meshes that render efficiently in real-time 3D engines. On Mip-NeRF360, it boosts PSNR by 0.69 dB over MiLo, the current state of the art in mesh-based novel view synthesis, while training 2x faster and using half the memory, bridging neural rendering and interactive 3D graphics for seamless real-time scene interaction. The project page is available at https://meshsplatting.github.io/.
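The abstract's key structural claim is that, unlike point-based splats, the output is a connected triangle mesh. As a hedged illustration of what "connectivity" means for a triangle mesh (not the paper's restricted Delaunay construction, and using made-up example faces), the sketch below represents a mesh as face index triples and checks that every edge is shared by at most two faces and that the faces form one connected patch:

```python
# Minimal sketch (illustrative only): a triangle mesh as a list of
# (i, j, k) vertex-index faces, with a simple connectivity check.
from collections import defaultdict


def _face_edges(face):
    """Yield the three undirected edges of a triangle, in sorted order."""
    a, b, c = face
    for u, v in ((a, b), (b, c), (c, a)):
        yield (min(u, v), max(u, v))


def is_connected_manifold_patch(faces):
    """True if each edge touches at most two faces (edge-manifold) and
    all faces are reachable from each other via shared edges."""
    edge_to_faces = defaultdict(list)
    for i, face in enumerate(faces):
        for edge in _face_edges(face):
            edge_to_faces[edge].append(i)
    if any(len(fs) > 2 for fs in edge_to_faces.values()):
        return False  # non-manifold edge
    # Build face adjacency through shared edges and flood-fill from face 0.
    adjacency = defaultdict(set)
    for fs in edge_to_faces.values():
        if len(fs) == 2:
            adjacency[fs[0]].add(fs[1])
            adjacency[fs[1]].add(fs[0])
    seen, stack = {0}, [0]
    while stack:
        for g in adjacency[stack.pop()]:
            if g not in seen:
                seen.add(g)
                stack.append(g)
    return len(seen) == len(faces)


# Two triangles sharing edge (1, 2) form one connected patch;
# two triangles with no shared edge do not.
print(is_connected_manifold_patch([(0, 1, 2), (1, 3, 2)]))  # True
print(is_connected_manifold_patch([(0, 1, 2), (3, 4, 5)]))  # False
```

A disconnected "soup" of independent splats would fail this check, which is the property the mesh-based representation is claimed to guarantee by construction.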