Mono4DGS-HDR: High Dynamic Range 4D Gaussian Splatting from Alternating-exposure Monocular Videos
October 21, 2025
Authors: Jinfeng Liu, Lingtong Kong, Mi Zhou, Jinwen Chen, Dan Xu
cs.AI
Abstract
We introduce Mono4DGS-HDR, the first system for reconstructing renderable 4D
high dynamic range (HDR) scenes from unposed monocular low dynamic range (LDR)
videos captured with alternating exposures. To tackle such a challenging
problem, we present a unified framework with a two-stage optimization approach
based on Gaussian Splatting. The first stage learns a video HDR Gaussian
representation in orthographic camera coordinate space, eliminating the need
for camera poses and enabling robust initial HDR video reconstruction. The
second stage transforms video Gaussians into world space and jointly refines
the world Gaussians with camera poses. Furthermore, we propose a temporal
luminance regularization strategy to enhance the temporal consistency of the
HDR appearance. Since our task has not been studied before, we construct a new
evaluation benchmark using publicly available datasets for HDR video
reconstruction. Extensive experiments demonstrate that Mono4DGS-HDR
significantly outperforms alternative solutions adapted from state-of-the-art
methods in both rendering quality and speed.
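The abstract describes the temporal luminance regularization only at a high level. A minimal sketch of one plausible form of such a regularizer — penalizing log-luminance changes between adjacent rendered HDR frames — might look like the following. The function name, the Rec. 709 luminance weighting, and the L1 penalty are assumptions for illustration, not the paper's actual formulation:

```python
import numpy as np

def temporal_luminance_loss(hdr_frames: np.ndarray) -> float:
    """Hypothetical sketch of a temporal luminance regularizer.

    hdr_frames: array of shape (T, H, W, 3), linear HDR RGB frames.
    Returns a scalar penalty that is zero when per-pixel luminance
    is constant over time.
    """
    # Linear RGB -> luminance using Rec. 709 weights (an assumption here).
    lum = hdr_frames @ np.array([0.2126, 0.7152, 0.0722])
    # Log-compress, as HDR losses commonly operate in the log domain.
    log_lum = np.log1p(lum)
    # L1 difference between consecutive frames, averaged over pixels.
    return float(np.abs(np.diff(log_lum, axis=0)).mean())
```

A constant video incurs zero penalty, while flicker between frames — e.g. from alternating exposures leaking into the recovered HDR appearance — is penalized in proportion to the per-pixel log-luminance change.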