
SCas4D: Structural Cascaded Optimization for Boosting Persistent 4D Novel View Synthesis

October 8, 2025
Authors: Jipeng Lyu, Jiahua Dong, Yu-Xiong Wang
cs.AI

Abstract

Persistent dynamic scene modeling for tracking and novel-view synthesis remains challenging due to the difficulty of capturing accurate deformations while maintaining computational efficiency. We propose SCas4D, a cascaded optimization framework that leverages structural patterns in 3D Gaussian Splatting for dynamic scenes. The key idea is that real-world deformations often exhibit hierarchical patterns, where groups of Gaussians share similar transformations. By progressively refining deformations from coarse part-level to fine point-level, SCas4D achieves convergence within 100 iterations per time frame and produces results comparable to existing methods with only one-twentieth of the training iterations. The approach also demonstrates effectiveness in self-supervised articulated object segmentation, novel view synthesis, and dense point tracking tasks.
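The abstract describes a coarse-to-fine cascade: groups of Gaussians first share a part-level transformation, which is then refined per point. The sketch below is an illustrative toy example of that idea only, not the authors' implementation; the part assignments, the translation-only part transforms, and the helper names `fit_part_level` and `refine_point_level` are all assumptions made for clarity.

```python
# Toy coarse-to-fine deformation fit: part-level shifts, then per-point residuals.
# Illustrative sketch only -- not SCas4D's actual optimization.
import numpy as np

rng = np.random.default_rng(0)

# Toy scene: 200 Gaussian centers grouped into 4 "parts".
num_points, num_parts = 200, 4
points = rng.normal(size=(num_points, 3))
part_ids = rng.integers(0, num_parts, size=num_points)

# Synthetic "next frame": each part translates, plus small per-point noise.
true_part_shift = rng.normal(scale=0.5, size=(num_parts, 3))
targets = points + true_part_shift[part_ids] + rng.normal(scale=0.01, size=(num_points, 3))


def fit_part_level(points, targets, part_ids, num_parts):
    """Coarse stage: one shared translation per part (mean residual is the least-squares fit)."""
    shifts = np.zeros((num_parts, 3))
    for p in range(num_parts):
        mask = part_ids == p
        shifts[p] = (targets[mask] - points[mask]).mean(axis=0)
    return shifts


def refine_point_level(coarse_pred, targets, num_iters=100, lr=0.4):
    """Fine stage: per-point residual offsets, refined by simple gradient steps on squared error."""
    residual = np.zeros_like(coarse_pred)
    for _ in range(num_iters):
        grad = 2.0 * (coarse_pred + residual - targets)  # gradient w.r.t. the residual
        residual -= lr * grad
    return residual


part_shift = fit_part_level(points, targets, part_ids, num_parts)
coarse_pred = points + part_shift[part_ids]          # part-level prediction
final_pred = coarse_pred + refine_point_level(coarse_pred, targets)  # point-level refinement

print("coarse error:", np.abs(coarse_pred - targets).mean())
print("final  error:", np.abs(final_pred - targets).mean())
```

In this toy setup the part-level fit already removes most of the error, and the point-level pass only cleans up small residuals, which mirrors the hierarchical intuition stated in the abstract.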