

SceNeRFlow: Time-Consistent Reconstruction of General Dynamic Scenes

August 16, 2023
作者: Edith Tretschk, Vladislav Golyanik, Michael Zollhoefer, Aljaz Bozic, Christoph Lassner, Christian Theobalt
cs.AI

Abstract
Existing methods for the 4D reconstruction of general, non-rigidly deforming objects focus on novel-view synthesis and neglect correspondences. However, time consistency enables advanced downstream tasks like 3D editing, motion analysis, or virtual-asset creation. We propose SceNeRFlow to reconstruct a general, non-rigid scene in a time-consistent manner. Our dynamic-NeRF method takes multi-view RGB videos and background images from static cameras with known camera parameters as input. It then reconstructs the deformations of an estimated canonical model of the geometry and appearance in an online fashion. Since this canonical model is time-invariant, we obtain correspondences even for long-term, long-range motions. We employ neural scene representations to parametrize the components of our method. Like prior dynamic-NeRF methods, we use a backwards deformation model. We find non-trivial adaptations of this model necessary to handle larger motions: We decompose the deformations into a strongly regularized coarse component and a weakly regularized fine component, where the coarse component also extends the deformation field into the space surrounding the object, which enables tracking over time. We show experimentally that, unlike prior work that only handles small motion, our method enables the reconstruction of studio-scale motions.
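The core idea of the backward deformation model described above can be sketched in a few lines: a point observed at time t is mapped back into the time-invariant canonical space, with the total offset split into a strongly regularized coarse component and a weakly regularized fine component. The following is a minimal illustrative sketch, not the paper's implementation; both deformation fields here are toy stand-ins for the learned neural fields.

```python
import numpy as np

def backward_deform(x_t, coarse_field, fine_field):
    """Map a point observed at time t back to the canonical space.

    The offset is decomposed into a coarse component (strongly
    regularized, also defined in the space around the object) and a
    fine component (weakly regularized, capturing detail). Both fields
    are hypothetical stand-ins for the paper's learned networks."""
    offset = coarse_field(x_t) + fine_field(x_t)
    return x_t + offset

# Toy stand-in fields (illustrative only):
coarse = lambda x: np.array([0.5, 0.0, 0.0])  # large, smooth shift
fine = lambda x: 0.01 * np.sin(x)             # small detail correction

x_canonical = backward_deform(np.array([1.0, 2.0, 3.0]), coarse, fine)
```

Because the canonical model is fixed over time, evaluating this mapping for the same canonical point across all frames yields long-range correspondences.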
