

Freditor: High-Fidelity and Transferable NeRF Editing by Frequency Decomposition

April 3, 2024
Authors: Yisheng He, Weihao Yuan, Siyu Zhu, Zilong Dong, Liefeng Bo, Qixing Huang
cs.AI

Abstract

This paper enables high-fidelity, transferable NeRF editing by frequency decomposition. Recent NeRF editing pipelines lift 2D stylization results to 3D scenes but suffer from blurry results and fail to capture detailed structures, owing to inconsistencies across the 2D edits. Our critical insight is that the low-frequency components of images are more multiview-consistent after editing than their high-frequency parts. Moreover, the appearance style is exhibited mainly in the low-frequency components, whereas content details reside especially in the high-frequency parts. This motivates us to perform editing on the low-frequency components, which results in high-fidelity edited scenes. In addition, the editing is performed in the low-frequency feature space, enabling stable intensity control and transfer to novel scenes. Comprehensive experiments conducted on photorealistic datasets demonstrate the superior performance of high-fidelity and transferable NeRF editing. The project page is at https://aigc3d.github.io/freditor.
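
To make the frequency-decomposition idea concrete, below is a minimal sketch in image space rather than the paper's low-frequency feature space. It splits an image into a Gaussian-blurred low-frequency component and a high-frequency residual, applies a placeholder 2D edit only to the low-frequency part, and linearly blends the edited and original low-frequency components to emulate intensity control. The `stylize_low_freq` callable, the Gaussian sigma, and the blending scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def decompose(image, sigma=4.0):
    """Split an image into low- and high-frequency components.

    The low-frequency part is a Gaussian-blurred copy; the high-frequency
    part is the residual, so low + high reconstructs the input exactly.
    """
    low = gaussian_filter(image, sigma=(sigma, sigma, 0))
    high = image - low
    return low, high


def edit_low_frequency(image, stylize_low_freq, intensity=1.0, sigma=4.0):
    """Apply a 2D style edit only to the low-frequency component.

    `stylize_low_freq` is a hypothetical stand-in for any 2D editing model.
    `intensity` linearly interpolates between the original and edited
    low-frequency components, giving a simple form of edit-strength control,
    while the high-frequency details are kept untouched.
    """
    low, high = decompose(image, sigma)
    edited_low = stylize_low_freq(low)
    blended_low = (1.0 - intensity) * low + intensity * edited_low
    return np.clip(blended_low + high, 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((128, 128, 3)).astype(np.float32)  # placeholder image in [0, 1]

    # Toy "edit": warm the color balance of the low-frequency component.
    warm = lambda low: np.clip(low * np.array([1.1, 1.0, 0.9]), 0.0, 1.0)

    out = edit_low_frequency(img, warm, intensity=0.5)
    print(out.shape, out.dtype)
```

Because the high-frequency residual is reattached unchanged, fine structures survive even if the 2D edit is not perfectly multiview-consistent, which is the intuition the abstract describes.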

