

Preserving Source Video Realism: High-Fidelity Face Swapping for Cinematic Quality

December 8, 2025
Authors: Zekai Luo, Zongze Du, Zhouhang Zhu, Hao Zhong, Muzhi Zhu, Wen Wang, Yuling Xi, Chenchen Jing, Hao Chen, Chunhua Shen
cs.AI

Abstract

Video face swapping is crucial in film and entertainment production, where achieving high fidelity and temporal consistency over long and complex video sequences remains a significant challenge. Inspired by recent advances in reference-guided image editing, we explore whether rich visual attributes from source videos can be similarly leveraged to enhance both fidelity and temporal coherence in video face swapping. Building on this insight, this work presents LivingSwap, the first video reference guided face swapping model. Our approach employs keyframes as conditioning signals to inject the target identity, enabling flexible and controllable editing. By combining keyframe conditioning with video reference guidance, the model performs temporal stitching to ensure stable identity preservation and high-fidelity reconstruction across long video sequences. To address the scarcity of data for reference-guided training, we construct a paired face-swapping dataset, Face2Face, and further reverse the data pairs to ensure reliable ground-truth supervision. Extensive experiments demonstrate that our method achieves state-of-the-art results, seamlessly integrating the target identity with the source video's expressions, lighting, and motion, while significantly reducing manual effort in production workflows. Project webpage: https://aim-uofa.github.io/LivingSwap
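The paper does not spell out how temporal stitching is implemented, but one plausible reading of the abstract is that long sequences are processed in overlapping chunks whose overlap regions are cross-faded so that identity and appearance stay stable across chunk boundaries. The sketch below illustrates that idea only; `temporal_stitch`, `chunk_len`, and `overlap` are hypothetical names, and `edit_chunk` is a placeholder for the keyframe-conditioned swapping model, which is not public here.

```python
import numpy as np

def temporal_stitch(frames, edit_chunk, chunk_len=16, overlap=4):
    """Illustrative chunked processing with linear cross-fades.

    frames:     array of shape (T, H, W, C)
    edit_chunk: callable applied to each chunk; stands in for the
                (unreleased) keyframe-conditioned swapping model
    """
    frames = np.asarray(frames, dtype=np.float64)
    T = frames.shape[0]
    out = np.zeros_like(frames)
    wsum = np.zeros(T)
    step = chunk_len - overlap
    for start in range(0, T, step):
        end = min(start + chunk_len, T)
        chunk = edit_chunk(frames[start:end])
        n = end - start
        w = np.ones(n)
        k = min(overlap, n)
        if start > 0 and k > 0:
            # fade in over the overlap with the previous chunk
            w[:k] = np.linspace(0.0, 1.0, k + 2)[1:-1]
        if end < T and k > 0:
            # fade out over the overlap with the next chunk
            w[-k:] = np.minimum(w[-k:], np.linspace(1.0, 0.0, k + 2)[1:-1])
        out[start:end] += chunk * w[:, None, None, None]
        wsum[start:end] += w
        if end == T:
            break
    # per-frame weights sum to a positive value, so this normalizes the blend
    return out / wsum[:, None, None, None]
```

With an identity `edit_chunk`, the fade-in and fade-out weights in each overlap sum to one per frame, so the stitched output reproduces the input exactly; with a real editing model, the same weighting smooths any per-chunk discrepancies across boundaries.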