

FMA-Net++: Motion- and Exposure-Aware Real-World Joint Video Super-Resolution and Deblurring

December 4, 2025
Authors: Geunhyuk Youk, Jihyong Oh, Munchurl Kim
cs.AI

Abstract

Real-world video restoration is plagued by complex degradations from motion coupled with dynamically varying exposure, a key challenge largely overlooked by prior works and a common artifact of auto-exposure or low-light capture. We present FMA-Net++, a framework for joint video super-resolution and deblurring that explicitly models this coupled effect of motion and dynamically varying exposure. FMA-Net++ adopts a sequence-level architecture built from Hierarchical Refinement with Bidirectional Propagation blocks, enabling parallel, long-range temporal modeling. Within each block, an Exposure Time-aware Modulation layer conditions features on per-frame exposure, which in turn drives an exposure-aware Flow-Guided Dynamic Filtering module to infer motion- and exposure-aware degradation kernels. FMA-Net++ decouples degradation learning from restoration: the former predicts exposure- and motion-aware priors to guide the latter, improving both accuracy and efficiency. To evaluate under realistic capture conditions, we introduce REDS-ME (multi-exposure) and REDS-RE (random-exposure) benchmarks. Trained solely on synthetic data, FMA-Net++ achieves state-of-the-art accuracy and temporal consistency on our new benchmarks and GoPro, outperforming recent methods in both restoration quality and inference speed, and generalizes well to challenging real-world videos.
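To make the exposure conditioning concrete, below is a minimal PyTorch sketch of what a per-frame exposure-conditioned modulation layer could look like, assuming FiLM-style scale-and-shift conditioning. The class name ExposureTimeModulation, the MLP design, and all tensor shapes are illustrative assumptions for this sketch, not the authors' released implementation.

```python
# Hypothetical sketch of an exposure time-aware modulation layer, assuming
# FiLM-style conditioning: each frame's scalar exposure time is embedded by a
# small MLP that predicts a channel-wise scale and shift for the features.
# Names, shapes, and the MLP design are illustrative, not the paper's code.
import torch
import torch.nn as nn

class ExposureTimeModulation(nn.Module):
    def __init__(self, channels: int, hidden: int = 64):
        super().__init__()
        # MLP mapping a scalar exposure time to 2*C modulation parameters.
        self.mlp = nn.Sequential(
            nn.Linear(1, hidden),
            nn.SiLU(),
            nn.Linear(hidden, 2 * channels),
        )

    def forward(self, feat: torch.Tensor, exposure: torch.Tensor) -> torch.Tensor:
        # feat: (B, T, C, H, W) per-frame features
        # exposure: (B, T) per-frame exposure times (e.g., in seconds)
        b, t, c, h, w = feat.shape
        params = self.mlp(exposure.reshape(b * t, 1))  # (B*T, 2C)
        scale, shift = params.chunk(2, dim=-1)         # (B*T, C) each
        scale = scale.reshape(b, t, c, 1, 1)
        shift = shift.reshape(b, t, c, 1, 1)
        # Residual FiLM-style modulation: identity when scale and shift are 0.
        return feat * (1.0 + scale) + shift

# Usage: condition a feature map on each frame's exposure time.
etm = ExposureTimeModulation(channels=64)
feats = torch.randn(2, 5, 64, 32, 32)   # 2 clips of 5 frames each
exp_times = torch.rand(2, 5) * 0.03     # random exposures up to ~1/33 s
out = etm(feats, exp_times)
print(out.shape)  # torch.Size([2, 5, 64, 32, 32])
```

Under this reading, the modulated features would then feed the flow-guided dynamic filtering stage, letting the predicted degradation kernels vary with each frame's exposure rather than assuming a fixed blur model.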