Omni-Weather: Unified Multimodal Foundation Model for Weather Generation and Understanding

December 25, 2025
Authors: Zhiwang Zhou, Yuandong Pu, Xuming He, Yidi Liu, Yixin Chen, Junchao Gong, Xiang Zhuang, Wanghan Xu, Qinglong Cao, Shixiang Tang, Yihao Liu, Wenlong Zhang, Lei Bai
cs.AI

Abstract

Weather modeling requires both accurate prediction and mechanistic interpretation, yet existing methods treat these goals in isolation, separating generation from understanding. To bridge this gap, we present Omni-Weather, the first multimodal foundation model that unifies weather generation and understanding within a single architecture. Omni-Weather integrates a radar encoder for weather generation tasks, and all modalities are then processed jointly through a shared self-attention mechanism. Moreover, we construct a Chain-of-Thought dataset for causal reasoning in weather generation, enabling interpretable outputs and improved perceptual quality. Extensive experiments show that Omni-Weather achieves state-of-the-art performance in both weather generation and understanding. Our findings further indicate that generative and understanding tasks in the weather domain reinforce each other, demonstrating the feasibility and value of unifying them.
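To make the described layout concrete, below is a minimal, hypothetical sketch (not the authors' code) of how a radar encoder could feed radar tokens into the same self-attention blocks that process text tokens, with separate heads for generation and understanding. All class names, dimensions, and heads (`RadarEncoder`, `UnifiedBackbone`, patch sizes, vocabulary size) are illustrative assumptions, not details from the paper.

```python
# Toy sketch of a unified radar+text backbone with shared self-attention.
# Hypothetical names and sizes; not the Omni-Weather implementation.
import torch
import torch.nn as nn

class RadarEncoder(nn.Module):
    """Patch-embed a single-channel radar frame into a token sequence."""
    def __init__(self, patch: int = 16, dim: int = 512):
        super().__init__()
        self.proj = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)

    def forward(self, radar: torch.Tensor) -> torch.Tensor:
        # radar: (B, 1, H, W) -> tokens: (B, N_patches, dim)
        x = self.proj(radar)
        return x.flatten(2).transpose(1, 2)

class UnifiedBackbone(nn.Module):
    """Shared self-attention over concatenated radar and text tokens."""
    def __init__(self, dim: int = 512, depth: int = 4, heads: int = 8, vocab: int = 32000):
        super().__init__()
        self.radar_encoder = RadarEncoder(dim=dim)
        self.text_embed = nn.Embedding(vocab, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=depth)
        self.text_head = nn.Linear(dim, vocab)      # understanding / CoT text output
        self.radar_head = nn.Linear(dim, 16 * 16)   # generation: one 16x16 patch per radar token

    def forward(self, radar: torch.Tensor, text_ids: torch.Tensor):
        r = self.radar_encoder(radar)               # (B, Nr, dim)
        t = self.text_embed(text_ids)               # (B, Nt, dim)
        h = self.blocks(torch.cat([r, t], dim=1))   # both modalities share the same attention
        n_r = r.size(1)
        return self.radar_head(h[:, :n_r]), self.text_head(h[:, n_r:])

# Dummy usage: a batch of radar frames plus a tokenized text prompt.
model = UnifiedBackbone()
radar = torch.randn(2, 1, 64, 64)
text = torch.randint(0, 32000, (2, 12))
gen_patches, text_logits = model(radar, text)
print(gen_patches.shape, text_logits.shape)  # (2, 16, 256) and (2, 12, 32000)
```

The point of the sketch is only the coupling claimed in the abstract: radar and text tokens are mixed in one sequence, so the same attention layers serve both the generation head and the text (understanding / Chain-of-Thought) head.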