An Optimal Transport-driven Approach for Cultivating Latent Space in Online Incremental Learning
April 16, 2026
Authors: Quyen Tran, Hai Nguyen, Hoang Phan, Quan Dao, Linh Ngo, Khoat Than, Dinh Phung, Dimitris Metaxas, Trung Le
cs.AI
Abstract
In online incremental learning, data continuously arrives with substantial distributional shifts, creating a significant challenge because previous samples have limited replay value when learning a new task. Prior research has typically relied on either a single adaptive centroid or multiple fixed centroids to represent each class in the latent space. However, such methods struggle when class data streams are inherently multimodal and require continual centroid updates. To overcome this, we introduce an online Mixture Model learning framework grounded in Optimal Transport theory (MMOT), where centroids evolve incrementally with new data. This approach offers two main advantages: (i) it provides a more precise characterization of complex data streams, and (ii) it enables improved class similarity estimation for unseen samples during inference through MMOT-derived centroids. Furthermore, to strengthen representation learning and mitigate catastrophic forgetting, we design a Dynamic Preservation strategy that regulates the latent space and maintains class separability over time. Experimental evaluations on benchmark datasets confirm the superior effectiveness of our proposed method.
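To make the high-level idea more concrete, below is a minimal, illustrative sketch of one way the abstract's description could be realized: each class is summarized by several latent-space centroids, an entropic optimal-transport (Sinkhorn) plan couples an incoming feature batch to those centroids, the centroids are moved toward the plan's barycentric targets, and inference assigns an unseen sample to the class of its nearest centroid. All names and parameters here (OnlineOTCentroids, n_centroids, lr, epsilon) are hypothetical illustrations under assumed design choices, not the paper's actual interface or the full MMOT/Dynamic Preservation method.

```python
# Hypothetical sketch: per-class centroids updated from a data stream via
# entropic optimal transport; not the authors' implementation.
import numpy as np

def sinkhorn_plan(cost, epsilon=0.05, n_iters=200):
    """Entropic-regularized OT plan between uniform marginals."""
    cost = cost / (cost.max() + 1e-12)        # normalize for numerical stability
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / epsilon)               # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                  # Sinkhorn-Knopp iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]        # transport plan, shape (n, m)

class OnlineOTCentroids:
    """Per-class centroids updated incrementally from streaming features."""
    def __init__(self, n_centroids=5, lr=0.1, epsilon=0.05):
        self.n_centroids, self.lr, self.epsilon = n_centroids, lr, epsilon
        self.centroids = {}                   # class id -> (n_centroids, feat_dim)

    def update(self, feats, label):
        """Move the centroids of `label` toward the OT barycentric targets."""
        if label not in self.centroids:       # lazily initialize from first batch
            idx = np.random.choice(len(feats), self.n_centroids, replace=True)
            self.centroids[label] = feats[idx].copy()
        C = self.centroids[label]
        cost = ((feats[:, None, :] - C[None, :, :]) ** 2).sum(-1)   # squared L2
        plan = sinkhorn_plan(cost, self.epsilon)
        mass = plan.sum(0, keepdims=True).T                         # mass per centroid
        targets = (plan.T @ feats) / np.maximum(mass, 1e-12)        # barycentric map
        self.centroids[label] = (1 - self.lr) * C + self.lr * targets

    def predict(self, feats):
        """Assign each feature to the class of its nearest centroid."""
        labels, all_C = zip(*self.centroids.items())
        stacked = np.concatenate(all_C)                             # (K*n_centroids, d)
        owner = np.repeat(labels, self.n_centroids)
        d = ((feats[:, None, :] - stacked[None, :, :]) ** 2).sum(-1)
        return owner[d.argmin(1)]

# Toy usage: two Gaussian "classes" streamed in small batches.
rng = np.random.default_rng(0)
model = OnlineOTCentroids()
for _ in range(20):
    model.update(rng.normal(0.0, 1.0, (32, 2)), label=0)
    model.update(rng.normal(4.0, 1.0, (32, 2)), label=1)
print(model.predict(np.array([[0.1, -0.2], [4.2, 3.9]])))          # expected: [0 1]
```

The exponential-moving-average update and the uniform marginals are arbitrary choices made for the sketch; the paper's actual centroid evolution, its handling of multimodal class streams, and the Dynamic Preservation regularizer are specified in the full text rather than the abstract.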