An Optimal Transport-driven Approach for Cultivating Latent Space in Online Incremental Learning
April 16, 2026
Authors: Quyen Tran, Hai Nguyen, Hoang Phan, Quan Dao, Linh Ngo, Khoat Than, Dinh Phung, Dimitris Metaxas, Trung Le
cs.AI
Abstract
In online incremental learning, data continuously arrives with substantial distributional shifts, creating a significant challenge because previous samples have limited replay value when learning a new task. Prior research has typically relied on either a single adaptive centroid or multiple fixed centroids to represent each class in the latent space. However, such methods struggle to continually update centroids when class data streams are inherently multimodal. To overcome this, we introduce an online Mixture Model learning framework grounded in Optimal Transport theory (MMOT), where centroids evolve incrementally with new data. This approach offers two main advantages: (i) it provides a more precise characterization of complex data streams, and (ii) the MMOT-derived centroids enable more accurate class similarity estimation for unseen samples during inference. Furthermore, to strengthen representation learning and mitigate catastrophic forgetting, we design a Dynamic Preservation strategy that regulates the latent space and maintains class separability over time. Experimental evaluations on benchmark datasets confirm the effectiveness of the proposed method.
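As a minimal illustrative sketch (not the paper's MMOT algorithm), the core idea of updating class centroids from entropic optimal-transport soft assignments over each incoming batch can be written in a few lines. All names (`OnlineOTCentroids`, `sinkhorn_plan`) and hyperparameters here are hypothetical choices for illustration, assuming balanced marginals over points and centroids:

```python
import numpy as np

def sinkhorn_plan(cost, a, b, eps=0.1, n_iter=200):
    """Entropic-regularized OT plan between point weights a and centroid weights b."""
    cost = cost / (cost.max() + 1e-12)      # normalize cost to avoid exp underflow
    K = np.exp(-cost / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):                 # alternating Sinkhorn scalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]      # plan of shape (n_points, n_centroids)

class OnlineOTCentroids:
    """Toy online mixture: centroids nudged toward OT-weighted batch targets."""
    def __init__(self, n_centroids, dim, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.mu = rng.normal(size=(n_centroids, dim))  # randomly initialized centroids
        self.lr = lr                                    # incremental update rate

    def update(self, X):
        n, k = X.shape[0], self.mu.shape[0]
        # Squared Euclidean cost between batch points and current centroids
        cost = ((X[:, None, :] - self.mu[None, :, :]) ** 2).sum(-1)
        P = sinkhorn_plan(cost, np.full(n, 1.0 / n), np.full(k, 1.0 / k))
        w = P.sum(axis=0)                   # transported mass per centroid (~1/k)
        targets = (P.T @ X) / w[:, None]    # OT-weighted means of the batch
        self.mu += self.lr * (targets - self.mu)  # incremental centroid move
        return P
```

The balanced column marginal (mass 1/k per centroid) prevents all centroids from collapsing onto a single mode, which is one way a mixture can track a multimodal class stream as batches arrive.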