
Online Continual Learning Without the Storage Constraint

May 16, 2023
作者: Ameya Prabhu, Zhipeng Cai, Puneet Dokania, Philip Torr, Vladlen Koltun, Ozan Sener
cs.AI

Abstract

Online continual learning (OCL) research has primarily focused on mitigating catastrophic forgetting with a fixed, limited storage allocation throughout the agent's lifetime. However, the growing affordability of data storage highlights a broad range of applications that do not adhere to these assumptions. In these cases, the primary concern lies in managing computational expenditure rather than storage. In this paper, we target such settings, investigating the online continual learning problem while relaxing storage constraints and emphasizing a fixed, limited economic budget. We provide a simple algorithm that can compactly store and utilize the entirety of the incoming data stream under tiny computational budgets, using a kNN classifier and universal pre-trained feature extractors. Our algorithm provides a consistency property attractive for continual learning: it never forgets past seen data. We set a new state of the art on two large-scale OCL datasets: Continual LOCalization (CLOC), which has 39M images over 712 classes, and Continual Google Landmarks V2 (CGLM), which has 580K images over 10,788 classes -- beating methods under far higher computational budgets than ours in terms of both reducing catastrophic forgetting of past data and quickly adapting to rapidly changing data streams. We provide code to reproduce our results at https://github.com/drimpossible/ACM.
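The core idea described in the abstract -- embed each incoming sample with a frozen pre-trained extractor, predict with a kNN over everything stored so far, then append the sample to the store -- can be sketched as below. This is a minimal illustration, not the authors' implementation (the actual ACM code uses an efficient nearest-neighbour index and real feature extractors; the `ContinualKNN` class, plain Euclidean distance, and the toy 2-D "embeddings" here are assumptions for illustration):

```python
import numpy as np

class ContinualKNN:
    """Sketch of kNN-based online continual learning:
    predict on each arriving sample, then store it (predict-then-update).
    Nothing is ever discarded, so past data is never forgotten."""

    def __init__(self, k=1):
        self.k = k
        self.feats = []   # embeddings of the entire stream so far
        self.labels = []

    def predict(self, feat):
        if not self.feats:
            return None  # nothing seen yet
        X = np.stack(self.feats)
        dists = np.linalg.norm(X - feat, axis=1)   # brute-force distances
        nearest = np.argsort(dists)[: self.k]
        votes = [self.labels[i] for i in nearest]
        return max(set(votes), key=votes.count)    # majority vote

    def update(self, feat, label):
        # append-only store: the "never forgets" consistency property
        self.feats.append(feat)
        self.labels.append(label)

# Toy stream of 2-D "embeddings" (in practice: frozen pre-trained features)
model = ContinualKNN(k=1)
stream = [(np.array([0.0, 0.0]), 0),
          (np.array([5.0, 5.0]), 1),
          (np.array([0.1, -0.1]), 0)]
preds = []
for feat, label in stream:
    preds.append(model.predict(feat))  # evaluate before revealing the label
    model.update(feat, label)
```

Because the extractor is frozen, per-sample cost is one embedding pass plus a neighbour lookup; a real deployment would replace the brute-force distance scan with an approximate nearest-neighbour index to keep lookups cheap as the store grows.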