

MEG-to-MEG Transfer Learning and Cross-Task Speech/Silence Detection with Limited Data

February 20, 2026
Authors: Xabier de Zuazo, Vincenzo Verbeni, Eva Navas, Ibon Saratxaga, Mathieu Bourguignon, Nicola Molinaro
cs.AI

Abstract

Data-efficient neural decoding is a central challenge for speech brain-computer interfaces. We present the first demonstration of transfer learning and cross-task decoding for MEG-based speech models spanning perception and production. We pre-train a Conformer-based model on 50 hours of single-subject listening data and fine-tune on just 5 minutes per subject across 18 participants. Transfer learning yields consistent improvements, with in-task accuracy gains of 1-4% and larger cross-task gains of up to 5-6%. Not only does pre-training improve performance within each task, but it also enables reliable cross-task decoding between perception and production. Critically, models trained on speech production decode passive listening above chance, confirming that learned representations reflect shared neural processes rather than task-specific motor activity.
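The abstract's core recipe is pre-training on plentiful single-subject data, then warm-starting a brief per-subject fine-tune on a few minutes of recordings. Below is a minimal, hypothetical sketch of that recipe using plain logistic regression on synthetic features; the paper's actual model is a Conformer on MEG signals, so the data generator, feature dimensions, and training loop here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, shift, rng):
    # Synthetic stand-in for MEG features: two classes (speech vs. silence)
    # separated along one axis, plus a subject-specific shift to mimic
    # inter-subject variability. Not the paper's actual feature pipeline.
    y = rng.integers(0, 2, n)
    X = rng.normal(0.0, 1.0, (n, 8))
    X[:, 0] += (2 * y - 1) * 1.5 + shift
    return X, y

def train(X, y, w=None, lr=0.1, steps=200):
    # Plain logistic regression via gradient descent; passing a previous
    # weight vector `w` warm-starts training (the transfer step).
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(np.mean((X @ w > 0) == y))

# "Pre-train" on abundant source-subject data (paper: 50 h of listening).
Xs, ys = make_data(5000, shift=0.0, rng=rng)
w_pre = train(Xs, ys)

# Target subject: only a small fine-tuning set (paper: ~5 min per subject).
Xt, yt = make_data(40, shift=0.5, rng=rng)
Xe, ye = make_data(2000, shift=0.5, rng=rng)  # held-out evaluation split

w_scratch = train(Xt, yt)                         # no transfer, from scratch
w_ft = train(Xt, yt, w=w_pre.copy(), steps=50)    # warm start + brief fine-tune

print(accuracy(w_scratch, Xe, ye), accuracy(w_ft, Xe, ye))
```

The warm-started model needs far fewer fine-tuning steps because the pre-trained weights already encode the class-separating direction; only the subject-specific offset must be adapted, which mirrors the abstract's claim that transfer learning makes decoding data-efficient.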