Efficient Monotonic Multihead Attention
December 7, 2023
Authors: Xutai Ma, Anna Sun, Siqi Ouyang, Hirofumi Inaguma, Paden Tomasello
cs.AI
Abstract
We introduce the Efficient Monotonic Multihead Attention (EMMA), a
state-of-the-art simultaneous translation model with numerically-stable and
unbiased monotonic alignment estimation. In addition, we present improved
training and inference strategies, including simultaneous fine-tuning from an
offline translation model and reduction of monotonic alignment variance. The
experimental results demonstrate that the proposed model attains
state-of-the-art performance in simultaneous speech-to-text translation on
Spanish and English translation tasks.
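As a rough illustration of the kind of monotonic alignment estimation the abstract refers to, the sketch below computes the expected alignment of monotonic attention from stepwise write probabilities using a division-free recurrence, which avoids the numerical instability of cumulative-product/division formulations. This is an assumed, simplified reconstruction for intuition, not the paper's actual implementation; the function name and interface are hypothetical.

```python
import numpy as np

def monotonic_alignment(p):
    """Expected monotonic alignment from stepwise write probabilities.

    p: array of shape (tgt_len, src_len); p[i, j] is the probability of
    writing target token i after reading source position j.

    Sketch of the division-free closed form (an assumption about the
    general technique, not the paper's exact estimator):
        alpha[i, j] = p[i, j] * sum_k alpha[i-1, k]
                      * prod_{l=k}^{j-1} (1 - p[i, l])
    """
    tgt_len, src_len = p.shape
    alpha = np.zeros((tgt_len, src_len))
    # Base case: alignment for the (virtual) previous target token is a
    # point mass on the first source position.
    prev = np.zeros(src_len)
    prev[0] = 1.0
    for i in range(tgt_len):
        row = np.zeros(src_len)
        acc = 0.0
        for j in range(src_len):
            # acc accumulates sum_{k<=j} prev[k] * prod_{l=k}^{j-1}(1 - p[i, l])
            if j == 0:
                acc = prev[0]
            else:
                acc = acc * (1.0 - p[i, j - 1]) + prev[j]
            row[j] = p[i, j] * acc
        alpha[i] = row
        prev = row
    return alpha
```

Because the recurrence only multiplies and adds probabilities (no division by a cumulative product), each row stays a valid sub-probability distribution even when some write probabilities are near 0 or 1.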