

Speech Slytherin: Examining the Performance and Efficiency of Mamba for Speech Separation, Recognition, and Synthesis

July 13, 2024
作者: Xilin Jiang, Yinghao Aaron Li, Adrian Nicolas Florea, Cong Han, Nima Mesgarani
cs.AI

Abstract

It is too early to conclude that Mamba is a better alternative to transformers for speech before comparing Mamba with transformers in terms of both performance and efficiency on multiple speech-related tasks. To reach this conclusion, we propose and evaluate three models for three tasks: Mamba-TasNet for speech separation, ConMamba for speech recognition, and VALL-M for speech synthesis. We compare them with transformers of similar sizes in performance, memory, and speed. Our Mamba or Mamba-transformer hybrid models show comparable or higher performance than their transformer counterparts: Sepformer, Conformer, and VALL-E. They are more efficient than transformers in memory and speed for speech longer than a threshold duration, which is inversely related to the resolution of a speech token. Mamba for separation is the most efficient, and Mamba for recognition is the least. Further, we show that Mamba is not more efficient than transformers for speech shorter than the threshold duration, and performs worse in models that require joint modeling of text and speech, such as cross or masked attention over two inputs. Therefore, we argue that the superiority of Mamba or transformers depends on the particular problem and model. Code available at https://github.com/xi-j/Mamba-TasNet and https://github.com/xi-j/Mamba-ASR.
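The threshold behavior described above follows from the asymptotics: self-attention costs grow quadratically in sequence length, while a Mamba-style selective scan grows linearly but with a larger per-token constant, so the linear model only wins beyond some crossover length. The sketch below illustrates this with hypothetical constants (the function names, `d_model`, `d_state`, and the per-token overhead factor are illustrative assumptions, not measurements from the paper):

```python
# Back-of-envelope per-layer compute comparison: quadratic attention vs.
# linear-time SSM (Mamba-style). All constants are hypothetical and chosen
# only to show why a crossover threshold in sequence length exists.

def attention_cost(seq_len: int, d_model: int = 256) -> int:
    """Self-attention: the QK^T and AV products each cost ~L^2 * d."""
    return 2 * seq_len * seq_len * d_model

def mamba_cost(seq_len: int, d_model: int = 256, d_state: int = 16) -> int:
    """Selective scan: linear in L, with a larger per-token constant."""
    per_token = 20 * d_model * d_state  # hypothetical constant overhead
    return seq_len * per_token

def crossover_length(d_model: int = 256, d_state: int = 16) -> int:
    """Shortest sequence length at which the linear model becomes cheaper."""
    seq_len = 1
    while attention_cost(seq_len, d_model) <= mamba_cost(seq_len, d_model, d_state):
        seq_len += 1
    return seq_len
```

Dividing the crossover length by the token rate (tokens per second of audio) gives a threshold *duration*, which is why the paper's threshold is inversely related to the resolution of a speech token: separation operates on high-rate tokens and crosses over after a fraction of a second, while recognition uses coarser tokens and crosses over later.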

