Publications

Conference (International): Comparison of Low Complexity Self-Attention Mechanisms for Acoustic Event Detection

Tatsuya Komatsu, Robin Scheibler

Asia-Pacific Signal and Information Processing Association Annual Summit and Conference 2021 (APSIPA ASC 2021)

2021.12.14

We investigate and compare several low-complexity self-attention mechanisms applied to the problem of acoustic event detection. Self-attention has proved to be all that is needed to make leaps in several domains, but at a computational and memory cost quadratic in the length of the input sequence. Several recent works have addressed this problem by reducing the complexity: linear attention, top-k attention, clustered attention, the attention-free Transformer, and FNet. We replace the conventional self-attention block of an acoustic event detection model with these low-complexity variants and evaluate the performance on Task 4 of the DCASE 2021 Challenge. We find that, at the cost of a marginal drop in performance, inference is significantly sped up for sequences of 30 s and longer. We conclude that, for all practical purposes, one of these low-complexity attention mechanisms should be used instead of the conventional one.

Paper: Comparison of Low Complexity Self-Attention Mechanisms for Acoustic Event Detection (external site)
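
To illustrate the kind of replacement the paper studies, below is a minimal PyTorch sketch contrasting conventional quadratic self-attention with one of the surveyed alternatives, linear attention (Katharopoulos et al., 2020). This is not the paper's implementation; the function names, tensor shapes, and feature map are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def softmax_attention(q, k, v):
    # Conventional self-attention: the N x N score matrix makes it
    # O(N^2) in time and memory for sequence length N.
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    return F.softmax(scores, dim=-1) @ v

def linear_attention(q, k, v, eps=1e-6):
    # Linear attention: replace softmax with the kernel feature map
    # phi(x) = elu(x) + 1, so (K^T V) can be computed first.
    # This gives O(N d^2) time and memory instead of O(N^2 d).
    phi_q = F.elu(q) + 1.0
    phi_k = F.elu(k) + 1.0
    kv = phi_k.transpose(-2, -1) @ v                                  # (batch, d, d)
    norm = phi_q @ phi_k.sum(dim=1, keepdim=True).transpose(-2, -1)   # (batch, N, 1)
    return (phi_q @ kv) / (norm + eps)

# Usage sketch: long input sequences (e.g. frames covering 30 s of audio)
# are where the linear variant pays off at inference time.
q = k = v = torch.randn(2, 1500, 64)  # (batch, N, d) -- illustrative sizes
out_quadratic = softmax_attention(q, k, v)
out_linear = linear_attention(q, k, v)
print(out_quadratic.shape, out_linear.shape)  # both (2, 1500, 64)
```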