feng22c@interspeech_2022@ISCA

Total: 1

#1 ASR-Robust Natural Language Understanding on ASR-GLUE dataset

Authors: Lingyun Feng; Jianwei Yu; Yan Wang; Songxiang Liu; Deng Cai; Haitao Zheng

In recent years, the growing demand for voice interface applications has drawn increasing attention to language understanding in speech systems. These speech-based intelligent systems usually comprise an automatic speech recognition (ASR) component and a natural language understanding (NLU) component that takes the ASR output as input. Despite the rapid progress of speech recognition over the past few decades, recognition errors remain inevitable, especially in noisy environments, yet the robustness of NLU systems to errors introduced by ASR is under-examined. In this paper, we propose three empirical approaches to improve the robustness of NLU models. The first is ASR correction, which attempts to correct mistranscriptions before they reach the NLU component. The latter two methods simulate a noisy training scenario to train more robust NLU models. Extensive experimental results and analyses show that the proposed methods effectively improve the robustness of NLU models.
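To make the "simulating a noisy training scenario" idea concrete, below is a minimal, hypothetical sketch of ASR-style noise injection into clean NLU training sentences. The abstract does not specify the authors' noise model; the homophone table, error rates, and function names here are illustrative assumptions only (a real setup might instead use a TTS-to-ASR loop or confusion statistics from an actual recognizer).

```python
import random

# Hypothetical homophone table (illustrative only); the paper's actual noise
# model may be derived from a TTS -> ASR pipeline or real ASR confusion data.
HOMOPHONES = {
    "two": ["to", "too"],
    "there": ["their", "they're"],
    "buy": ["by", "bye"],
    "write": ["right", "rite"],
}


def simulate_asr_noise(sentence, sub_prob=0.1, del_prob=0.05, seed=None):
    """Inject simple ASR-like errors (word deletions and homophone
    substitutions) into a clean sentence, approximating the kind of noisy
    training data used to train more robust NLU models."""
    rng = random.Random(seed)
    noisy = []
    for word in sentence.split():
        r = rng.random()
        if r < del_prob:
            # Simulate a deletion error: the word is dropped entirely.
            continue
        if r < del_prob + sub_prob and word.lower() in HOMOPHONES:
            # Simulate a substitution error with an acoustically similar word.
            noisy.append(rng.choice(HOMOPHONES[word.lower()]))
            continue
        noisy.append(word)
    return " ".join(noisy)


if __name__ == "__main__":
    clean = "I would like to buy two tickets for the show over there"
    print(simulate_asr_noise(clean, seed=0))
```

In practice, each clean training example would be paired with one or more noisy variants like this, and the NLU model would be trained on the mixture so that its predictions remain stable under mistranscriptions.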