irie18@interspeech_2018@ISCA

Total: 1

#1 Investigation on Estimation of Sentence Probability by Combining Forward, Backward and Bi-directional LSTM-RNNs

Authors: Kazuki Irie; Zhihong Lei; Liuhui Deng; Ralf Schlüter; Hermann Ney

Combining forward and backward long short-term memory (LSTM) recurrent neural network (RNN) language models is a popular approach to improving the estimation of sequence probability in second-pass N-best list rescoring for automatic speech recognition (ASR). In this work, we push this idea further by proposing a combination of three models: a forward LSTM language model, a backward LSTM language model, and a bi-directional LSTM-based gap completion model. We derive this combination method from a forward-backward decomposition of the sequence probability. We carry out experiments on the Switchboard speech recognition task. While we empirically find that the three-model combination yields slight perplexity improvements over the combination of forward and backward models, we finally show that combining the same number of forward models gives the best perplexity and word error rate (WER) overall.
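The forward and backward factorizations behind this decomposition are both exact by the chain rule: a forward LSTM estimates the left-to-right terms and a backward LSTM the right-to-left terms, while the gap completion model conditions on context from both sides. A brief statement in LaTeX (notation chosen here for illustration, not taken from the paper):

```latex
% Two exact chain-rule factorizations of the sequence probability:
p(w_1^N) = \prod_{n=1}^{N} p(w_n \mid w_1^{n-1})    % forward LM
         = \prod_{n=1}^{N} p(w_n \mid w_{n+1}^{N})  % backward LM
% The gap completion model instead estimates, for each position n,
% p(w_n \mid w_1^{n-1}, w_{n+1}^{N}),
% i.e. the probability of a word given both its left and right context.
```

At rescoring time, such models are commonly combined as a log-linear interpolation of whole-hypothesis scores. The Python sketch below illustrates only this general recipe; the interpolation weights and the scoring callables (score_fwd, score_bwd, score_gap) are hypothetical placeholders, not the paper's implementation:

```python
# Minimal sketch of N-best rescoring with a log-linear combination of
# three language-model scores (hypothetical weights and interfaces).

# Assumed weights; in practice they would be tuned on held-out data.
LAMBDA_FWD, LAMBDA_BWD, LAMBDA_GAP = 0.4, 0.4, 0.2

def rescore_nbest(hypotheses, score_fwd, score_bwd, score_gap):
    """Rerank an N-best list by a combined acoustic + LM score.

    hypotheses: list of (word_sequence, acoustic_log_score) pairs.
    score_*:    callables returning the total log-probability a model
                assigns to a full word sequence (assumed interfaces).
    Returns the highest-scoring hypothesis under the combined score.
    """
    def combined(hyp):
        words, am_log_score = hyp
        lm_log_score = (LAMBDA_FWD * score_fwd(words)
                        + LAMBDA_BWD * score_bwd(words)
                        + LAMBDA_GAP * score_gap(words))
        # Language-model scale factor omitted for brevity.
        return am_log_score + lm_log_score

    return max(hypotheses, key=combined)
```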