yang15b@interspeech_2015@ISCA

Total: 1

#1 Dialog state tracking using long short-term memory neural networks

Authors: Xiaohao Yang; Jia Liu

Neural network based approaches have recently shown state-of-the-art performance in the Dialog State Tracking Challenge (DSTC). In DSTC, a tracker assigns a label to the dialog state at each moment in an input dialog sequence. In particular, deep neural networks (DNNs) and simple recurrent neural networks (RNNs) have significantly improved dialog state tracking performance. In this paper, we investigate long short-term memory (LSTM) neural networks, which contain forget, input and output gates and are more powerful than simple RNNs, for the dialog state tracking task. To explicitly model the dependencies among output labels, we propose two different models on top of the un-normalized LSTM scores: a regression model and a conditional random field (CRF) model. We also apply a deep LSTM to the task. The method is evaluated on the second Dialog State Tracking Challenge (DSTC2) corpus, and the results demonstrate that our proposed models improve performance on the task.
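The abstract gives no implementation details, but its core idea (an LSTM reads the dialog turn by turn and emits un-normalized scores over state labels, on top of which a regression or CRF model can be placed) can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch example, not the authors' code; the feature dimension, label set, and layer sizes are placeholder assumptions.

```python
# Minimal, hypothetical sketch of an LSTM-based dialog state tracker
# (illustrative only; not the implementation from the paper).
import torch
import torch.nn as nn


class LSTMTracker(nn.Module):
    def __init__(self, feat_dim=300, hidden_dim=128, num_labels=20, num_layers=1):
        super().__init__()
        # num_layers > 1 would correspond to the "deep LSTM" variant mentioned above.
        self.lstm = nn.LSTM(feat_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        # Linear layer producing un-normalized scores for each candidate label;
        # a regression or CRF model could be placed on top of these scores.
        self.score = nn.Linear(hidden_dim, num_labels)

    def forward(self, turn_feats):
        # turn_feats: (batch, num_turns, feat_dim) per-turn input features
        hidden, _ = self.lstm(turn_feats)   # (batch, num_turns, hidden_dim)
        return self.score(hidden)           # (batch, num_turns, num_labels)


# Toy usage with random tensors standing in for per-turn dialog features.
tracker = LSTMTracker()
feats = torch.randn(4, 10, 300)             # 4 dialogs, 10 turns each
scores = tracker(feats)                     # un-normalized label scores
labels = torch.randint(0, 20, (4, 10))      # dummy gold state labels
loss = nn.functional.cross_entropy(scores.view(-1, 20), labels.view(-1))
loss.backward()
```

In this sketch each turn's label is scored independently with a cross-entropy loss; a CRF layer over the per-turn scores would instead optimize a sequence-level likelihood, which is one way to model the dependencies among output labels described in the abstract.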