Interspeech 2016, ISCA


#1 Deep Convolutional Neural Networks with Layer-Wise Context Expansion and Attention

Authors: Dong Yu, Wayne Xiong, Jasha Droppo, Andreas Stolcke, Guoli Ye, Jinyu Li, Geoffrey Zweig

In this paper, we propose a deep convolutional neural network (CNN) with layer-wise context expansion and location-based attention for large-vocabulary speech recognition. In our model, each higher layer uses information from broader contexts, along both the time and frequency dimensions, than its immediate lower layer. We show that both the layer-wise context expansion and the location-based attention can be implemented using the element-wise matrix product and the convolution operation. For this reason, contrary to other CNNs, no pooling operation is used in our model. Experiments on the 309-hour Switchboard task and the 375-hour short message dictation task indicate that our model significantly outperforms both DNNs and LSTMs.
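The core mechanics described above, an element-wise attention weighting followed by a context-expanding convolution with no pooling, can be sketched in NumPy. This is a minimal illustration, not the authors' exact architecture: the layer function `lace_layer`, the kernel shapes, and the use of a valid (unpadded) 1-D convolution along time are all simplifying assumptions made here; the real model also expands context along the frequency dimension and uses learned parameters.

```python
import numpy as np

def conv1d_time(x, kernel):
    """Valid 1-D convolution along the time axis.

    x: (T, F) feature matrix, kernel: (K, F) per-feature filter.
    Each output frame sums K weighted input frames, so every layer
    widens the temporal context seen by the layer above it.
    """
    T, F = x.shape
    K = kernel.shape[0]
    out = np.zeros((T - K + 1, F))
    for t in range(T - K + 1):
        # Element-wise matrix product with the kernel, summed over time.
        out[t] = np.sum(x[t:t + K] * kernel, axis=0)
    return out

def lace_layer(x, kernel, attn):
    """One hypothetical layer: element-wise attention, then convolution.

    attn is a location-based weight matrix applied by element-wise
    product (no pooling anywhere), matching the operations named in
    the abstract. ReLU is an assumed nonlinearity.
    """
    weighted = x * attn[:x.shape[0]]
    return np.maximum(conv1d_time(weighted, kernel), 0.0)

# Stacking layers expands context: with K=3, one layer sees 3 frames,
# two stacked layers see 5 frames, and so on.
x = np.ones((10, 4))          # 10 time frames, 4 features
kernel = np.ones((3, 4))      # 3-frame context window
attn = np.ones((10, 4))       # uniform attention (illustrative only)

y1 = lace_layer(x, kernel, attn)   # shape (8, 4)
y2 = lace_layer(y1, kernel, attn)  # shape (6, 4), wider receptive field
```

With all-ones inputs and kernels, each first-layer output is 3.0 (three frames summed) and each second-layer output is 9.0, making the layer-by-layer context growth easy to trace.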