vu12@interspeech_2012@ISCA

Total: 1

#1 Initialization schemes for multilayer perceptron training and their impact on ASR performance using multilingual data

Authors: Ngoc Thang Vu, Wojtek Breiter, Florian Metze, Tanja Schultz

In this paper we present our latest investigation of initialization schemes for Multilayer Perceptron (MLP) training using multilingual data. We show that the overall performance of an MLP network improves significantly when it is initialized with a multilingual MLP. We propose a new strategy called "open target language" MLP to train more flexible models for language adaptation, which is particularly suited to small amounts of training data. Furthermore, applying Bottle-Neck (BN) features initialized with a multilingual MLP increases ASR performance both on the languages used for multilingual MLP training and on a new language. Our experiments show word error rate improvements of up to 16.9% relative on a range of tasks for different target languages (Creole and Vietnamese) with manually and automatically transcribed training data.
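To make the initialization idea concrete, the following is a minimal sketch (not the authors' code) of cross-lingual MLP initialization: the hidden layers of a target-language MLP are copied from a previously trained multilingual MLP, while the language-specific output layer is re-initialized, and BN features are read off the narrow bottleneck layer. All layer sizes, the input dimensionality, and the use of PyTorch are illustrative assumptions, not details from the paper.

```python
# Hedged sketch of multilingual MLP initialization and bottleneck (BN) feature
# extraction. Layer sizes (351 -> 1500 -> 1500 -> 42 -> targets) are assumed
# for illustration only.
import torch
import torch.nn as nn

def make_mlp(n_in, hidden, n_bn, n_out):
    """Feed-forward MLP with a narrow bottleneck layer before the output."""
    return nn.Sequential(
        nn.Linear(n_in, hidden), nn.Sigmoid(),
        nn.Linear(hidden, hidden), nn.Sigmoid(),
        nn.Linear(hidden, n_bn), nn.Sigmoid(),   # bottleneck (BN) layer
        nn.Linear(n_bn, n_out),                  # language-specific targets
    )

# 1) Multilingual MLP, trained beforehand on pooled data from several source
#    languages; here it only serves as a source of initial weights.
multi_mlp = make_mlp(n_in=351, hidden=1500, n_bn=42, n_out=3000)

# 2) Target-language MLP: copy every layer except the output layer, which is
#    sized for the new language's targets and left randomly initialized.
target_mlp = make_mlp(n_in=351, hidden=1500, n_bn=42, n_out=120)
with torch.no_grad():
    for src, dst in zip(multi_mlp[:-1], target_mlp[:-1]):
        if isinstance(src, nn.Linear):
            dst.weight.copy_(src.weight)
            dst.bias.copy_(src.bias)

# 3) BN features: activations of the bottleneck layer, used as ASR features.
bn_extractor = target_mlp[:6]                 # up to and including the BN layer
features = bn_extractor(torch.randn(1, 351))  # -> shape (1, 42)
```

After this initialization, the target-language MLP would be fine-tuned on the (possibly small) amount of target-language data before the BN features are extracted for the ASR front end.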