wang07d@interspeech_2007@ISCA

Total: 1

#1 Integrating MAP, marginals, and unsupervised language model adaptation

Authors: Wen Wang; Andreas Stolcke

We investigate the integration of several language model adaptation approaches for a cross-genre adaptation task, aiming to improve Mandarin ASR performance on a recently introduced genre, broadcast conversation (BC). The strategies studied include unsupervised language model adaptation from ASR transcripts, as well as ways to integrate supervised Maximum A Posteriori (MAP) and marginal adaptation within the unsupervised adaptation framework; their efficacy is evaluated in terms of ASR performance. We find that effectively combining these adaptation approaches yields up to a 1.3% absolute (6% relative) reduction in the final recognition error rate on the BC genre.
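The abstract only names the adaptation techniques; as a rough illustration of the underlying ideas, the sketch below shows MAP-style count mixing (a background model acting as a prior over in-domain counts) combined with linear interpolation against a model estimated from ASR transcripts. It is a minimal unigram toy, not the paper's method: the function names, the tau and lam parameters, and the example data are all hypothetical.

```python
from collections import Counter

def train_unigram(tokens):
    """Maximum-likelihood unigram estimates from a token list."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def map_adapt(background_probs, adaptation_tokens, tau=10.0):
    """MAP-style count mixing: the background model acts as a prior
    whose strength is controlled by tau (hypothetical parameter name)."""
    counts = Counter(adaptation_tokens)
    total = sum(counts.values())
    vocab = set(background_probs) | set(counts)
    return {
        w: (counts[w] + tau * background_probs.get(w, 0.0)) / (total + tau)
        for w in vocab
    }

def interpolate(p1, p2, lam=0.5):
    """Linear interpolation of two models, e.g. a supervised MAP-adapted LM
    with one estimated from automatic (ASR) transcripts."""
    vocab = set(p1) | set(p2)
    return {w: lam * p1.get(w, 0.0) + (1 - lam) * p2.get(w, 0.0) for w in vocab}

# Toy usage: background out-of-genre text, a small supervised BC sample,
# and unsupervised ASR transcripts of BC audio (all invented here).
background = train_unigram("news text about markets and politics".split())
supervised_bc = "well you know I think that is right".split()
asr_transcripts = "yeah I think so you know".split()

adapted = map_adapt(background, supervised_bc, tau=5.0)
final_lm = interpolate(adapted, train_unigram(asr_transcripts), lam=0.7)
print(sorted(final_lm.items(), key=lambda kv: -kv[1])[:5])
```

In practice such adaptation is carried out over n-gram models with smoothing, and marginal adaptation additionally constrains the adapted model to match target-domain unigram marginals; the unigram mixing above only conveys the general shape of the combination.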