2022.emnlp-industry.11@ACL


Towards Need-Based Spoken Language Understanding Model Updates: What Have We Learned?

Authors: Quynh Do; Judith Gaspers; Daniil Sorokin; Patrick Lehnen

In productionized machine learning systems, online model performance is known to deteriorate over time when there is a distributional drift between offline training data and online application data. As a remedy, models are typically retrained at fixed time intervals, incurring high computational and manual costs. This work aims to decrease such costs in productionized, large-scale Spoken Language Understanding systems. In particular, we develop a need-based retraining strategy guided by an efficient drift detector and discuss the arising challenges, including system complexity, overlapping model releases, limited observability, and the absence of annotated resources at runtime. We present empirical results on historical data and confirm the utility of our design decisions via an online A/B experiment.
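The core idea of need-based retraining can be sketched as follows. The abstract does not specify the authors' drift detector, so this is a minimal illustrative sketch under assumed choices: it compares a reference sample of offline scores against a window of online scores using the two-sample Kolmogorov-Smirnov statistic, and triggers retraining only when the measured drift exceeds a threshold. The function names and the threshold value are assumptions, not the paper's design.

```python
# Illustrative sketch of a drift-triggered ("need-based") retraining check.
# The detector choice (two-sample KS statistic) and the threshold are
# assumptions for illustration; the paper's actual detector is not given here.

def ks_statistic(sample_a, sample_b):
    """Two-sample KS statistic: max gap between the empirical CDFs."""
    a = sorted(sample_a)
    b = sorted(sample_b)
    i = j = 0
    d = 0.0
    # Walk both sorted samples, tracking the largest CDF difference.
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d

def needs_retraining(reference_scores, online_window, threshold=0.15):
    """Trigger a model update only when observed drift exceeds threshold,
    instead of retraining at fixed time intervals."""
    return ks_statistic(reference_scores, online_window) > threshold
```

In a production loop, `reference_scores` would come from the offline training distribution and `online_window` from a recent slice of runtime traffic; retraining is scheduled only when `needs_retraining` fires, rather than on a fixed cadence.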