2021.naacl-industry.18@ACL


Language Scaling for Universal Suggested Replies Model

Authors: Qianlan Ying, Payal Bajaj, Budhaditya Deb, Yu Yang, Wei Wang, Bojia Lin, Milad Shokouhi, Xia Song, Yang Yang, Daxin Jiang

We consider the problem of scaling automated suggested replies for a commercial email application to multiple languages. Faced with increased compute requirements and scarce resources for new languages, we build a single universal model that improves quality and reduces the run-time costs of our production system. However, restricted data movement across regional data centers prevents joint training across languages. To this end, we propose a multi-lingual multi-task continual learning framework, with auxiliary tasks and language adapters, to learn universal language representations across regions. The experimental results show positive cross-lingual transfer across languages while reducing catastrophic forgetting across regions. Our online results on real user traffic show significant gains in click-through rate (CTR) and characters saved, as well as a 65% reduction in training cost compared with per-language models. As a consequence, we have scaled the feature to multiple languages, including low-resource markets.
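The abstract does not spell out how the language adapters attach to the shared model, so the following is only a minimal sketch of the general bottleneck-adapter pattern commonly used for multi-lingual transfer: a frozen shared sub-layer wrapped with small trainable per-language modules. All class names, dimensions, and language codes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Small bottleneck module applied after a shared sub-layer.

    One adapter is kept per language; the shared weights stay frozen
    while only the active language's adapter is trained.
    """
    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection preserves the frozen backbone's representation.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


class MultilingualAdapterLayer(nn.Module):
    """Wraps a shared (frozen) sub-layer with per-language adapters."""
    def __init__(self, shared_layer: nn.Module, hidden_size: int,
                 languages: list[str]):
        super().__init__()
        self.shared_layer = shared_layer
        # Universal weights are shared; freeze them so only adapters update.
        for p in self.shared_layer.parameters():
            p.requires_grad = False
        self.adapters = nn.ModuleDict(
            {lang: BottleneckAdapter(hidden_size) for lang in languages}
        )

    def forward(self, x: torch.Tensor, lang: str) -> torch.Tensor:
        # Route the batch through the adapter of the requested language.
        return self.adapters[lang](self.shared_layer(x))


# Usage: run a batch through the (hypothetical) English adapter.
layer = MultilingualAdapterLayer(nn.Linear(768, 768), hidden_size=768,
                                 languages=["en", "pt", "fr"])
out = layer(torch.randn(2, 16, 768), lang="en")
```

Because each region would only update its own languages' adapters while the backbone stays frozen, a scheme like this is one plausible way to limit catastrophic forgetting when training proceeds region by region, as the abstract describes.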