2020.findings-emnlp.18@ACL

# Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP

Authors: Hao Fei; Yafeng Ren; Donghong Ji

Syntax has been shown to be useful for various NLP tasks, yet existing work mostly encodes a single syntactic tree with one hierarchical neural network. In this paper, we investigate a simple and effective method, knowledge distillation, for integrating heterogeneous structural knowledge into a unified sequential LSTM encoder. Experimental results on four typical syntax-dependent tasks show that our method outperforms tree encoders by effectively integrating rich heterogeneous structural syntax while reducing error propagation, and also outperforms ensemble methods in terms of both efficiency and accuracy.
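The abstract does not spell out the distillation objective, so the following is only a minimal sketch of standard knowledge distillation adapted to this setting: a sequential student mimics the temperature-softened predictions averaged over several heterogeneous tree-encoder teachers (soft loss) while still fitting the gold labels (hard loss). The function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits_list, labels,
                      T=2.0, alpha=0.5):
    """Hypothetical distillation objective: KL from the averaged soft
    targets of heterogeneous teachers, plus cross-entropy on gold labels.
    """
    # Average soft targets over the heterogeneous tree-encoder teachers.
    soft_targets = np.mean([softmax(t, T) for t in teacher_logits_list], axis=0)
    student_soft = softmax(student_logits, T)
    # KL(teacher || student), scaled by T^2 as in standard distillation.
    kl = np.sum(soft_targets * (np.log(soft_targets + 1e-12)
                                - np.log(student_soft + 1e-12)), axis=-1)
    soft_loss = (T ** 2) * kl.mean()
    # Hard cross-entropy against the gold labels.
    probs = softmax(student_logits)
    hard_loss = -np.mean(
        np.log(probs[np.arange(len(labels)), labels] + 1e-12))
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

A student whose logits agree with the teachers and the gold labels incurs a lower loss than one that contradicts them, which is the signal that lets a single sequential encoder absorb the teachers' structural knowledge.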