2021.naacl-srw.7@ACL

Syntax-Based Attention Masking for Neural Machine Translation

Authors: Colin McDonald, David Chiang

We present a simple method for extending transformers to source-side trees. We define a number of masks that limit self-attention based on relationships among tree nodes, and we allow each attention head to learn which mask or masks to use. On translation from English to various low-resource languages, and translation in both directions between English and German, our method always improves over simple linearization of the source-side parse tree and almost always improves over a sequence-to-sequence baseline, by up to +2.1 BLEU.
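
The abstract sketches the key mechanism: boolean masks defined over relations between tree nodes, with each attention head learning which mask or masks to apply. As a hedged illustration only, not the paper's implementation, the following PyTorch sketch assumes the source tree is a dependency parse encoded as a head-index array, and lets a single head learn a differentiable soft mixture over four illustrative relation masks (self, parent, child, sibling). The function names, the choice of relations, and the soft-mixture mechanism are all assumptions made for this sketch.

```python
# Hedged sketch of syntax-based attention masking; NOT the authors' code.
# Assumption: the source-side tree is a dependency parse given as `heads`,
# where heads[i] is the parent index of token i (-1 for the root).
import torch
import torch.nn.functional as F

def relation_masks(heads):
    """Boolean (n, n) masks for four tree relations among source tokens."""
    n = len(heads)
    heads_t = torch.tensor(heads)
    idx = torch.arange(n)
    parent = torch.zeros(n, n, dtype=torch.bool)
    for i, h in enumerate(heads):
        if h >= 0:
            parent[i, h] = True                      # i may attend to its parent h
    child = parent.t().clone()                       # inverse of the parent relation
    same_head = heads_t.unsqueeze(0) == heads_t.unsqueeze(1)
    sibling = same_head & (idx.unsqueeze(0) != idx.unsqueeze(1))
    self_mask = torch.eye(n, dtype=torch.bool)
    return {"self": self_mask, "parent": parent,
            "child": child, "sibling": sibling}

def masked_head_attention(q, k, v, masks, mask_logits):
    """One attention head whose mask is a learned softmax mixture of masks.

    mask_logits is a learnable (num_masks,) vector, so the head can learn
    which tree relations it is allowed to attend through.
    """
    stacked = torch.stack([m.float() for m in masks.values()])  # (M, n, n)
    weights = F.softmax(mask_logits, dim=0)                     # (M,)
    soft_mask = torch.einsum("m,mij->ij", weights, stacked)     # (n, n)
    scores = (q @ k.t()) / q.size(-1) ** 0.5
    # Near-zero mask entries become large negative biases, i.e. masked out.
    scores = scores + soft_mask.clamp_min(1e-9).log()
    return F.softmax(scores, dim=-1) @ v

# Toy usage: a 4-token sentence whose parse has token 1 as the root.
heads = [1, -1, 1, 2]
masks = relation_masks(heads)
q = k = v = torch.randn(4, 8)
mask_logits = torch.zeros(len(masks), requires_grad=True)  # learned per head
out = masked_head_attention(q, k, v, masks, mask_logits)
print(out.shape)  # torch.Size([4, 8])
```

A softmax mixture is one differentiable way to let a head "learn which mask or masks to use"; the paper's actual mask definitions, and whether its selection is hard or soft, may differ.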