P19-1009@ACL


#1 AMR Parsing as Sequence-to-Graph Transduction

Authors: Sheng Zhang; Xutai Ma; Kevin Duh; Benjamin Van Durme

We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers, which rely on pre-trained aligners, external semantic resources, or data augmentation, our proposed parser is aligner-free and can be effectively trained with limited amounts of labeled AMR data. Our parser surpasses all previously reported SMATCH scores on both AMR 2.0 (76.3% on LDC2017T10) and AMR 1.0 (70.2% on LDC2014T12).
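To make the "sequence-to-graph transduction" idea concrete, below is a minimal, hypothetical sketch (not the authors' code, and not their exact architecture): a decoder emits one graph node per step and, at the same step, attends over previously generated nodes to pick a parent edge, so a labeled graph is built directly from the input sequence without any alignment step. All layer sizes, the toy vocabulary, and the greedy loop are illustrative assumptions.

```python
# Illustrative sketch of sequence-to-graph decoding (assumed design, not the paper's model).
import torch
import torch.nn as nn

class Seq2GraphSketch(nn.Module):
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.decoder_cell = nn.LSTMCell(hidden, hidden)
        self.node_out = nn.Linear(hidden, vocab_size)   # scores for the next node label
        self.head_query = nn.Linear(hidden, hidden)     # edge (parent) attention: query
        self.head_key = nn.Linear(hidden, hidden)       # edge (parent) attention: keys

    def greedy_parse(self, token_ids, max_nodes=10):
        # Encode the input token sequence.
        enc, _ = self.encoder(self.embed(token_ids))        # (1, T, H)
        h = enc.mean(dim=1).squeeze(0)                      # init decoder state from encoder
        c = torch.zeros_like(h)
        nodes, heads, node_states = [], [], []
        inp = torch.zeros(h.size(0))                        # stand-in for a start symbol
        for _ in range(max_nodes):
            h, c = self.decoder_cell(inp.unsqueeze(0),
                                     (h.unsqueeze(0), c.unsqueeze(0)))
            h, c = h.squeeze(0), c.squeeze(0)
            label = self.node_out(h).argmax().item()        # predict the next node label
            if node_states:
                # Attend over previously generated nodes to choose this node's parent.
                keys = self.head_key(torch.stack(node_states))   # (n, H)
                scores = keys @ self.head_query(h)               # (n,)
                heads.append(scores.argmax().item())
            else:
                heads.append(-1)                            # first node acts as the root
            nodes.append(label)
            node_states.append(h)
            inp = self.embed(torch.tensor(label))           # feed the prediction back in
        return nodes, heads                                 # node labels + parent indices

if __name__ == "__main__":
    model = Seq2GraphSketch(vocab_size=100)
    tokens = torch.randint(0, 100, (1, 5))                  # toy input sentence
    print(model.greedy_parse(tokens))
```

Because nodes and their incoming edges are predicted jointly in one decoding pass, a parser of this shape needs no externally induced word-to-node alignments, which is the "aligner-free" property the abstract emphasizes.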