N19-1022@ACL

Total: 1

#1 Recurrent models and lower bounds for projective syntactic decoding [PDF]

Author: Natalie Schluter

The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for the current state-of-the-art models of both shift-reduce and graph-based parsers, projective or not. We also provide the first proof of lower bounds for projective maximum spanning tree decoding.
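
The abstract does not spell out the decoding problem itself; as background, the standard dynamic program for projective maximum spanning tree decoding is Eisner's O(n^3) algorithm. Below is a minimal illustrative sketch (not the paper's construction), assuming arc scores are given as a matrix `scores[h, m]` for an arc from head `h` to modifier `m`, with token 0 acting as an artificial root.

```python
import numpy as np

def eisner_best_score(scores: np.ndarray) -> float:
    """Score of the best projective dependency tree under Eisner's algorithm.

    scores[h, m] is the score of an arc head h -> modifier m; index 0 is an
    artificial root token, so scores has shape (n + 1, n + 1).
    """
    n = scores.shape[0] - 1
    NEG = float("-inf")
    # complete[s, t, d] / incomplete[s, t, d]: best score of a span s..t whose
    # head sits at the right end (d = 0) or at the left end (d = 1).
    complete = np.full((n + 1, n + 1, 2), NEG)
    incomplete = np.full((n + 1, n + 1, 2), NEG)
    for s in range(n + 1):
        complete[s, s, 0] = complete[s, s, 1] = 0.0

    for k in range(1, n + 1):              # span length
        for s in range(0, n + 1 - k):
            t = s + k
            # Build an arc across the span: both halves must be complete.
            halves = complete[s, s:t, 1] + complete[s + 1:t + 1, t, 0]
            best = halves.max()
            incomplete[s, t, 0] = best + scores[t, s]   # arc t -> s
            incomplete[s, t, 1] = best + scores[s, t]   # arc s -> t
            # Absorb an incomplete item into a larger complete item.
            complete[s, t, 0] = (complete[s, s:t, 0]
                                 + incomplete[s:t, t, 0]).max()
            complete[s, t, 1] = (incomplete[s, s + 1:t + 1, 1]
                                 + complete[s + 1:t + 1, t, 1]).max()

    # The root (token 0) heads the whole sentence.
    return complete[0, n, 1]

# Example with a random 4-token sentence (scores are placeholders).
rng = np.random.default_rng(0)
print(eisner_best_score(rng.normal(size=(5, 5))))
```

The sketch only returns the optimal score; a full decoder would also keep backpointers to recover the tree, but the recurrence is the part relevant to the complexity claims discussed in the paper.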