2024.acl-short.14@ACL


#1 Bi-Directional Multi-Granularity Generation Framework for Knowledge Graph-to-Text with Large Language Model

Authors: Haowei Du; Chen Li; Dinghao Zhang; Dongyan Zhao

The knowledge graph-to-text (KG-to-text) generation task aims to synthesize coherent and engaging sentences that accurately convey the complex information derived from an input knowledge graph. Existing methods generate the entire target text from all KG triples at once and may associate incorrect KG triples with each sentence. To address this, we propose a bi-directional multi-granularity generation framework. Instead of generating the whole text in a single pass, we perform sentence-level generation conditioned on the corresponding triples and then compose the graph-level text from these sentences. Moreover, we design a backward relation extraction task to enhance the correctness of relational information. Our method achieves a new state of the art on the WebNLG benchmark, and further analysis demonstrates the effectiveness of each module.