30433@AAAI


#1 Improving Faithfulness in Abstractive Text Summarization with EDUs Using BART (Student Abstract)

Authors: Narjes Delpisheh; Yllias Chali

Abstractive text summarization captures the main information of a source document in a summary written in the summarizer's own words. Although it is more challenging to automate than extractive text summarization, recent advances in deep learning approaches and pre-trained language models have improved its performance. However, abstractive text summarization still suffers from issues such as unfaithfulness. To address this problem, we propose a new approach that utilizes important Elementary Discourse Units (EDUs) to guide BART-based text summarization. Our approach showed improvements in truthfulness and source-document coverage in comparison to previous studies.
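The abstract does not describe the authors' pipeline in detail, but the general idea of guiding a seq2seq summarizer with selected discourse units can be sketched as follows. This is an illustrative toy, not the paper's method: EDUs are approximated by punctuation-based splits (a real system would use a discourse segmenter), importance is approximated by overlap with the document's most frequent content words, and the selected units are prepended to the input that would be fed to a model such as BART. The function names and the `guidance:`/`document:` prompt format are assumptions for illustration.

```python
# Toy sketch of EDU-guided summarization input construction.
# NOT the authors' method: EDU segmentation and importance scoring
# are crude stand-ins for real discourse parsing and saliency models.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "it", "that"}

def split_edus(text):
    # Crude stand-in for a real EDU segmenter: split on clause punctuation.
    return [s.strip() for s in re.split(r"[.;,]", text) if s.strip()]

def top_content_words(text, k=5):
    # Most frequent non-stopword tokens serve as a cheap saliency signal.
    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]
    return {w for w, _ in Counter(words).most_common(k)}

def select_edus(text, n=2):
    # Rank EDUs by how many salient words they contain; keep the top n.
    keywords = top_content_words(text)
    edus = split_edus(text)
    scored = sorted(
        edus,
        key=lambda e: sum(w in keywords for w in re.findall(r"[a-z]+", e.lower())),
        reverse=True,
    )
    return scored[:n]

def build_guided_input(text, n=2):
    # Prepend the selected EDUs so a seq2seq summarizer (e.g., BART)
    # can attend to them alongside the full document.
    guidance = " | ".join(select_edus(text, n))
    return f"guidance: {guidance} document: {text}"
```

In a full system, `build_guided_input` would feed a pre-trained encoder-decoder, and the guidance would bias generation toward content actually present in the source, which is the kind of faithfulness signal the abstract describes.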