2024.iwslt-1.5@ACL

Total: 1

#1 Conditioning LLMs with Emotion in Neural Machine Translation

Authors: Charles Brazier; Jean-Luc Rouas

Large Language Models (LLMs) have shown remarkable performance on Natural Language Processing tasks, including Machine Translation (MT). In this work, we propose a novel MT pipeline that integrates emotion information extracted from a Speech Emotion Recognition (SER) model into LLMs to enhance translation quality. We first fine-tune five existing LLMs on the Libri-trans dataset and select the best-performing model. Subsequently, we augment LLM prompts with different dimensional emotions and train the selected LLM under these different configurations. Our experiments reveal that integrating emotion information, especially arousal, into LLM prompts leads to notable improvements in translation quality.
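The prompt-augmentation idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact scheme: the function name, the coarse binning of the arousal score, and the prompt wording are all assumptions; the abstract only states that dimensional emotion values from an SER model are added to the LLM prompt.

```python
def build_emotion_prompt(source_text, arousal,
                         src_lang="English", tgt_lang="French"):
    """Prepend a dimensional emotion cue (arousal) to an MT prompt.

    `arousal` is assumed to be a continuous score in [0, 1] produced by
    an SER model run on the source speech. Binning it into a coarse
    label is one plausible conditioning strategy (an illustrative
    choice, not necessarily the paper's).
    """
    if arousal < 0.33:
        level = "low"
    elif arousal < 0.66:
        level = "medium"
    else:
        level = "high"
    # The emotion tag is placed before the translation instruction so the
    # LLM can condition its output on it.
    return (
        f"[arousal: {level}] "
        f"Translate the following {src_lang} sentence into {tgt_lang}: "
        f"{source_text}"
    )

# Example: a high-arousal utterance from the source speech.
prompt = build_emotion_prompt("I can't believe we won!", arousal=0.9)
```

The English-to-French direction matches Libri-trans, the dataset used for fine-tuning in the paper.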