2025.emnlp-industry.142@ACL

#1 LLM Agents Implement an NLG System from Scratch: Building Interpretable Rule-Based RDF-to-Text Generators

Authors: Mateusz Lango, Ondrej Dusek

We present a novel neurosymbolic framework for RDF-to-text generation, in which the model is “trained” through collaborative interactions among multiple LLM agents rather than traditional backpropagation. The LLM agents produce rule-based Python code implementing a generator for the given domain, based on RDF triples only, with no in-domain human reference texts. The resulting system is fully interpretable, requires no supervised training data, and generates text nearly instantaneously using only a single CPU. Our experiments on the WebNLG and OpenDialKG datasets show that our approach reduces hallucination, at the cost of only a slight fluency penalty compared to finetuned or prompted language models.
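To make the idea concrete, the following is a minimal sketch (not taken from the paper) of the kind of interpretable, rule-based Python generator such agents could produce; the predicate names, templates, and fallback rule are hypothetical.

from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)

# Illustrative per-predicate verbalization rules; an agent-written generator
# would cover the predicates of the target domain.
RULES = {
    "birthPlace": "{s} was born in {o}.",
    "occupation": "{s} worked as a {o}.",
    "capitalOf": "{s} is the capital of {o}.",
}

def verbalize(triples: List[Triple]) -> str:
    """Turn a list of RDF triples into text by applying per-predicate rules."""
    sentences = []
    for s, p, o in triples:
        template = RULES.get(p, "{s} {p} {o}.")  # generic fallback pattern
        sentences.append(template.format(s=s.replace("_", " "), p=p, o=o))
    return " ".join(sentences)

print(verbalize([("Alan_Bean", "birthPlace", "Wheeler, Texas"),
                 ("Alan_Bean", "occupation", "test pilot")]))

A generator of this form is fully transparent, since every output sentence can be traced back to the rule that produced it, and it runs on a single CPU with no model inference at generation time.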

Subject: EMNLP.2025 - Industry Track