2025.findings-acl.1048@ACL


#1 SynapticRAG: Enhancing Temporal Memory Retrieval in Large Language Models through Synaptic Mechanisms

Authors: Yuki Hou, Haruki Tamoto, Qinghua Zhao, Homei Miyashita

Existing retrieval methods in Large Language Models degrade in accuracy when handling temporally distributed conversations, primarily because they rely on simple similarity-based retrieval. Unlike memory retrieval methods that depend solely on semantic similarity, we propose SynapticRAG, which combines temporal association triggers with biologically inspired synaptic propagation mechanisms: the triggers and synaptic-like stimulus propagation identify relevant dialogue histories, and a dynamic leaky integrate-and-fire mechanism then selects the most contextually appropriate memories. Experiments on four English, Chinese, and Japanese datasets show that SynapticRAG consistently outperforms state-of-the-art memory retrieval methods across multiple metrics, by up to 14.66 percentage points. This work bridges cognitive science and language model development, providing a new framework for memory management in conversational systems.
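To illustrate the kind of mechanism the abstract describes, here is a minimal sketch of a leaky integrate-and-fire selector over candidate memories. This is not the paper's implementation: the function name `lif_select`, the `tau` and `threshold` parameters, and the stimulus representation are all illustrative assumptions. Each memory accumulates time-stamped stimulus values with exponential leak between events; memories whose potential crosses the threshold "fire" and are selected.

```python
import math

def lif_select(stimuli, tau=2.0, threshold=1.0):
    """Toy leaky integrate-and-fire memory selector (illustrative only).

    stimuli: dict mapping memory_id -> list of (timestamp, value) pairs,
             e.g. similarity scores of a memory to incoming dialogue turns.
    tau:     leak time constant; larger tau means slower decay.
    threshold: firing threshold on the accumulated potential.
    Returns the list of memory ids whose potential ever crossed threshold.
    """
    fired = []
    for mem_id, events in stimuli.items():
        potential, last_t = 0.0, None
        for t, v in sorted(events):
            if last_t is not None:
                # Exponential leak over the gap since the previous stimulus.
                potential *= math.exp(-(t - last_t) / tau)
            potential += v  # Integrate the new stimulus.
            last_t = t
            if potential >= threshold:  # Fire: select this memory.
                fired.append(mem_id)
                break
    return fired
```

Two closely spaced sub-threshold stimuli can sum past the threshold, while the same stimuli spread far apart in time leak away first, which is how a selector of this shape can favor temporally clustered, contextually relevant memories over ones that are merely similar.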

Subject: ACL.2025 - Findings