Total: 1

#1 One Filters All: A Generalist Filter for State Estimation

Authors: Shiqi Liu, Wenhan Cao, Chang Liu, Zeyu He, Tianyi Zhang, Yinuo Wang, Shengbo Eben Li

Estimating the hidden states of dynamical systems, also known as optimal filtering, is a long-standing problem across science and engineering. In this paper, we introduce a general filtering framework, $\textbf{LLM-Filter}$, which leverages large language models (LLMs) for state estimation by embedding noisy observations with text prototypes. In experiments on classical dynamical systems, we find, first, that state estimation can benefit significantly from the knowledge embedded in pre-trained LLMs: with proper modality alignment to the frozen LLM, LLM-Filter outperforms state-of-the-art learning-based approaches. Second, we carefully design the prompt structure, System-as-Prompt (SaP), whose task instructions enable the LLM to understand the filtering task and adapt to specific systems. Guided by these prompts, LLM-Filter exhibits exceptional generalization, performing filtering accurately in changed and even unseen environments. We further observe scaling-law behavior in LLM-Filter, with accuracy improving as model size and training time grow. These findings make LLM-Filter a promising foundation model for filtering.
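The abstract describes the architecture only at a high level; the sketch below is one plausible reading of it, not the paper's actual implementation. A trainable alignment layer maps each noisy observation to a mixture of frozen text-prototype embeddings, a learned stand-in for the tokenized System-as-Prompt is prepended, a frozen Transformer (standing in for the pre-trained LLM backbone) processes the sequence causally, and a small trainable head decodes per-step state estimates. All module names, dimensions, and the use of `nn.TransformerEncoder` in place of a real LLM are assumptions for illustration.

```python
# Hypothetical sketch of the LLM-Filter idea from the abstract. Names,
# sizes, and the toy frozen backbone are illustrative, not the paper's.
import torch
import torch.nn as nn

D_MODEL, N_PROTO, OBS_DIM, STATE_DIM = 256, 64, 3, 4

class PrototypeAligner(nn.Module):
    """Map raw observations into the backbone's embedding space by
    attending over a frozen bank of text-prototype embeddings."""
    def __init__(self):
        super().__init__()
        # Stand-in for prototype embeddings taken from a pre-trained LLM's
        # vocabulary; frozen, matching the abstract's frozen-LLM setup.
        self.prototypes = nn.Parameter(torch.randn(N_PROTO, D_MODEL),
                                       requires_grad=False)
        self.query = nn.Linear(OBS_DIM, D_MODEL)  # trainable alignment layer

    def forward(self, obs):                       # obs: (batch, seq, OBS_DIM)
        q = self.query(obs)                       # (batch, seq, D_MODEL)
        attn = torch.softmax(q @ self.prototypes.T / D_MODEL ** 0.5, dim=-1)
        return attn @ self.prototypes             # convex mix of prototypes

class LLMFilterSketch(nn.Module):
    def __init__(self, prompt_len=8):
        super().__init__()
        self.aligner = PrototypeAligner()
        # Learned stand-in for the tokenized System-as-Prompt (task
        # instructions plus system description) prepended to the sequence.
        self.sap_prompt = nn.Parameter(torch.randn(prompt_len, D_MODEL))
        # Frozen stand-in for the pre-trained LLM backbone.
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=8, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=4)
        for p in self.backbone.parameters():
            p.requires_grad = False
        self.head = nn.Linear(D_MODEL, STATE_DIM)  # trainable state decoder

    def forward(self, obs):
        tokens = self.aligner(obs)                            # obs tokens
        prompt = self.sap_prompt.expand(obs.size(0), -1, -1)  # prepend SaP
        x = torch.cat([prompt, tokens], dim=1)
        # Causal mask: a filter may only use past and current observations.
        mask = nn.Transformer.generate_square_subsequent_mask(x.size(1))
        h = self.backbone(x, mask=mask)
        return self.head(h[:, -obs.size(1):])     # one estimate per step

obs = torch.randn(2, 20, OBS_DIM)                 # batch of noisy trajectories
print(LLMFilterSketch()(obs).shape)               # torch.Size([2, 20, 4])
```

In this reading, only the alignment layer, the prompt embedding, and the output head are trained, consistent with the abstract's claim that the LLM itself stays frozen and that modality alignment is what unlocks its embedded knowledge for filtering.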

Subject: NeurIPS.2025 - Poster