4QRoLzD11x@OpenReview

#1 From Softmax to Score: Transformers Can Effectively Implement In-Context Denoising Steps

Authors: Paul Rosu, Lawrence Carin, Xiang Cheng

Transformers have emerged as powerful meta-learners, with growing evidence that they implement learning algorithms within their forward pass. We study this phenomenon in the context of denoising, presenting a unified framework that shows Transformers can implement (a) manifold denoising via Laplacian flows, (b) score-based denoising from diffusion models, and (c) a generalized form of anisotropic diffusion denoising. Our theory establishes exact equivalence between Transformer attention updates and these algorithms. Empirically, we validate these findings on image denoising tasks, showing that even simple Transformers can perform robust denoising both with and without context. These results illustrate the Transformer’s flexibility as a denoising meta-learner. Code available at https://github.com/paulrosu11/Transformers_are_Diffusion_Denoisers.
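A minimal sketch of the kind of equivalence the abstract describes: with queries, keys, and values all tied to the noisy points themselves, one softmax self-attention update is a kernel-smoothing (Laplacian-flow-like) denoising step that pulls each point toward a kernel-weighted mean of its neighbors. This is an illustrative assumption, not the authors' construction; the function name, temperature, and step size here are hypothetical.

```python
import numpy as np

def attention_denoise_step(X, tau=0.5, step=1.0):
    """One self-attention update over noisy points X of shape (n, d).

    With Q = K = V = X, the update X <- X + step * (softmax(X X^T / tau) X - X)
    is a kernel-smoothing step: each point moves toward the attention-weighted
    mean of the others. Hypothetical sketch, not the paper's exact construction.
    """
    S = X @ X.T / tau                      # attention logits
    S = S - S.max(axis=1, keepdims=True)   # shift for numerical stability
    A = np.exp(S)
    A /= A.sum(axis=1, keepdims=True)      # row-stochastic attention weights
    return X + step * (A @ X - X)          # move toward the kernel mean

# Toy check: noisy samples around two cluster centers contract toward them.
rng = np.random.default_rng(0)
centers = np.array([[2.0, 0.0], [-2.0, 0.0]])
X = np.repeat(centers, 50, axis=0) + 0.3 * rng.standard_normal((100, 2))
X1 = attention_denoise_step(X)
```

With a small enough temperature, cross-cluster attention weights are negligible, so each cluster collapses toward its own mean; iterating the step recovers the manifold-denoising flavor of Laplacian flows mentioned in the abstract.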

Subject: NeurIPS.2025 - Poster