1v3XEcRMyP@OpenReview


#1 Iterative Vectors: In-Context Gradient Steering without Backpropagation

Authors: Yiting Liu, Zhi-Hong Deng

In-context learning has become a standard approach for utilizing language models. However, selecting and processing suitable demonstration examples can be challenging and time-consuming, especially when dealing with large numbers of them. We propose Iterative Vectors (IVs), a technique that explores activation space to enhance in-context performance by simulating gradient updates during inference. IVs extract and iteratively refine activation-based meta-gradients, applying them during inference without requiring backpropagation at any stage. We evaluate IVs across various tasks using four popular models and observe significant improvements. Our findings suggest that in-context activation steering is a promising direction, opening new avenues for future research.
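The core idea of activation steering can be sketched as follows: extract a vector from hidden activations (here, as a toy proxy, the difference between a run with demonstrations and a zero-shot run), refine it iteratively, and add it to the activations at inference time with no backpropagation. This is a minimal illustration with a toy two-layer network, not the paper's implementation; the extraction procedure, refinement rule, and network are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network standing in for one transformer block (hypothetical).
W1 = rng.normal(size=(8, 8))
W2 = rng.normal(size=(8, 4))

def forward(x, steer=None):
    h = np.tanh(x @ W1)      # hidden activations
    if steer is not None:
        h = h + steer        # activation steering: add a vector, no backprop
    return h @ W2

# "Meta-gradient" proxy: activation difference between an input built with
# demonstrations and a zero-shot input (a stand-in for the paper's
# activation-based extraction, which differs in detail).
x_zero_shot = rng.normal(size=8)
x_with_demos = rng.normal(size=8)
delta = np.tanh(x_with_demos @ W1) - np.tanh(x_zero_shot @ W1)

# Iteratively refine the steering vector with a fixed step size 0.5
# (an exponential moving average toward delta; refinement rule is assumed).
iv = np.zeros(8)
for _ in range(5):
    iv = iv + 0.5 * (delta - iv)

# Apply the refined vector at inference time.
out = forward(x_zero_shot, steer=iv)
print(out.shape)  # (4,)
```

The key property the sketch preserves is that both extraction and application operate purely on forward-pass activations: no gradients flow through `W1` or `W2` at any point.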

Subject: ICML.2025 - Poster