EvoGrad: Evolutionary-Weighted Gradient and Hessian Learning for Black-Box Optimization

Authors: Yedidya Kfir, Elad Sarafian, Yoram Louzoun, Sarit Kraus

Black-box algorithms aim to optimize functions without access to their analytical structure or gradient information, making them essential when gradients are unavailable or computationally expensive to obtain. Traditional methods for black-box optimization (BBO) primarily utilize non-parametric models, but these approaches often struggle to scale effectively in large input spaces. Conversely, parametric approaches, which rely on neural estimators and gradient signals via backpropagation, frequently encounter substantial gradient estimation errors, limiting their reliability. Explicit Gradient Learning (EGL), a recent advancement, directly learns gradients using a first-order Taylor approximation and has demonstrated superior performance compared to both parametric and non-parametric methods. However, EGL inherently remains local and myopic, often faltering on highly non-convex optimization landscapes. In this work, we address this limitation by integrating global statistical insights from the evolutionary algorithm CMA-ES into the gradient learning framework, effectively biasing gradient estimates towards regions with higher optimization potential. Moreover, we enhance the gradient learning process by estimating the Hessian matrix, allowing us to correct the second-order residual of the Taylor series approximation. Our proposed algorithm, EvoGrad2 (Evolutionary Gradient Learning with second-order approximation), achieves state-of-the-art results on the synthetic COCO test suite, exhibiting significant advantages in high-dimensional optimization problems. We further demonstrate EvoGrad2's effectiveness on challenging real-world machine learning tasks, including adversarial training and code generation, highlighting its ability to produce more robust, high-quality solutions. Our results underscore EvoGrad2's potential as a powerful tool for researchers and practitioners facing complex, high-dimensional, and non-linear optimization problems.
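The core idea — fitting a local gradient from a first-order Taylor approximation while weighting samples by fitness rank, CMA-ES-style — can be sketched as below. This is a minimal illustration under assumed design choices (Gaussian sampling, log-rank recombination weights, a weighted least-squares linear fit); the function name and parameters are hypothetical and do not reflect the paper's actual implementation, which learns the gradient with a neural estimator and adds a Hessian-based second-order correction.

```python
import numpy as np

def evolutionary_weighted_gradient(f, x, n_samples=64, radius=0.1, seed=0):
    """Illustrative sketch: estimate grad f at x by fitting the first-order
    Taylor model f(y) - f(x) ~ g @ (y - x), with samples weighted by
    CMA-ES-style log-rank recombination weights so that lower-f (more
    promising) samples dominate the fit. Not the paper's API."""
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    deltas = rng.normal(scale=radius, size=(n_samples, d))  # perturbations
    df = np.array([f(x + delta) for delta in deltas]) - f(x)
    # CMA-ES-style recombination weights: best half of the samples get
    # positive log-rank weights, the rest get zero (minimization).
    order = np.argsort(df)
    mu = n_samples // 2
    w = np.zeros(n_samples)
    w[order[:mu]] = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()
    # Weighted least squares for g in: df ~ deltas @ g
    sw = np.sqrt(w)[:, None]
    g, *_ = np.linalg.lstsq(sw * deltas, sw[:, 0] * df, rcond=None)
    return g
```

For a linear objective the weighted fit recovers the gradient exactly; for non-linear objectives the rank weighting biases the estimate toward directions of descent, which is the qualitative effect the abstract describes.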

Subject: AAAI.2026 - Search and Optimization