ULAQ9GmJlo@OpenReview

#1 Natural Perturbations for Black-box Training of Neural Networks by Zeroth-Order Optimization

Authors: Hiroshi Sawada, Kazuo Aoyama, Yuya Hikima

This paper proposes a novel concept of natural perturbations for black-box training of neural networks by zeroth-order optimization. When a neural network is implemented directly in hardware, training its parameters by backpropagation yields inaccurate results because detailed internal information is unavailable. We instead employ zeroth-order optimization, in which the sampling of parameter perturbations is of great importance. Inheriting the concept of the natural gradient, our proposed sampling strategy maximizes the entropy of the perturbations under a regularization ensuring that the probability distribution conditioned on the neural network does not change drastically. Experimental results show the superiority of our proposal on diverse datasets, tasks, and architectures.
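
For readers unfamiliar with the framework, the sketch below illustrates generic two-point zeroth-order gradient estimation, the black-box setting the abstract builds on. The paper's natural-perturbation sampling (entropy maximization with a natural-gradient-style regularizer) is not detailed in the abstract, so plain Gaussian perturbations stand in for it here; all function and parameter names are illustrative, not the authors' implementation.

    import numpy as np

    def zeroth_order_step(params, loss_fn, rng, lr=0.01, mu=0.01, num_samples=20):
        """One parameter update using only black-box loss evaluations."""
        grad_est = np.zeros_like(params)
        for _ in range(num_samples):
            # Perturbation direction; the paper replaces this plain Gaussian
            # sampling with its entropy-maximizing "natural" scheme.
            u = rng.standard_normal(params.shape)
            # Two-point finite-difference estimate of the slope along u.
            delta = loss_fn(params + mu * u) - loss_fn(params - mu * u)
            grad_est += (delta / (2.0 * mu)) * u
        grad_est /= num_samples
        return params - lr * grad_est

    # Toy usage: minimize a quadratic treated as a black box.
    rng = np.random.default_rng(0)
    loss = lambda w: float(np.sum((w - 3.0) ** 2))
    w = np.zeros(5)
    for _ in range(300):
        w = zeroth_order_step(w, loss, rng)
    print(w)  # each entry approaches 3.0

Note that only forward evaluations of loss_fn are needed, which is what makes the approach applicable to hardware-implemented networks whose internals cannot be differentiated.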

Subject: ICML.2025 - Poster