#1 Fast Zeroth-Order Convex Optimization with Quantum Gradient Methods

Authors: Junhyung Lyle Kim, Brandon Augustino, Dylan Herman, Enrico Fontana, Jacob Watkins, Marco Pistoia, Shouvanik Chakrabarti

We study quantum algorithms based on quantum (sub)gradient estimation using noisy function evaluation oracles, and demonstrate the first dimension-independent query complexities (up to poly-logarithmic factors) for zeroth-order convex optimization in both smooth and nonsmooth settings. Interestingly, using only noisy function evaluation oracles, we match the first-order query complexities of classical gradient descent, thereby exhibiting an exponential separation between quantum and classical zeroth-order optimization. We then generalize these algorithms to non-Euclidean settings by using quantum (sub)gradient estimation to instantiate mirror descent and its variants, including dual averaging and mirror prox. By leveraging a connection between semidefinite programming and eigenvalue optimization, we use our quantum mirror descent method to give a new quantum algorithm for solving semidefinite programs, linear programs, and zero-sum games. We identify a parameter regime in which our zero-sum game algorithm is faster than any existing classical or quantum approach.
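For readers unfamiliar with the mirror-descent template mentioned above, the following is a minimal classical sketch, not the paper's quantum algorithm: it runs entropic mirror descent over the probability simplex, with a central finite-difference gradient estimate standing in for the quantum (sub)gradient-estimation oracle. All function names, step sizes, and the toy objective are illustrative assumptions.

```python
# Illustrative classical sketch (NOT the paper's quantum method):
# entropic mirror descent on the simplex, driven by a finite-difference
# gradient estimate built from function evaluations only.
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Central finite-difference gradient estimate using only function evaluations."""
    d = x.size
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def entropic_mirror_descent(f, d, steps=200, eta=0.1):
    """Minimize f over the simplex via the update x_{t+1} ∝ x_t * exp(-eta * g_t)."""
    x = np.full(d, 1.0 / d)          # uniform starting point on the simplex
    best_x, best_val = x.copy(), f(x)
    for _ in range(steps):
        g = fd_gradient(f, x)        # zeroth-order (sub)gradient estimate
        x = x * np.exp(-eta * g)     # entropic (multiplicative-weights) step
        x /= x.sum()                 # closed-form Bregman projection onto the simplex
        val = f(x)
        if val < best_val:
            best_x, best_val = x.copy(), val
    return best_x, best_val

if __name__ == "__main__":
    # Toy convex objective on the simplex: squared distance to a fixed target.
    target = np.array([0.7, 0.2, 0.1])
    f = lambda x: 0.5 * np.sum((x - target) ** 2)
    x_star, val = entropic_mirror_descent(f, d=3)
    print(x_star, val)
```

Note that the finite-difference estimator above costs O(d) function evaluations per step; the abstract's point is that quantum gradient estimation avoids this dimension dependence (up to poly-logarithmic factors).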

Subject: NeurIPS.2025 - Poster