zgeoOFyIyb@OpenReview

Total: 1

#1 Enhancing Parallelism in Decentralized Stochastic Convex Optimization

Authors: Ofri Eisen, Ron Dorfman, Kfir Levy

Decentralized learning has emerged as a powerful approach for handling large datasets across multiple machines in a communication-efficient manner. However, such methods often face scalability limitations, as increasing the number of machines beyond a certain point negatively impacts convergence rates. In this work, we propose *Decentralized Anytime SGD*, a novel decentralized learning algorithm that significantly extends the critical parallelism threshold, enabling the effective use of more machines without compromising performance. Within the stochastic convex optimization (SCO) framework, we establish a theoretical upper bound on parallelism that surpasses the current state of the art, allowing larger networks to achieve favorable statistical guarantees and closing the gap with centralized learning in highly connected topologies.

Subject: ICML.2025 - Poster
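
The abstract names the algorithm but does not spell out its update rule. Below is a minimal, hedged sketch of how a decentralized, Anytime-style SGD step could look: each machine gossip-averages its iterate with its neighbors through a mixing matrix `W`, queries a stochastic gradient at a running average of its iterates (the "anytime" query point), and takes an SGD step. All names (`decentralized_anytime_sgd_sketch`, `grad_fn`, `W`, `eta`) and the exact averaging weights are illustrative assumptions, not the authors' specification.

```python
# Illustrative sketch only: gossip averaging + Anytime-SGD-style query points.
# This is NOT the paper's exact algorithm; all parameters are assumptions.
import numpy as np

def decentralized_anytime_sgd_sketch(grad_fn, W, w0, eta, T, rng):
    """grad_fn(i, x, rng) -> stochastic gradient of machine i's local loss at x.
    W: (M, M) doubly stochastic gossip matrix matching the network topology.
    w0: (d,) shared initial point; eta: step size; T: number of iterations."""
    M = W.shape[0]
    w = np.tile(w0, (M, 1))      # per-machine iterates, shape (M, d)
    x = w.copy()                 # per-machine averaged query points
    for t in range(1, T + 1):
        # Each machine queries its stochastic gradient at its averaged point x_i.
        g = np.stack([grad_fn(i, x[i], rng) for i in range(M)])
        # Gossip step: average iterates with neighbors, then take an SGD step.
        w = W @ w - eta * g
        # Anytime-style running (uniform) average of the gossiped iterates.
        x = (1.0 - 1.0 / t) * x + (1.0 / t) * w
    # Report the network-wide average of the query points.
    return x.mean(axis=0)

# Example usage: M machines, each with a noisy quadratic local objective.
rng = np.random.default_rng(0)
M, d = 4, 3
targets = rng.normal(size=(M, d))
grad_fn = lambda i, x, rng: (x - targets[i]) + 0.1 * rng.normal(size=d)
W = np.full((M, M), 1.0 / M)          # complete-graph (fully connected) gossip
x_hat = decentralized_anytime_sgd_sketch(grad_fn, W, np.zeros(d), 0.1, 500, rng)
```

In this toy setup the output should approach the minimizer of the average of the local objectives (the mean of `targets`); a sparser `W` would model the less-connected topologies whose parallelism limits the paper analyzes.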