lLGqtDcPag@OpenReview

Total: 1

#1 An Adaptive Orthogonal Convolution Scheme for Efficient and Flexible CNN Architectures

Authors: Thibaut Boissin, Franck Mamalet, Thomas Fel, Agustin Picard, Thomas Massena, Mathieu Serrurier

Orthogonal convolutional layers are valuable components in multiple areas of machine learning, such as adversarial robustness, normalizing flows, GANs, and Lipschitz-constrained models. Their ability to preserve norms and ensure stable gradient propagation makes them useful for a wide range of problems. Despite their promise, deploying orthogonal convolutions in large-scale applications remains a significant challenge due to computational overhead and limited support for modern features like strides, dilations, group convolutions, and transposed convolutions. In this paper, we introduce **AOC** (Adaptive Orthogonal Convolution), a scalable method that extends a previous construction (BCOP) and overcomes existing limitations in building orthogonal convolutions. This advancement unlocks architectures that were previously considered impractical. Our experiments demonstrate that the method produces expressive models that become increasingly efficient as they scale. To foster further advancement, we provide **Orthogonium**, an open-source Python package implementing this method.
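The norm-preservation property mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's AOC construction nor the Orthogonium API; it uses PyTorch's built-in orthogonal parametrization on a square dense layer as a stand-in for an orthogonal convolution, and checks that input norms are preserved (which is what keeps gradient propagation stable).

```python
# Minimal sketch (illustrative only, not the Orthogonium API): an orthogonal
# weight matrix W satisfies W^T W = I, so the layer preserves Euclidean norms.
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import orthogonal

torch.manual_seed(0)

# Square linear layer whose weight is constrained to be orthogonal.
layer = orthogonal(nn.Linear(64, 64, bias=False))

x = torch.randn(8, 64)
y = layer(x)

# Norm preservation: ||Wx|| == ||x|| for every input vector, which in turn
# keeps gradient norms stable during backpropagation.
print(torch.allclose(x.norm(dim=1), y.norm(dim=1), atol=1e-5))  # True
```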

Subject: ICML.2025 - Poster