
On the VC dimension of deep group convolutional neural networks

Authors: Anna Sepliarskaia, Sophie Langer, Johannes Schmidt-Hieber

Recent works have introduced new equivariant neural networks, motivated by their improved generalization compared to traditional deep neural networks. While experiments support this advantage, the theoretical understanding of their generalization properties remains limited. In this paper, we analyze the generalization capabilities of Group Convolutional Neural Networks (GCNNs) with the ReLU activation function through the lens of Vapnik-Chervonenkis (VC) dimension theory. We investigate how architectural factors, such as the number of layers, the number of weights, and the input dimension, affect the VC dimension. A key challenge in our analysis is proving a lower bound on the VC dimension, for which we introduce new techniques that establish a novel connection between GCNNs and standard deep neural networks. Additionally, we compare our derived bounds to those known for fully connected neural networks. Our results extend previous findings on the VC dimension of continuous GCNNs with two layers, offering new insights into their generalization behavior, particularly its dependence on input resolution.
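For context, the quantity the abstract analyzes can be stated precisely. The following sketch gives the standard definition of the VC dimension and, for comparison, the nearly tight bounds known for fully connected ReLU networks (due to Bartlett, Harvey, Liaw, and Mehrabian, 2019); the constants $c_1, c_2$ are unspecified absolute constants, and none of this is taken from the paper itself:

```latex
% VC dimension of a hypothesis class \mathcal{H} of binary classifiers:
% the size of the largest point set that \mathcal{H} shatters.
\mathrm{VCdim}(\mathcal{H})
  = \max\bigl\{\, n : \exists\, x_1,\dots,x_n \ \text{such that}\
      \bigl|\{(h(x_1),\dots,h(x_n)) : h \in \mathcal{H}\}\bigr| = 2^n \,\bigr\}

% Known comparison point for fully connected ReLU networks
% with W weights and L layers (nearly tight upper and lower bounds):
c_1\, W L \log(W/L) \;\le\; \mathrm{VCdim} \;\le\; c_2\, W L \log W
```

The paper's contribution is the analogous analysis for GCNNs, where weight sharing induced by the group action changes how $W$, $L$, and the input resolution enter such bounds.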

Subject: NeurIPS.2025 - Poster