0398lUtZqs@OpenReview

Total: 1

#1 Reproducing Kernel Banach Space Models for Neural Networks with Application to Rademacher Complexity Analysis

Authors: Alistair Shilton, Sunil Gupta, Santu Rana, Svetha Venkatesh

This paper explores the use of Hermite-transform-based reproducing kernel Banach space (RKBS) methods to construct exact, un-approximated models of feedforward neural networks of arbitrary width, depth, and topology, including ResNet and Transformer networks. The only assumptions are a feedforward topology, finite-energy activations, and finite (spectral-)norm weights and biases. Using this model, two straightforward but surprisingly tight bounds on Rademacher complexity are derived: (1) a general bound that is width-independent but scales exponentially with depth; and (2) a width- and depth-independent bound for networks whose weights and biases are constrained below a threshold.
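For context, the quantity being bounded is the empirical Rademacher complexity of the model class; the definition below is the standard one and the notation is not taken from the paper itself. For a function class $\mathcal{F}$ and a sample $S = \{x_1, \dots, x_n\}$,

\[
\hat{\mathfrak{R}}_S(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\!\left[ \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i \, f(x_i) \right],
\]

where the $\sigma_i$ are i.i.d. Rademacher variables taking values $\pm 1$ with equal probability. The abstract's two results control this quantity for the RKBS network model: bound (1) independently of width but with exponential dependence on depth, and bound (2) independently of both width and depth once the weight and bias norms fall below the stated threshold.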

Subject: NeurIPS.2025 - Poster