The Cost of Robustness: Tighter Bounds on Parameter Complexity for Robust Memorization in ReLU Nets

Authors: Yujun Kim, Chaewon Moon, Chulhee Yun

We study the parameter complexity of robust memorization for ReLU networks: the number of parameters required to interpolate any dataset with $\epsilon$-separation between differently labeled points while ensuring predictions remain consistent within a $\mu$-ball around each training example. We establish upper and lower bounds on the parameter count as a function of the robustness ratio $\rho = \mu / \epsilon$. Unlike prior work, we provide a fine-grained analysis across the entire range $\rho \in (0,1)$ and obtain upper and lower bounds that are tighter than existing results. Our findings reveal that the parameter complexity of robust memorization matches that of non-robust memorization when $\rho$ is small, but grows as $\rho$ increases. As a special case, when the input dimension is comparable to or exceeds the dataset size, our bounds become tight (up to logarithmic factors) across the entire range of $\rho$.
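To fix notation (ours, not necessarily the paper's): writing the dataset as $D = \{(x_i, y_i)\}_{i=1}^{N}$ and the network as $f_\theta$ with parameter vector $\theta$, one standard formalization of this setup, assuming the separation is normalized so that the full range $\rho \in (0,1)$ is feasible, is

$$\|x_i - x_j\| \ge 2\epsilon \ \text{ whenever } y_i \neq y_j, \qquad f_\theta(x) = y_i \ \text{ for all } \|x - x_i\| \le \mu, \ i \in [N], \qquad \rho = \frac{\mu}{\epsilon} \in (0,1).$$

Robust memorization then asks how small $\dim(\theta)$ can be while the middle condition holds for every such dataset. The $2\epsilon$ normalization is our assumption, not taken from the abstract: under it, the $\mu$-balls of differently labeled points are disjoint exactly when $\rho < 1$, which matches the stated range.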

Subject: NeurIPS.2025 - Poster