
Total: 1

#1 Tight Asymptotics of Extreme Order Statistics

Authors: Matias Romero, Frederik Mallmann-Trenn, Jose Correa

A classic statistical problem is to study the asymptotic behavior of the order statistics of a large number of independent samples taken from a distribution with finite expectation. This behavior has implications for several core problems in machine learning and economics—including robust learning under adversarial noise, best-arm identification in bandit algorithms, revenue estimation in second-price auctions, and the analysis of tail-sensitive statistics used in out-of-distribution detection. The research question we tackle in this paper is: How large can the expectation of the $\ell$-th maximum of the $n$ samples be? For $\ell=1$, i.e., the maximum, this expectation is known to grow as $o(n)$, and this bound can be shown to be tight. We show that there is a sharp contrast when considering any fixed $\ell>1$. Surprisingly, in this case, the largest possible growth rate for all fixed $\ell>1$ is $O(\frac{n}{\log(n)\log\log(n)})$ and $\Omega(\frac{n}{\log(n)(\log\log(n))^{1.01}})$. Our result is in fact finer than these bounds and provides a sharp characterization of the largest achievable growth rate for the expectation of the $\ell$-th maximum of $n$ i.i.d. samples. Beyond the theoretical analysis, we support our findings with extensive simulations. These empirical results highlight a notable phenomenon: although the multiplicative gap between the maximum and the second maximum grows quickly with $n$, the ratio remains approximately constant in 99\% of trials. This suggests that while worst-case growth is sharp and meaningful, typical-case behavior may be significantly more stable.
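The empirical phenomenon described above can be probed with a small simulation. The sketch below is illustrative only and is not the authors' experimental setup: it uses an assumed heavy-tailed Pareto distribution with shape $\alpha = 1.1$ (finite expectation, since $\alpha > 1$) and records the ratio of the maximum to the second maximum across trials; the function names and parameters are hypothetical choices for this example.

```python
import random
import statistics

def pareto_sample(alpha: float) -> float:
    # Inverse-CDF sampling: X = U^(-1/alpha) is Pareto-distributed on [1, inf);
    # for alpha > 1 its expectation is finite (alpha / (alpha - 1)).
    return random.random() ** (-1.0 / alpha)

def max_ratio(n: int, alpha: float = 1.1) -> float:
    # Ratio of the maximum to the second maximum among n i.i.d. samples.
    xs = sorted(pareto_sample(alpha) for _ in range(n))
    return xs[-1] / xs[-2]

random.seed(0)
ratios = [max_ratio(10_000) for _ in range(200)]
typical = statistics.median(ratios)  # typical-case behavior across trials
spread = max(ratios)                 # occasional extreme trials dominate the mean
```

Comparing `typical` against `spread` gives a rough sense of the abstract's observation: most trials yield a moderate ratio, while a small fraction of trials produce very large gaps that drive the worst-case growth.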

Subject: NeurIPS.2025 - Poster