IGKluZvI6J@OpenReview

#1 Nonparametric Quantile Regression with ReLU-Activated Recurrent Neural Networks

Authors: Hang Yu, Lyumin Wu, Wenxin Zhou, Zhao Ren

This paper investigates nonparametric quantile regression using recurrent neural networks (RNNs) and sparse recurrent neural networks (SRNNs) to approximate the conditional quantile function, which is assumed to follow a compositional hierarchical interaction model. We show that RNN- and SRNN-based estimators with rectified linear unit (ReLU) activation and appropriately designed architectures achieve the optimal nonparametric convergence rate, up to a logarithmic factor, under stationary, exponentially $\boldsymbol{\beta}$-mixing processes. To establish this result, we derive sharp approximation error bounds for functions in the hierarchical interaction model using RNNs and SRNNs, exploiting their close connection to sparse feedforward neural networks (SFNNs). Numerical experiments and an empirical study on the Dow Jones Industrial Average (DJIA) further support our theoretical findings.
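To make the estimation target concrete: nonparametric quantile regression is built on the check (pinball) loss, whose population minimizer is the conditional quantile. The sketch below, a minimal illustration not taken from the paper, shows that minimizing the average pinball loss over a sample recovers the empirical $\tau$-quantile (the grid search stands in for the neural-network fit the authors study).

```python
def pinball_loss(u, tau):
    """Check loss rho_tau(u) = u * (tau - 1{u < 0}); minimized in
    expectation at the tau-th conditional quantile."""
    return u * (tau - (1.0 if u < 0 else 0.0))


def quantile_by_loss_minimization(ys, tau, grid):
    """Return the grid point minimizing average pinball loss over the
    sample -- a stand-in for the RNN/SRNN estimator, which minimizes
    the same empirical risk over a neural-network function class."""
    return min(grid, key=lambda q: sum(pinball_loss(y - q, tau) for y in ys))


ys = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
grid = [0.5 * i for i in range(21)]  # candidate quantile values 0.0 .. 10.0

median_hat = quantile_by_loss_minimization(ys, 0.5, grid)
upper_hat = quantile_by_loss_minimization(ys, 0.9, grid)
```

For this sample, any value in [5, 6] minimizes the tau = 0.5 loss (the median interval), and any value in [9, 10] minimizes the tau = 0.9 loss; the RNN-based estimators in the paper replace the grid with a ReLU network class and the i.i.d. sample with a stationary, exponentially beta-mixing sequence.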

Subject: NeurIPS.2025 - Poster