
#1 Neural Entropy

Author: Akhil Premkumar

We explore the connection between deep learning and information theory through the paradigm of diffusion models. A diffusion model converts noise into structured data by reinstating, imperfectly, the information that was erased when the data was diffused to noise. This information is stored in a neural network during training. We quantify this information by introducing a measure called neural entropy, which is related to the total entropy produced by diffusion. Neural entropy is a function not just of the data distribution, but also of the diffusive process itself. Measurements of neural entropy on a few simple image diffusion models reveal that they are extremely efficient at compressing large ensembles of structured data.

Subject: NeurIPS.2025 - Spotlight