2023.findings-acl.3@ACL

Total: 1

#1 Conformal Nucleus Sampling

Authors: Shauli Ravfogel; Yoav Goldberg; Jacob Goldberger

Language models generate text by successively sampling the next word. A decoding procedure based on nucleus (top-p) sampling chooses from the smallest possible set of words whose cumulative probability exceeds the value p. In this work, we assess whether a top-p set is indeed aligned with its probabilistic meaning in various linguistic contexts. We employ conformal prediction, a calibration procedure that focuses on the construction of minimal prediction sets according to a desired confidence level, to calibrate the parameter p as a function of the entropy of the next-word distribution. We find that OPT models are overconfident, and that calibration shows a moderate inverse scaling with model size.
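The abstract compresses two steps: building the top-p (nucleus) set for a next-word distribution, and choosing p per entropy bin so that the set covers the true next word at a desired confidence level. The sketch below is a hypothetical illustration of both steps under assumed choices (the p grid, quantile binning, and all function names are illustrative, not the paper's implementation).

```python
import numpy as np

def top_p_set(probs: np.ndarray, p: float) -> np.ndarray:
    """Indices of the smallest set of words whose cumulative probability exceeds p."""
    order = np.argsort(probs)[::-1]                      # most probable first
    cum = np.cumsum(probs[order])
    k = int(np.searchsorted(cum, p, side="right")) + 1   # smallest prefix with cum > p
    return order[:k]

def calibrate_p_by_entropy(prob_dists, true_next_ids, alpha=0.1, n_bins=5):
    """Conformal-style calibration sketch: bin held-out contexts by the entropy of the
    next-word distribution and, per bin, pick the smallest p whose nucleus contains the
    true next word in at least (1 - alpha) of the contexts."""
    entropies = np.array([-(d * np.log(d + 1e-12)).sum() for d in prob_dists])
    bin_edges = np.quantile(entropies, np.linspace(0, 1, n_bins + 1))
    p_grid = np.linspace(0.1, 1.0, 91)
    p_per_bin = []
    for b in range(n_bins):
        in_bin = (entropies >= bin_edges[b]) & (entropies <= bin_edges[b + 1])
        idx = np.where(in_bin)[0]
        chosen = 1.0                                     # fall back to the full vocabulary
        if len(idx) > 0:
            for p in p_grid:
                covered = np.mean([
                    true_next_ids[i] in top_p_set(prob_dists[i], p) for i in idx
                ])
                if covered >= 1 - alpha:
                    chosen = p
                    break
        p_per_bin.append(chosen)
    return bin_edges, p_per_bin
```

At decoding time, one would look up the entropy bin of the current next-word distribution and sample from the nucleus built with that bin's calibrated p; the finding that OPT models are overconfident corresponds to the calibrated p being larger than the nominal p in many bins.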