10206@AAAI

Total: 1

#1 Noise-Adaptive Margin-Based Active Learning and Lower Bounds under Tsybakov Noise Condition

Authors: Yining Wang; Aarti Singh

We present a simple noise-robust margin-based active learning algorithm to find homogeneous (passing through the origin) linear separators and analyze its error convergence when labels are corrupted by noise. We show that when the imposed noise satisfies the Tsybakov low noise condition (Mammen, Tsybakov, and others 1999; Tsybakov 2004), the algorithm is able to adapt to the unknown level of noise and achieves the optimal statistical rate up to polylogarithmic factors.

We also derive lower bounds for margin-based active learning algorithms under Tsybakov noise conditions (TNC) for the membership query synthesis scenario (Angluin 1988). Our result implies lower bounds for the stream-based selective sampling scenario (Cohn 1990) under TNC for some fairly simple data distributions. Quite surprisingly, we show that the sample complexity cannot be improved even if the underlying data distribution is as simple as the uniform distribution on the unit ball. Our proof involves the construction of a well-separated hypothesis set on the d-dimensional unit ball along with carefully designed label distributions for the Tsybakov noise condition. Our analysis might provide insights for other forms of lower bounds as well.
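
For reference, one commonly used statement of the Tsybakov low-noise (margin) condition is the following; the exact parameterization used in the paper may differ. Writing $\eta(x) = \Pr(Y = 1 \mid X = x)$ for the regression function, the condition requires that for some constants $C > 0$ and $\beta > 0$,

$$ \Pr_{X \sim P_X}\bigl( \lvert \eta(X) - \tfrac{1}{2} \rvert \le t \bigr) \;\le\; C\, t^{\beta} \qquad \text{for all } t > 0. $$

Larger $\beta$ means less probability mass near the decision boundary $\eta(x) = 1/2$, i.e. a more benign noise regime.

As a rough illustration of the margin-based active learning scheme the abstract refers to, the sketch below queries labels only inside a shrinking band around the current hyperplane estimate and refits a through-the-origin separator on those points. This is a hypothetical simplification, not the authors' algorithm: the fixed geometric margin schedule and the averaging update stand in for the paper's noise-adaptive choices, and all function and parameter names here are invented for the example.

```python
import numpy as np

def margin_based_active_learning(pool, label_oracle, n_rounds=10,
                                 labels_per_round=50, shrink=0.5, seed=None):
    """Toy margin-based active learner for a homogeneous linear separator.

    pool         : (n, d) array of unlabelled points.
    label_oracle : callable mapping a point x to a (possibly noisy) label in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(pool.shape[1])
    w /= np.linalg.norm(w)            # initial unit-norm guess of the separator
    margin = 1.0                      # half-width of the query band around the hyperplane
    for _ in range(n_rounds):
        # Points close to the current hyperplane are the informative ones;
        # restrict label queries to that band.
        band = pool[np.abs(pool @ w) <= margin]
        if band.shape[0] == 0:
            break
        idx = rng.choice(band.shape[0],
                         size=min(labels_per_round, band.shape[0]),
                         replace=False)
        X = band[idx]
        y = np.array([label_oracle(x) for x in X])
        # Refit a through-the-origin separator on the freshly labelled band.
        # A simple label-weighted average is used purely for illustration; the
        # paper's update and its noise-adaptive margin schedule differ.
        w_new = X.T @ y
        if np.linalg.norm(w_new) > 0:
            w = w_new / np.linalg.norm(w_new)
        margin *= shrink              # shrink the query band between rounds
    return w
```

On synthetic data one could draw the pool uniformly from the unit ball, let label_oracle return the sign of ⟨w*, x⟩ flipped with a probability that shrinks with the distance |⟨w*, x⟩| to the boundary (a TNC-style noise model), and check that the angle between the returned w and w* decreases over rounds.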