Abstract
We consider the binary supervised classification problem under the Gaussian functional model introduced in [7]. Taking advantage of the Gaussian structure, we design a natural plug-in classifier and derive a family of upper bounds on its worst-case excess risk over Sobolev spaces. These bounds are parametrized by a separation distance quantifying the difficulty of the problem, and are proved to be optimal (up to logarithmic factors) through matching minimax lower bounds. Using the recent works of [9] and [14], we also derive a logarithmic lower bound showing that the popular k-nearest neighbors classifier is far from optimal in this specific functional setting.
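To make the plug-in idea concrete, the following is a minimal, hypothetical sketch and not the estimator studied in the paper: it assumes curves observed on a regular grid, two classes differing only through their mean functions with equal covariance, a truncation to the first d Fourier coefficients, and a nearest-estimated-mean rule obtained by plugging empirical class means into the equal-covariance Bayes classifier. The function names fit_plugin and predict_plugin, the truncation level d, and the synthetic data are illustrative assumptions.

# Hypothetical plug-in classifier for two-class Gaussian functional data.
# NOT the authors' construction; a generic nearest-mean rule after Fourier truncation.
import numpy as np

def fit_plugin(X, y, d=20):
    """Estimate the first d Fourier coefficients of each class mean.

    X : (n_samples, n_grid) curves sampled on a regular grid
    y : (n_samples,) labels in {0, 1}
    d : truncation level (the smoothing parameter of this plug-in rule)
    """
    coeffs = np.fft.rfft(X, axis=1)[:, :d]   # truncated Fourier expansion of each curve
    mean0 = coeffs[y == 0].mean(axis=0)      # estimated coefficients of the class-0 mean
    mean1 = coeffs[y == 1].mean(axis=0)      # estimated coefficients of the class-1 mean
    return mean0, mean1, d

def predict_plugin(model, X_new):
    """Assign each new curve to the class whose estimated mean is nearer in L2."""
    mean0, mean1, d = model
    coeffs = np.fft.rfft(X_new, axis=1)[:, :d]
    dist0 = np.linalg.norm(coeffs - mean0, axis=1)
    dist1 = np.linalg.norm(coeffs - mean1, axis=1)
    return (dist1 < dist0).astype(int)

if __name__ == "__main__":
    # Toy usage: synthetic Gaussian curves whose class means are separated by `delta`.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 128)
    delta = 0.5                               # separation distance between the class means
    n = 200
    y = rng.integers(0, 2, size=n)
    means = np.where(y[:, None] == 0,
                     np.sin(2 * np.pi * t),
                     np.sin(2 * np.pi * t) + delta)
    X = means + rng.normal(scale=1.0, size=(n, t.size))
    model = fit_plugin(X[:150], y[:150], d=10)
    acc = (predict_plugin(model, X[150:]) == y[150:]).mean()
    print(f"held-out accuracy: {acc:.2f}")

Larger values of the separation delta make the toy problem easier, mirroring the role of the separation distance in the risk bounds described above.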
Replaced by
Sébastien Gadat, Sébastien Gerchinovitz, and Clément Marteau, “Optimal functional supervised classification with separation condition”, Bernoulli, vol. 26, n. 3, 2020, p. 1797–1831.
Reference
Sébastien Gadat, Sébastien Gerchinovitz, and Clément Marteau, “Optimal functional supervised classification with separation condition”, TSE Working Paper, n. 18-904, March 2018.
Published in
TSE Working Paper, n. 18-904, March 2018