Multiclass Losses and Multidistribution Divergences
Bob Williamson (NICTA)
NICTA SML SEMINAR
DATE: 2012-11-29
TIME: 11:15:00 - 12:00:00
LOCATION: NICTA - 7 London Circuit
ABSTRACT:
Binary prediction problems (and their associated loss functions) are perhaps the simplest machine learning problems and have been extensively studied. Similarly, divergence measures between two probability distributions are well understood, for example the classical Csiszar f-divergences. There is a natural bridge between binary proper losses and f-divergences via the Bayes risk of the binary learning problem induced by the loss. Multiclass prediction problems and multiclass loss functions are less well understood. It is not even clear (at first) what a "divergence" between k distributions means when k > 2. In this talk I will show how the binary "bridge" extends to the multiclass case and allows simple proofs of properties of multidistribution f-divergences analogous to those satisfied by the classical f-divergences. I will also outline the theory of composite multiclass losses, which are the composition of a proper loss with a link function, including a characterisation of when they are convex. (Joint work with Dario Garcia-Garcia, Mark Reid, and Elodie Vernet)
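
For concreteness, here is a minimal numerical sketch (not from the talk itself) of the binary "bridge" mentioned in the abstract, under the assumption of finite discrete distributions: with prior 1/2 and the log loss, the drop in Bayes risk (statistical information) between two distributions P and Q coincides with the Jensen-Shannon divergence, one instance of the proper-loss / f-divergence correspondence. Function names and the example distributions are illustrative.

import numpy as np

def bayes_risk_log_loss(eta):
    # Conditional Bayes risk of the log loss: the binary Shannon entropy of eta.
    eta = np.clip(eta, 1e-12, 1 - 1e-12)
    return -eta * np.log(eta) - (1 - eta) * np.log(1 - eta)

def statistical_information(P, Q, pi=0.5):
    # Drop in Bayes risk from the prior pi to the posterior eta(x) = pi P(x) / M(x),
    # where M is the marginal over observations.
    M = pi * P + (1 - pi) * Q
    eta = pi * P / M
    return bayes_risk_log_loss(pi) - np.sum(M * bayes_risk_log_loss(eta))

def jensen_shannon(P, Q):
    # The f-divergence obtained from the log loss at pi = 1/2
    # (assumes strictly positive probabilities).
    M = 0.5 * (P + Q)
    kl = lambda A, B: np.sum(A * np.log(A / B))
    return 0.5 * kl(P, M) + 0.5 * kl(Q, M)

P = np.array([0.7, 0.2, 0.1])
Q = np.array([0.1, 0.3, 0.6])
print(statistical_information(P, Q))  # agrees with the line below up to floating point
print(jensen_shannon(P, Q))

Other proper losses give other classical f-divergences in the same way (e.g. 0-1 loss corresponds to variational divergence); the talk concerns how this correspondence extends to k > 2 distributions.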
BIO:
