Talks

Boosting learning algorithm and U-loss functions I

Shinto Eguchi 

2010-04-24 
15:00 - 16:00 

Room 308, Mathematics Research Center Building (orig. New Math. Bldg.)



The Kullback-Leibler divergence leads to the log and exponential losses, which are associated with LogitBoost and AdaBoost, respectively. We propose the U-loss function by applying the U-divergence in the context of pattern recognition. Analogously, the U-loss function yields two types of boosting algorithms, called U-Boost. We examine their statistical performance, focusing on Bayes risk consistency. An SVM-type implementation is given by U-loss functions with a few variants.
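The correspondence between the two classical losses can be illustrated with a minimal sketch: the exponential loss is the surrogate minimized by AdaBoost, and the logistic (log) loss is the surrogate minimized by LogitBoost. This is only an illustration of the standard conventions (including the factor 2 in the logistic loss), not code from the talk, and U-Boost itself is not implemented here.

```python
import math

def exp_loss(y, f):
    # Exponential loss exp(-y f): the surrogate loss minimized by AdaBoost.
    # y is the label in {-1, +1}, f the real-valued classifier score (margin y*f).
    return math.exp(-y * f)

def log_loss(y, f):
    # Logistic loss log(1 + exp(-2 y f)): the surrogate minimized by LogitBoost
    # (the factor 2 follows the usual LogitBoost scaling convention).
    return math.log(1.0 + math.exp(-2.0 * y * f))

# Both losses decrease in the margin y*f and upper-bound a scaled 0-1 loss,
# which is the starting point for Bayes risk consistency arguments.
for margin in [-1.0, 0.0, 1.0, 2.0]:
    print(f"margin={margin:+.1f}  exp={exp_loss(1, margin):.4f}  "
          f"log={log_loss(1, margin):.4f}")
```

At zero margin both losses agree up to scale (exp gives 1, log gives log 2), and for large positive margins both vanish, with the exponential loss penalizing negative margins far more aggressively.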
