Boosting learning algorithms and U-loss functions II


Shinto Eguchi & Osamu Komori 

2010-04-25 
09:30 - 10:30

Room 308, Mathematics Research Center Building (originally New Math. Bldg.)



The Kullback-Leibler divergence leads to the log and exponential losses, which are associated with LogitBoost and AdaBoost, respectively. We propose the U-loss function by applying U-divergence to the context of pattern recognition. Analogously, the U-loss function yields two types of boosting algorithms, called U-Boost. We examine their statistical performance, focusing on Bayes risk consistency. An SVM-type implementation is given for U-loss functions, with a few variants.
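For orientation, here is a hedged sketch (not taken from the abstract itself) of the losses involved. Writing the margin of a score function F at an example (x_i, y_i), with labels y_i in {-1, +1}, the exponential and log losses tied to AdaBoost and LogitBoost are commonly written as below; replacing exp by a general convex, increasing generator U gives one plausible form of the U-loss, though the talk's exact definition may include additional normalization terms.

\[
L_{\exp}(F) = \frac{1}{n}\sum_{i=1}^{n} \exp\bigl(-y_i F(x_i)\bigr) \quad \text{(AdaBoost)}
\]
\[
L_{\log}(F) = \frac{1}{n}\sum_{i=1}^{n} \log\bigl(1 + \exp(-2\,y_i F(x_i))\bigr) \quad \text{(LogitBoost)}
\]
\[
L_{U}(F) = \frac{1}{n}\sum_{i=1}^{n} U\bigl(-y_i F(x_i)\bigr) \quad \text{(generic U-loss, sketch)}
\]

In this sketch, taking \(U(t) = \exp(t)\) recovers the AdaBoost loss, so AdaBoost appears as one member of the U-Boost family.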