Talks

Statistical Methodology


Online Learning with Matrix Exponentiated Gradient Updates
Dr. Hsin-Hsiung Huang (Institute of Statistical Science, Academia Sinica)

2008-10-03 (Fri.)
13:30 - 15:00
404, Freshman Classroom Building

Tsuda, Rätsch and Warmuth (2006) address the problem of learning a symmetric positive definite matrix. They offer kernelized updates based on matrix logarithms and matrix exponentials, and these updates preserve symmetry and positive definiteness. On the other hand, Vishwanathan, Schraudolph and Smola (2006) provide an online support vector machine (SVM) that uses the stochastic meta-descent (SMD) algorithm to adapt its step size automatically. Based on their method, we derive updates that perform step-size adaptation for kernel principal component analysis (PCA). The resulting online kernel PCA extends the online SVM framework to other loss functions; its gradient trace parameter is no longer a coefficient vector but an element of the RKHS.
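
The matrix exponentiated gradient step referred to in the abstract can be sketched in a few lines. The code below is a minimal illustration of that general technique, not the speaker's implementation; the helper name matrix_eg_update, the learning rate eta, and the trace-one normalization are assumptions chosen for the example.

import numpy as np
from scipy.linalg import expm, logm

def matrix_eg_update(W, grad, eta=0.1):
    # One matrix exponentiated gradient step (illustrative sketch):
    #   W <- exp(log W - eta * sym(grad)) / trace(...)
    # The matrix exponential of a symmetric matrix is symmetric positive
    # definite, so the update preserves both properties.
    sym_grad = 0.5 * (grad + grad.T)        # symmetrize the gradient
    W_log = np.real(logm(W))                # matrix logarithm of an SPD matrix
    W_new = expm(W_log - eta * sym_grad)    # exponentiated gradient step
    return W_new / np.trace(W_new)          # keep the trace equal to one

# Toy usage: the iterate stays symmetric positive definite under noisy gradients.
rng = np.random.default_rng(0)
W = np.eye(3) / 3.0
for _ in range(5):
    W = matrix_eg_update(W, rng.standard_normal((3, 3)), eta=0.05)
print(np.allclose(W, W.T), np.all(np.linalg.eigvalsh(W) > 0))

The same log/exp construction is what keeps the kernelized updates in the talk symmetric and positive definite; only the gradient and normalization differ.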



Time: November 20, 2007 - July 31, 2009
Room:
Organizer: Hung Chen (Department of Mathematics, National Taiwan University)

Available Talk List

2007-11-20 (Tue.)
2007-12-04 (Tue.)
2008-01-08 (Tue.)
2008-01-15 (Tue.)
2008-02-19 (Tue.)
2008-03-04 (Tue.)
2008-03-18 (Tue.)
2008-03-21 (Fri.)
2008-03-21 (Fri.)
2008-04-15 (Tue.)
2008-04-15 (Tue.)
2008-04-29 (Tue.)
2008-05-27 (Tue.)
2008-06-03 (Tue.)
2008-06-10 (Tue.)
2008-06-17 (Tue.)
2008-07-15 (Tue.)
2008-07-29 (Tue.)
2008-08-12 (Tue.)
2008-08-26 (Tue.)
2008-09-09 (Tue.)
2008-09-19 (Fri.)
2008-09-26 (Fri.)
2008-10-03 (Fri.)
2008-10-24 (Fri.)
2008-10-31 (Fri.)
2008-11-07 (Fri.)
2008-11-21 (Fri.)
2008-12-05 (Fri.)
2008-12-17 (Wed.)
2008-12-26 (Fri.)
2009-01-07 (Wed.)
2009-02-20 (Fri.)
2009-03-06 (Fri.)
2009-03-11 (Wed.)
2009-03-20 (Fri.)
2009-04-10 (Fri.)
2009-04-13 (Mon.)
2009-04-24 (Fri.)
2009-04-29 (Wed.)
2009-05-08 (Fri.)
2009-05-22 (Fri.)
2009-05-27 (Wed.)
2009-06-10 (Wed.)
2009-06-12 (Fri.)
2009-09-25 (Fri.)