In this work, we consider the problem of learning a positive semidefinite matrix.
The critical issue is how to preserve positive semidefiniteness during the course
of learning. Our algorithm is mainly inspired by LPBoost and by the general
greedy convex optimization framework of Zhang. We demonstrate the essence
of the algorithm, termed PSDBoost (positive semidefinite Boosting), by focusing
on a few different applications in machine learning. The proposed PSDBoost
algorithm extends traditional Boosting algorithms in that its learned parameter is a
trace-one positive semidefinite matrix rather than a classifier. PSDBoost is
based on the observation that any trace-one positive semidefinite matrix can be decomposed
into a linear convex combination of trace-one rank-one matrices, which
serve as base learners of PSDBoost. Numerical experiments are presented.
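The decomposition underlying the algorithm can be illustrated with a small numerical sketch (not taken from the paper): the eigendecomposition of a trace-one positive semidefinite matrix expresses it as a convex combination of trace-one rank-one matrices, since the eigenvalues are nonnegative and sum to the trace.

```python
import numpy as np

# Build a random trace-one positive semidefinite matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T            # B B^T is positive semidefinite
A /= np.trace(A)       # normalize so trace(A) = 1

# Eigendecomposition: A = sum_i w_i v_i v_i^T.
# Each outer product v_i v_i^T is rank-one with trace ||v_i||^2 = 1,
# and the weights w_i (eigenvalues) are nonnegative and sum to trace(A) = 1,
# so the sum is a convex combination of trace-one rank-one matrices.
w, V = np.linalg.eigh(A)
assert np.all(w >= -1e-12)
assert np.isclose(w.sum(), 1.0)

# Reconstruct A from the rank-one terms to verify the decomposition.
A_rec = sum(wi * np.outer(vi, vi) for wi, vi in zip(w, V.T))
assert np.allclose(A, A_rec)
```

In PSDBoost these rank-one matrices play the role that weak classifiers play in traditional Boosting, with the convex weights accumulated iteratively rather than obtained from a single eigendecomposition.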