Variational optimization on Lie groups, with examples of leading (generalized) eigenvalue problems

M Tao, T Ohsawa - International Conference on Artificial Intelligence and Statistics, 2020 - proceedings.mlr.press
Abstract
The article considers smooth optimization of functions on Lie groups. By generalizing the NAG variational principle in vector spaces (Wibisono et al., 2016) to general Lie groups, continuous Lie-NAG dynamics that are guaranteed to converge to a local optimum are obtained. They correspond to momentum versions of gradient flow on Lie groups. The particular case of $\mathrm{SO}(n)$ is then studied in detail, with objective functions corresponding to leading generalized eigenvalue problems: the Lie-NAG dynamics are first made explicit in coordinates and then discretized in a structure-preserving fashion, resulting in optimization algorithms with faithful energy behavior (due to conformal symplecticity) whose iterates remain exactly on the Lie group. Stochastic-gradient versions are also investigated. Numerical experiments on both synthetic data and a practical problem (LDA for MNIST) demonstrate the effectiveness of the proposed methods as optimization algorithms (\emph{not} as a classification method).
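
The abstract describes momentum dynamics on a Lie group discretized so that the iterates stay exactly on the group. As a rough illustration of that general idea, and not of the paper's actual Lie-NAG discretization, the sketch below runs a heavy-ball-style update on $\mathrm{SO}(n)$ for the Brockett objective $\mathrm{tr}(\Lambda Q^\top A Q)$, a standard smooth formulation of the leading (ordinary, not generalized) eigenvalue problem; the velocity is a skew-symmetric matrix in the Lie algebra $\mathfrak{so}(n)$, and the matrix exponential keeps the iterate orthogonal. All function names and parameter values are illustrative assumptions.

    import numpy as np
    from scipy.linalg import expm

    def brockett_grad(Q, A, Lam):
        # Riemannian gradient of f(Q) = tr(Lam Q^T A Q), expressed as a
        # skew-symmetric matrix Omega so that the ascent direction is Q @ Omega.
        G = 2.0 * A @ Q @ Lam          # Euclidean gradient of f with respect to Q
        X = Q.T @ G
        return 0.5 * (X - X.T)         # projection onto so(n): skew-symmetric part

    def momentum_ascent_SOn(A, k, steps=5000, lr=1e-3, mu=0.9, seed=0):
        # Heavy-ball-style ascent on SO(n); NOT the paper's Lie-NAG integrator.
        n = A.shape[0]
        rng = np.random.default_rng(seed)
        Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal start
        Lam = np.diag(np.arange(n, 0, -1, dtype=float))    # weights single out leading directions
        V = np.zeros((n, n))                               # velocity in the Lie algebra so(n)
        for _ in range(steps):
            V = mu * V + lr * brockett_grad(Q, A, Lam)     # momentum accumulation in so(n)
            Q = Q @ expm(V)                                # exponential map: Q stays orthogonal
        return Q[:, :k]                                    # estimated leading eigenvectors of A

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        M = rng.standard_normal((6, 6))
        A = M + M.T                                        # symmetric test matrix
        U = momentum_ascent_SOn(A, k=2)
        print(np.diag(U.T @ A @ U))                        # approx. the two largest eigenvalues of A

The design point this mirrors from the abstract is that every update acts by group multiplication with the exponential of a Lie-algebra element, so orthogonality is preserved exactly rather than restored by re-projection; the paper's own discretization additionally preserves conformal symplecticity, which this sketch does not attempt.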