# On algorithm design for constrained optimization problems in machine learning

## Speaker:

## Speaker Link:

## Institution:

## Time:

## Host:

## Location:

In this talk, I will focus on the resolution of two important subclasses of constrained optimization: bound-constrained problems and linear programming. These are motivated by popular machine learning topics, including nonnegative matrix factorization and optimal transport (OT). For the former subclass, I will introduce a two-metric projection method that effectively exploits the Hessian information of the objective function. This method inspires several algorithms, including a projected Newton-CG method equipped with optimal worst-case complexity guarantees and an adaptive two-metric projection method designed to handle l1-norm regularization. For the linear programming formulation of OT, I will discuss random block coordinate descent (RBCD) methods. A direct advantage of these methods is their reduced memory footprint, and we demonstrate their efficiency through comparisons with competitors, including the classical Sinkhorn algorithm.
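As background for the OT comparison mentioned above, here is a minimal sketch of the classical Sinkhorn algorithm for entropic-regularized optimal transport (this is the baseline competitor, not the speaker's RBCD method; the marginals, cost matrix, and parameter values below are illustrative assumptions):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, n_iter=1000):
    """Entropic-regularized OT via Sinkhorn iterations.

    a, b : source/target marginals (nonnegative, each summing to 1)
    C    : cost matrix of shape (len(a), len(b))
    eps  : entropic regularization strength (illustrative value)
    Returns a transport plan P with row sums a and column sums ~= b.
    """
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)             # scale to match column marginals
        u = a / (K @ v)               # scale to match row marginals
    return u[:, None] * K * v[None, :]

# Toy example: transport between two 3-point distributions
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.4, 0.4, 0.2])
C = np.abs(np.arange(3)[:, None] - np.arange(3)[None, :]).astype(float)
P = sinkhorn(a, b, C)
```

Note that Sinkhorn stores the full dense kernel `K`, which is quadratic in the problem size; avoiding this memory cost is exactly the advantage claimed for the RBCD methods.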