
Randomized Fast Subspace Descent Methods

Long Chen, Xiaozhe Hu, and Huiwen Wu

Submitted

arXiv   Bibtex coming soon

ABSTRACT:

Randomized Fast Subspace Descent (RFASD) methods are developed and analyzed for smooth, unconstrained convex optimization problems. The efficiency of the method relies on a space decomposition that is stable in the $A$-norm and for which the condition number $\kappa_A$, measured in the $A$-norm, is small. At each iteration, a subspace is chosen randomly, either uniformly or with probability proportional to the local Lipschitz constants, and a preconditioned gradient descent step is applied in the chosen subspace. RFASD converges sublinearly for convex functions and linearly for strongly convex functions. Compared with randomized block coordinate descent methods, RFASD converges faster provided $\kappa_A$ is small and the subspace decomposition is $A$-stable. This improvement is supported by considering a multilevel space decomposition for Nesterov's `worst' problem.
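The random subspace selection and the per-subspace preconditioned step can be illustrated with a short sketch. The following Python snippet is a minimal sketch, assuming a quadratic objective $f(x) = \frac{1}{2}x^T A x - b^T x$, a simple coordinate-block space decomposition, and exact block-diagonal preconditioners; these concrete choices are illustrative assumptions, not the decompositions or preconditioners analyzed in the paper.

```python
import numpy as np

# Sketch of randomized subspace descent on f(x) = 0.5 x^T A x - b^T x.
# R^n is split into coordinate blocks (the subspaces); block i is sampled
# with probability proportional to its local Lipschitz constant L_i, and a
# preconditioned gradient step is taken in that block. The exact
# block-diagonal preconditioner below is an illustrative choice.

rng = np.random.default_rng(0)
n, num_blocks, num_iters = 64, 8, 2000

# Random symmetric positive definite A and right-hand side b.
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)

# Coordinate-block decomposition of R^n.
blocks = np.array_split(np.arange(n), num_blocks)

# Local Lipschitz constant of block i: the largest eigenvalue of the
# corresponding diagonal block of A; sample proportionally to L_i.
L = np.array([np.linalg.eigvalsh(A[np.ix_(B, B)]).max() for B in blocks])
probs = L / L.sum()

# Block preconditioners: inverses of the diagonal blocks of A.
P = [np.linalg.inv(A[np.ix_(B, B)]) for B in blocks]

x = np.zeros(n)
for _ in range(num_iters):
    i = rng.choice(num_blocks, p=probs)   # random subspace selection
    B = blocks[i]
    g = A[B] @ x - b[B]                   # gradient restricted to the subspace
    x[B] -= P[i] @ g                      # preconditioned descent step

print("final residual ||Ax - b||:", np.linalg.norm(A @ x - b))
```

Sampling blocks proportionally to $L_i$ biases the work toward the stiffer subspaces, which corresponds to the non-uniform selection rule described in the abstract; replacing `probs` with a uniform distribution recovers uniform sampling.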