Seminar | Institute of Mathematical Sciences
Time: Friday, September 23rd, 2022, 15:30-16:45
Location: R408, IMS
Speaker: Yingzhou Li, Fudan University
Abstract: Coordinate descent methods are considered for eigenvalue problems, based on a reformulation of the leading eigenvalue problem as a nonconvex optimization problem. The convergence of several deterministic coordinate methods is analyzed and compared. We also analyze the global convergence of coordinate gradient descent with randomly chosen coordinates and stepsizes. Under generic assumptions, we prove that the iterates almost surely escape strict saddle points of the objective function. As a result, the algorithm is guaranteed to converge to a local minimum whenever all saddle points are strict. Numerical examples from quantum many-body problems demonstrate the efficiency of the proposed coordinate descent methods and provide benchmarks.
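
For readers unfamiliar with the setup, the following is a minimal NumPy sketch of randomized coordinate gradient descent on one standard nonconvex reformulation of the leading eigenvalue problem, f(x) = (1/4)||x x^T - A||_F^2 for symmetric A. It is an illustration under assumed choices, not the speaker's algorithm: the function name, stepsize rule, and iteration count are all hypothetical.

    import numpy as np

    def coordinate_gradient_descent(A, num_iters=200000, step=0.01, seed=None):
        """Randomized coordinate gradient descent for the leading eigenpair of a
        symmetric matrix A, via the nonconvex reformulation (an assumed choice)
            f(x) = (1/4) * ||x x^T - A||_F^2,
        whose global minimizers are x = +/- sqrt(lambda_max) * v_max when
        lambda_max > 0.  Partial derivative: df/dx_i = ||x||^2 * x_i - (A x)_i.
        """
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        x = rng.standard_normal(n)
        Ax = A @ x                   # maintained incrementally below
        sq = x @ x                   # ||x||^2, maintained incrementally below
        for _ in range(num_iters):
            i = rng.integers(n)                  # random coordinate
            eta = step * rng.uniform(0.5, 1.5)   # random stepsize
            g = sq * x[i] - Ax[i]                # partial derivative at coordinate i
            d = -eta * g
            # O(n) incremental updates of A x and ||x||^2 after x[i] += d
            Ax += A[:, i] * d
            sq += 2.0 * x[i] * d + d * d
            x[i] += d
        lam = (x @ (A @ x)) / (x @ x)            # Rayleigh quotient estimate
        return lam, x / np.linalg.norm(x)

    # Example: leading eigenvalue of a random symmetric matrix,
    # shifted so the top eigenvalue is positive
    A = np.random.default_rng(0).standard_normal((50, 50))
    A = (A + A.T) / 2 + 5 * np.eye(50)
    lam, v = coordinate_gradient_descent(A, seed=1)
    print(lam, np.max(np.linalg.eigvalsh(A)))

Because only one coordinate of x changes per step, the quantities A x and ||x||^2 can be updated in O(n) work, so a full sweep over coordinates costs about the same as one dense matrix-vector product; this per-coordinate cheapness is what makes coordinate methods attractive for large problems such as those arising in quantum many-body computations.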