First-Order Methods in Convex Optimization: From Discrete to Continuous and Vice-versa
Date:
Recently, the continuous-time approach has been widely used to study first-order methods (FOMs) for structured convex optimization problems, such as the Nesterov accelerated gradient method (NAG), the alternating direction method of multipliers (ADMM), and the primal-dual hybrid gradient method (PDHG). From this continuous point of view, the discrete iterative sequences correspond to the trajectories of certain ordinary differential equations (ODEs), and the use of inertia/momentum and gradient correction usually leads to second-order information in time and space. In this talk, we shall present a unified framework, ODE2OPT, which gives a systematic way to derive the continuous-time ODE of a given FOM and also provides an alternative route for designing and analyzing FOMs via Lyapunov functionals and tools from numerical analysis.
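As a concrete illustration of the discrete-to-continuous correspondence mentioned above (a minimal sketch, not the ODE2OPT framework itself): the NAG iteration is known to have the limiting ODE X''(t) + (3/t) X'(t) + ∇f(X(t)) = 0 in the sense of Su, Boyd, and Candès. The quadratic test problem, step sizes, and time horizon below are illustrative choices.

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 x^T A x, minimized at the origin.
A = np.diag([1.0, 4.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])

# --- Discrete NAG iteration (step size s <= 1/L, here L = 4) ---
s = 0.1
x_prev = y = x0.copy()
for k in range(1, 301):
    x_nag = y - s * grad(y)                            # gradient step
    y = x_nag + (k - 1) / (k + 2) * (x_nag - x_prev)   # momentum step
    x_prev = x_nag

# --- Forward-Euler integration of the limiting ODE, started at t = 1 ---
#     X''(t) = -(3/t) X'(t) - grad f(X(t)),  X(1) = x0,  X'(1) = 0
h, t = 0.01, 1.0
X, V = x0.copy(), np.zeros(2)
while t < 50.0:
    V += h * (-(3.0 / t) * V - grad(X))
    X += h * V
    t += h

# Both the discrete iterates and the ODE trajectory drive f toward 0.
print(f(x_nag), f(X))
```

Running the sketch shows both the iterates and the integrated trajectory decreasing the objective toward its minimum, which is the correspondence the talk makes systematic.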
