Posts by Collection

portfolio

First order optimization methods based on Hessian-driven Nesterov accelerated gradient flow

A novel dynamical inertial Newton system, called the Hessian-driven Nesterov accelerated gradient (H-NAG) flow, is proposed. Convergence of the continuous trajectory is established via a tailored Lyapunov function, and new first-order accelerated optimization methods are derived from ODE solvers. It is shown that (semi-)implicit schemes always achieve linear rates, while explicit schemes attain the optimal (accelerated) rates for convex and strongly convex objectives. In particular, Nesterov’s optimal method is recovered from an explicit scheme for the H-NAG flow. Furthermore, accelerated splitting algorithms for composite optimization problems are also developed.

First order optimization methods based on Hessian-driven Nesterov accelerated gradient flow

Published:


Recommended citation:
Long, Chen and Hao, Luo. (2019). "First order optimization methods based on Hessian-driven Nesterov accelerated gradient flow" arXiv:1912.09276.
Download Paper
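As a concrete illustration of the explicit-scheme connection mentioned above, here is a minimal textbook sketch of Nesterov's accelerated gradient method; the function name, step count, and toy quadratic are illustrative assumptions, not the paper's H-NAG discretization.

```python
import numpy as np

def nesterov_agd(grad, x0, L, steps=500):
    """Textbook Nesterov accelerated gradient for convex f with
    L-Lipschitz gradient (illustrative; not the H-NAG scheme)."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(steps):
        y = x + (k / (k + 3)) * (x - x_prev)  # momentum (extrapolation) step
        x_prev = x
        x = y - grad(y) / L                   # gradient step at the extrapolated point
    return x

# Toy strongly convex quadratic f(x) = 0.5 * x^T A x, minimizer x* = 0
A = np.diag([1.0, 10.0])
x_out = nesterov_agd(lambda x: A @ x, np.array([5.0, -3.0]), L=10.0)
```

The momentum coefficient (k-1)/(k+2) form used here is the standard choice for the convex case; for strongly convex objectives a constant coefficient is typical.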

publications

Optimal error estimates of a time-spectral method for fractional diffusion problems with low regularity data

Journal of Scientific Computing, 2022

This paper is devoted to the error analysis of a time-spectral algorithm for fractional diffusion problems of order α (0 < α < 1). The solution regularity in the Sobolev space is revisited, and new regularity results in the Besov space are established. A time-spectral algorithm is developed which adopts a standard spectral method and a conforming linear finite element method for the temporal and spatial discretizations, respectively. Optimal error estimates are derived for nonsmooth data. In particular, a sharp temporal convergence rate of 1 + 2α is shown theoretically and numerically.

Recommended citation:
Hao, Luo and Xiaoping, Xie. (2022). "Optimal error estimates of a time-spectral method for fractional diffusion problems with low regularity data" J. Sci. Comput. 91(14).
Download Paper
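The observed temporal order in experiments of this kind is typically estimated from errors at two step sizes, assuming e(h) ≈ C·h^p. A minimal sketch of that standard estimate, with manufactured errors rather than data from the paper:

```python
import math

def observed_order(err_coarse, err_fine, h_coarse, h_fine):
    """Estimate the convergence order p from errors err ≈ C * h**p
    measured at two step sizes h_coarse > h_fine."""
    return math.log(err_coarse / err_fine) / math.log(h_coarse / h_fine)

# Manufactured errors with exact order p = 2.2 (i.e. 1 + 2α for α = 0.6)
p = observed_order(0.1**2.2, 0.05**2.2, 0.1, 0.05)
```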

A primal-dual flow for affine constrained convex optimization

ESAIM: Control, Optimisation and Calculus of Variations, 2022

We introduce a novel primal-dual flow for affine constrained convex optimization problems. As a modification of the standard saddle-point system, our flow model is proved to possess an exponential decay property with respect to a tailored Lyapunov function. Two primal-dual methods are then obtained from numerical discretizations of the continuous problem, and a global nonergodic linear convergence rate is established via a discrete Lyapunov function. Instead of solving the subproblem of the primal variable, we apply the semi-smooth Newton iteration to the inner problem with respect to the multiplier, provided that there are some additional properties such as semi-smoothness and sparsity. Finally, numerical tests on the linearly constrained l1-l2 minimization and the total-variation based image denoising model are provided.

Recommended citation:
Hao, Luo. (2022). "A primal-dual flow for affine constrained convex optimization" ESAIM Control Optim. Calc. Var. 28.
Download Paper
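For context, the unmodified saddle-point flow that the paper's model departs from can be discretized by explicit Euler as below. This is only a generic baseline sketch on an illustrative toy problem, not the paper's primal-dual flow or either of its methods.

```python
import numpy as np

# Classical saddle-point flow for min f(x) s.t. Ax = b:
#   x' = -grad_f(x) - A^T lam,   lam' = Ax - b,
# discretized by explicit Euler. The paper modifies this system to
# obtain exponential decay; this is only the unmodified starting point.

def saddle_point_flow(grad_f, A, b, x0, lam0, h=0.05, steps=4000):
    x, lam = x0.copy(), lam0.copy()
    for _ in range(steps):
        dx = -grad_f(x) - A.T @ lam   # primal descent direction
        dlam = A @ x - b              # dual ascent direction (feasibility residual)
        x = x + h * dx
        lam = lam + h * dlam
    return x, lam

# Toy problem: min 0.5||x||^2 s.t. x1 + x2 = 1, with solution x = (0.5, 0.5)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, lam = saddle_point_flow(lambda x: x, A, b, np.zeros(2), np.zeros(1))
```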

From differential equation solvers to accelerated first-order methods for convex optimization

Mathematical Programming, 2022

A convergence analysis of accelerated first-order methods for convex optimization problems is developed from the point of view of ordinary differential equation solvers. A new dynamical system, called the Nesterov accelerated gradient (NAG) flow, is derived from the connection between the acceleration mechanism and the A-stability of ODE solvers, and the exponential decay of a tailored Lyapunov function along the solution trajectory is proved. Numerical discretizations of the NAG flow are then considered, and convergence rates are established via a discrete Lyapunov function. The proposed differential equation solver approach not only covers existing accelerated methods, such as FISTA, Güler’s proximal algorithm and Nesterov’s accelerated gradient method, but also produces new algorithms for composite convex optimization that possess accelerated convergence rates. Both the convex and the strongly convex cases are handled in a unified way in our approach.

Recommended citation:
Hao, Luo and Long, Chen. (2022). "From differential equation solvers to accelerated first-order methods for convex optimization" Math. Program. 195.
Download Paper
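FISTA, one of the existing methods the abstract mentions, can be sketched in its standard Beck–Teboulle form; the toy LASSO instance and the names below are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

def fista(grad_f, prox_g, x0, L, steps=200):
    """Standard FISTA for composite min f(x) + g(x): a forward-backward
    step plus Nesterov-type extrapolation (textbook form)."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(steps):
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)   # forward-backward step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2     # momentum parameter update
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # extrapolation
        x, t = x_new, t_new
    return x

# Toy LASSO: min 0.5||x - c||^2 + mu*||x||_1; the prox is soft-thresholding
soft = lambda z, s: np.sign(z) * np.maximum(np.abs(z) - s, 0.0)
c, mu = np.array([2.0, 0.3, -1.5]), 0.5
x = fista(lambda v: v - c, lambda z, s: soft(z, mu * s), np.zeros(3), L=1.0)
```

For this separable toy problem the minimizer is the soft-thresholded vector soft(c, mu).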

A universal accelerated primal–dual method for convex optimization problems

Journal of Optimization Theory and Applications, 2024

This work presents a universal accelerated primal–dual method for affinely constrained convex optimization problems. It can handle both Lipschitz and Hölder gradients and does not need to know the smoothness level of the objective function. In the line search part, it uses dynamically decreasing parameters and produces approximate Lipschitz constants of moderate magnitude. In addition, based on a suitable discrete Lyapunov function and tight decay estimates of some differential/difference inequalities, a universal optimal mixed-type convergence rate is established. Numerical tests are provided to confirm the efficiency of the proposed method.

Recommended citation:
Hao, Luo. (2024). "A universal accelerated primal–dual method for convex optimization problems" J. Optim. Theory Appl. 201(1).
Download Paper
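A generic backtracking step illustrates how a local Lipschitz estimate can be produced on the fly. This simple doubling scheme is only a baseline sketch; the paper's line search instead lets its parameters decrease dynamically to keep the estimates of moderate magnitude.

```python
import numpy as np

def backtracking_gradient_step(f, grad_f, x, L0=1.0, rho=2.0):
    """One gradient step with backtracking: increase the trial Lipschitz
    estimate L until the standard descent inequality
        f(x - g/L) <= f(x) - ||g||^2 / (2L)
    holds (generic textbook scheme, not the paper's line search)."""
    g = grad_f(x)
    L = L0
    while f(x - g / L) > f(x) - (g @ g) / (2 * L):
        L *= rho   # trial step too long: raise the local Lipschitz estimate
    return x - g / L, L

# Quadratic with curvature 8: backtracking from L0 = 1 settles at L = 8
f = lambda x: 4.0 * (x @ x)
grad_f = lambda x: 8.0 * x
x1, L = backtracking_gradient_step(f, grad_f, np.array([1.0, 2.0]))
```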

A unified differential equation solver approach for separable convex optimization: Splitting, acceleration and nonergodic rate

Mathematics of Computation, 2025

This paper provides a self-contained ordinary differential equation solver approach for separable convex optimization problems. A novel primal-dual dynamical system with built-in time rescaling factors is introduced, and the exponential decay of a tailored Lyapunov function is established. Several time discretizations of the continuous model are then considered and analyzed via a unified discrete Lyapunov function. Moreover, two families of accelerated proximal alternating direction methods of multipliers are obtained, and nonergodic optimal mixed-type convergence rates are proved for the primal objective residual, the feasibility violation and the Lagrangian gap. Finally, numerical experiments are provided to validate the practical performance.

Recommended citation:
Hao, Luo and Zihang, Zhang. (2025). "A unified differential equation solver approach for separable convex optimization: Splitting, acceleration and nonergodic rate" Math. Comp. 94(352).
Download Paper | Download Slides
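For reference, vanilla (non-accelerated) scaled ADMM on a toy separable problem looks as follows; this baseline sketch is not the accelerated proximal ADMM family derived in the paper.

```python
import numpy as np

def admm_lasso(c, mu, rho=1.0, steps=200):
    """Vanilla scaled ADMM for min 0.5||x - c||^2 + mu*||z||_1 s.t. x = z.
    Baseline only: the paper derives *accelerated* proximal ADMM variants
    with nonergodic rates from its dynamical system."""
    n = len(c)
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)  # u: scaled multiplier
    soft = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s, 0.0)
    for _ in range(steps):
        x = (c + rho * (z - u)) / (1 + rho)  # x-update: small quadratic solve
        z = soft(x + u, mu / rho)            # z-update: prox of mu*||.||_1
        u = u + x - z                        # scaled dual (multiplier) update
    return z

z = admm_lasso(np.array([2.0, 0.3, -1.5]), mu=0.5)
```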

talks

Efficient primal-dual algorithms for optimal transport-like problems

Published:

In this report, we focus on efficient algorithms for solving a large class of optimal transport-like problems. We first introduce some primal-dual flow models and equip them with proper Lyapunov functions that possess an exponential decay property. Then, based on time discretizations of the continuous dynamics, we obtain new primal-dual methods and prove nonergodic ((super-)linear or sublinear) convergence rates. Besides, by exploiting the special structure, the inner problems of the proposed methods either admit closed-form solutions or can be solved by the semi-smooth Newton iteration with an algebraic multigrid method. Moreover, numerical experiments are provided to validate the efficiency of our algorithms.

Download PDF

Recommended citation:
Hao, Luo. (2024). "Effcient primal-dual algorithms for optimal transport-like problems " Technical Report.
Download Slides
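As a point of comparison for optimal transport-like problems, the standard entropic-regularization baseline (the Sinkhorn iteration) can be sketched as follows; it is distinct from the primal-dual and semi-smooth Newton solvers discussed in the talk.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, iters=500):
    """Sinkhorn iteration for entropy-regularized optimal transport:
    alternately rescales K = exp(-C/eps) so the transport plan matches
    the marginals a and b. Standard baseline, not the talk's methods."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)   # rescale to match the column marginal
        u = a / (K @ v)     # rescale to match the row marginal
    return u[:, None] * K * v[None, :]   # transport plan

# Tiny symmetric instance: uniform marginals, 0/1 cost
a = np.array([0.5, 0.5]); b = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
P = sinkhorn(a, b, C)
```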

First-Order Methods in Convex Optimization: From Discrete to Continuous and Vice-versa

Published:

Recently, the continuous-time approach has been widely used to study first-order methods (FOMs) for structured convex optimization problems, such as the Nesterov accelerated gradient (NAG) method, the alternating direction method of multipliers (ADMM) and the primal-dual hybrid gradient (PDHG) method. From the continuous point of view, the discrete iterative sequences correspond to the trajectories of certain ordinary differential equations (ODEs), and the use of inertia/momentum and gradient correction usually leads to second-order information in time and space. In this talk, we present a unified ODE2OPT framework, which gives a systematic way to derive the continuous-time ODE of a given FOM and also provides an alternative way to design and analyze FOMs using the tools of Lyapunov functionals and numerical analysis.

Recommended citation:
Hao, Luo. (2024). "First-Order Methods in Convex Optimization: From Discrete to Continuous and Vice-versa" Technical Report.
Download Slides
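The simplest instance of this discrete-continuous correspondence is that explicit Euler applied to the gradient flow x' = -grad f(x) reproduces gradient descent; a minimal sketch with an illustrative quadratic:

```python
import numpy as np

# Explicit Euler on the gradient flow x' = -grad_f(x) with step h is
# exactly gradient descent with step size h. Momentum methods such as
# NAG correspond in the same way to second-order-in-time ODEs.

def euler_gradient_flow(grad_f, x0, h=0.1, steps=100):
    x = x0.copy()
    for _ in range(steps):
        x = x - h * grad_f(x)   # one Euler step = one gradient-descent step
    return x

x = euler_gradient_flow(lambda x: 2.0 * x, np.array([4.0]))  # f(x) = x^2
```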

teaching