
Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods

Marques Alves, M.

Optimization Methods & Software, 2022-11, Vol. 37 (6), pp. 2021-2051 [Peer-reviewed journal]

Abingdon: Taylor & Francis

Full text available

  • Subjects: accelerated methods ; Algorithms ; Computational geometry ; Convergence ; Convex optimization ; Convexity ; high-order tensor methods ; large-step ; Mathematical analysis ; Optimization ; proximal-Newton method ; proximal-point algorithm ; strongly convex ; superlinear convergence ; Tensors
  • Description: For solving strongly convex optimization problems, we propose and study the global convergence of variants of the accelerated hybrid proximal extragradient (A-HPE) and large-step A-HPE algorithms of R.D.C. Monteiro and B.F. Svaiter [An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods, SIAM J. Optim. 23 (2013), pp. 1092-1125]. We prove linear and superlinear global rates for the proposed variants of the A-HPE and large-step A-HPE methods, respectively. The parameter appears in the (high-order) large-step condition of the new large-step A-HPE algorithm. We apply our results to high-order tensor methods, obtaining a new inexact (relative-error) tensor method for (smooth) strongly convex optimization with iteration-complexity . In particular, for p = 2, we obtain an inexact proximal-Newton algorithm with a fast global convergence rate.
  • Language: English
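For context on the record above: the A-HPE framework accelerates and inexactifies the classical proximal-point iteration x_{k+1} = argmin_x { f(x) + ||x - x_k||^2 / (2λ) }. The sketch below is not the paper's A-HPE method; it is a minimal illustration of the exact proximal-point step for a strongly convex quadratic, where the subproblem has a closed-form solution. The function name and parameter choices are illustrative assumptions.

```python
import numpy as np

def proximal_point_quadratic(A, b, x0, lam=1.0, iters=50):
    """Exact proximal-point iteration for f(x) = 0.5 x^T A x - b^T x.

    Each prox subproblem
        x_{k+1} = argmin_x { f(x) + ||x - x_k||^2 / (2*lam) }
    reduces to the linear system (A + I/lam) x_{k+1} = b + x_k/lam.
    For strongly convex f (A positive definite) the iterates
    converge linearly to the minimizer of f, i.e. the solution of A x = b.
    """
    n = len(b)
    M = A + np.eye(n) / lam  # prox operator's system matrix
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = np.linalg.solve(M, b + x / lam)
    return x

# Illustrative strongly convex example (A positive definite):
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = proximal_point_quadratic(A, b, np.zeros(2))
```

With strong convexity parameter μ (the smallest eigenvalue of A), each step contracts the error by roughly 1/(1 + λμ), which is the linear-rate behavior the article's variants establish, and accelerate, in far greater generality.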
