Original title: Two limited-memory optimization methods with minimum violation of the previous quasi-Newton equations
Authors: Vlček, Jan; Lukšan, Ladislav
Document type: Research reports
Year: 2020
Language: eng
Series: Technical Report, volume: V-1280
Abstract: Limited-memory variable metric methods based on the well-known BFGS update are widely used for large-scale optimization. The block version of the BFGS update, derived by Schnabel (1983), Hu and Storey (1991) and Vlček and Lukšan (2019), satisfies the quasi-Newton equations with all stored difference vectors and, for quadratic objective functions, gives the best improvement of convergence in a certain sense; however, the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the quasi-Newton equations as little as possible in a certain sense, two methods based on the block BFGS update are proposed. They can be advantageously combined with methods based on vector corrections for conjugacy (Vlček and Lukšan, 2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
Keywords: global convergence; limited-memory methods; numerical results; unconstrained minimization; variable metric methods; variationally derived methods
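For background: the report builds on limited-memory variable metric methods of the L-BFGS family. The sketch below shows the classical L-BFGS two-loop recursion (a standard textbook scheme, not the authors' block BFGS variant or their proposed corrections), which computes the search direction -H*g from stored difference pairs without forming the inverse Hessian approximation H explicitly:

```python
def dot(u, v):
    """Euclidean inner product of two vectors given as lists."""
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):
    """Return y + a*x elementwise."""
    return [yi + a * xi for xi, yi in zip(x, y)]

def lbfgs_direction(g, s_list, y_list):
    """Classical L-BFGS two-loop recursion.

    Returns d = -H*g, where H approximates the inverse Hessian built
    from the stored difference pairs s_i = x_{i+1} - x_i and
    y_i = g_{i+1} - g_i (oldest first in s_list/y_list)."""
    q = list(g)
    rhos = [1.0 / dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * dot(s, q)
        alphas.append(a)
        q = axpy(-a, y, q)
    # Initial scaling H0 = (s'y / y'y) * I from the most recent pair.
    s, y = s_list[-1], y_list[-1]
    gamma = dot(s, y) / dot(y, y)
    r = [gamma * qi for qi in q]
    # Second loop: traverse pairs from oldest to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * dot(y, r)
        r = axpy(a - b, s, r)
    return [-ri for ri in r]
```

Because each curvature ratio rho_i is positive whenever y_i's_i > 0, the implicit matrix H stays positive definite and the returned direction satisfies g'd < 0, i.e., it is a descent direction; the block BFGS update studied in the report loses exactly this guarantee, which motivates the proposed methods.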
Rights: This work is protected under the Copyright Act No. 121/2000 Coll.

Institution: Institute of Computer Science AS ČR
Original record: http://hdl.handle.net/11104/0310865

Permalink: http://www.nusl.cz/ntk/nusl-432144

 Record created 2020-12-03, last modified 2021-03-28
