National Repository of Grey Literature
Two limited-memory optimization methods with minimum violation of the previous quasi-Newton equations
Vlček, Jan ; Lukšan, Ladislav
Limited-memory variable metric methods based on the well-known BFGS update are widely used for large-scale optimization. The block version of the BFGS update, derived by Schnabel (1983), Hu and Storey (1991), and Vlček and Lukšan (2019), satisfies the quasi-Newton equations with all of the difference vectors used and, for quadratic objective functions, gives the best improvement of convergence in a certain sense; however, the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the quasi-Newton equations as little as possible (again in a certain sense), two methods based on the block BFGS update are proposed. They can be advantageously combined with methods based on vector corrections for conjugacy (Vlček and Lukšan, 2015). Global convergence of the proposed algorithm is established for convex, sufficiently smooth functions, and numerical experiments demonstrate the efficiency of the new methods.
Fulltext: Download fulltext PDF
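For context, in standard quasi-Newton notation (this notation is not drawn from the report itself): with difference vectors s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the classical quasi-Newton equation for the updated inverse Hessian approximation H_{k+1} is H_{k+1} y_k = s_k. The block BFGS update mentioned above enforces this equation for all m stored pairs simultaneously,

    H_{k+1} Y_k = S_k,    S_k = [ s_{k-m+1}, \dots, s_k ],    Y_k = [ y_{k-m+1}, \dots, y_k ],

and the proposed methods relax these equalities as little as possible so that the computed directions remain descent directions.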
Modifications of the limited-memory BFGS method based on the idea of conjugate directions
Vlček, Jan ; Lukšan, Ladislav
Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered. They consist of corrections to the difference vectors used, derived from the idea of conjugate directions and utilizing information from the preceding iteration. For quadratic objective functions, the improvement of convergence is the best possible in a certain sense, and all stored difference vectors are conjugate for unit stepsizes. The algorithm is globally convergent for convex, sufficiently smooth functions, and numerical experiments indicate that the new method often improves on L-BFGS significantly.
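For orientation, here is a minimal Python sketch of the standard L-BFGS two-loop recursion that such modifications build on. The function and variable names are illustrative, this is not the authors' implementation, and the correction of the difference vectors described in the abstract is not shown:

    import numpy as np

    def lbfgs_direction(g, s_list, y_list):
        """Two-loop recursion: computes d = -H_k g from stored pairs (s_i, y_i),
        ordered oldest to newest. Illustrative sketch of the baseline L-BFGS."""
        q = g.copy()
        rhos = [1.0 / float(y @ s) for s, y in zip(s_list, y_list)]
        alphas = []
        # First loop: newest pair to oldest
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            alpha = rho * float(s @ q)
            alphas.append(alpha)
            q = q - alpha * y
        # Initial scaling H_0 = gamma * I with gamma = s^T y / y^T y (newest pair)
        s, y = s_list[-1], y_list[-1]
        gamma = float(s @ y) / float(y @ y)
        r = gamma * q
        # Second loop: oldest pair to newest
        for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
            beta = rho * float(y @ r)
            r = r + (alpha - beta) * s
        return -r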
Modifications of the limited-memory BNS method for better satisfaction of previous quasi-Newton conditions
Vlček, Jan ; Lukšan, Ladislav
Several modifications of the limited-memory variable metric BNS method for large-scale unconstrained optimization are proposed. They consist of corrections to the difference vectors used, derived from the idea of conjugate directions, that improve satisfaction of the previous quasi-Newton conditions by utilizing information from previous or subsequent iterations. In the case of quadratic objective functions, conjugacy of all stored difference vectors and satisfaction of the quasi-Newton conditions with these vectors are established. There are many ways to realize this approach; although only two of the resulting methods have been implemented and tested, preliminary numerical results are promising.
Fulltext: Download fulltext PDF
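To make the quadratic-case claims above concrete in standard notation (again, not taken from the report itself): for f(x) = (1/2) x^T G x - b^T x one has y_i = G s_i, so conjugacy of the stored difference vectors and satisfaction of the previous quasi-Newton conditions read

    s_i^T G s_j = y_i^T s_j = 0  for i \neq j,    and    H_{k+1} y_j = s_j  for every stored pair (s_j, y_j).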
