Národní úložiště šedé literatury (National Repository of Grey Literature): 24 records found, showing records 1–10.
Some modifications of the limited-memory variable metric optimization methods
Vlček, Jan ; Lukšan, Ladislav
Several modifications of the limited-memory variable metric (or quasi-Newton) line search methods for large-scale unconstrained optimization are investigated. First, the block version of the symmetric rank-one (SR1) update formula is derived, in a way similar to the derivation of the block BFGS update in Vlček and Lukšan (Numerical Algorithms, 2019). The block SR1 formula is then modified to obtain an update that can reduce the number of arithmetic operations required per iteration. Since this update usually violates the corresponding secant conditions, it is combined with the shifting technique investigated in Vlček and Lukšan (J. Comput. Appl. Math., 2006). Moreover, a new efficient way to implement the limited-memory shifted BFGS method is proposed. For a class of methods based on the generalized shifted economy BFGS update, global convergence is established. A numerical comparison with the standard L-BFGS and BNS methods is given.
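For orientation, the classical rank-one SR1 update of a Hessian approximation B_k, whose block version the report derives, and the secant (quasi-Newton) condition it satisfies are given below. These are the standard textbook formulas, not the report's block or shifted variants:

\[
B_{k+1} = B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^T}{(y_k - B_k s_k)^T s_k},
\qquad B_{k+1} s_k = y_k,
\]

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k are the iterate and gradient difference vectors.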
Two limited-memory optimization methods with minimum violation of the previous quasi-Newton equations
Vlček, Jan ; Lukšan, Ladislav
Limited-memory variable metric methods based on the well-known BFGS update are widely used for large-scale optimization. The block version of the BFGS update, derived by Schnabel (1983), Hu and Storey (1991) and Vlček and Lukšan (2019), satisfies the quasi-Newton equations with all used difference vectors and, for quadratic objective functions, gives the best improvement of convergence in some sense; however, the corresponding direction vectors are generally not descent directions. To guarantee that the direction vectors are descent while violating the quasi-Newton equations as little as possible in some sense, two methods based on the block BFGS update are proposed. They can be advantageously combined with methods based on vector corrections for conjugacy (Vlček and Lukšan, 2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
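To make "satisfies the quasi-Newton equations with all used difference vectors" concrete: with memory m and the difference vectors gathered column-wise (notation assumed here, not taken from the report), a block update enforces

\[
H_{k+1} Y_k = S_k, \qquad
S_k = [\, s_{k-m+1}, \dots, s_k \,], \quad
Y_k = [\, y_{k-m+1}, \dots, y_k \,],
\]

whereas the standard BFGS update enforces only the single latest equation H_{k+1} y_k = s_k.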
Application of the Infinitely Many Times Repeated BNS Update and Conjugate Directions to Limited-Memory Optimization Methods
Vlček, Jan ; Lukšan, Ladislav
To improve the performance of the L-BFGS method for large-scale unconstrained optimization, repeating some of the BFGS updates has been proposed. Since this can be time-consuming, the extra updates need to be selected carefully. We show that, under some conditions, groups of these updates can be repeated infinitely many times without a noticeable increase in computational time. The limit update is a block BFGS update; it can be obtained by solving a Lyapunov matrix equation whose order can be decreased by applying vector corrections for conjugacy. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical results indicate the efficiency of the new method.
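The abstract does not state the Lyapunov equation itself, so the following is only an illustrative sketch of solving a generic equation of the form A X + X Aᵀ = Q with SciPy; the matrices A and Q arising from the limit block BFGS update are defined in the report and are assumptions beyond this listing:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative only: solve A X + X A^T = Q for X.
# A is made stable (eigenvalues with negative real part) so that a
# unique solution exists; the report's actual matrices differ.
rng = np.random.default_rng(0)
A = -np.eye(4) + 0.1 * rng.standard_normal((4, 4))
Q = np.eye(4)

X = solve_continuous_lyapunov(A, Q)
print(np.allclose(A @ X + X @ A.T, Q))  # True
```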
A Generalized Limited-Memory BNS Method Based on the Block BFGS Update
Vlček, Jan ; Lukšan, Ladislav
A block version of the BFGS variable metric update formula is investigated. It satisfies the quasi-Newton conditions with all used difference vectors and, for quadratic objective functions, gives the best improvement of convergence in some sense, but it does not guarantee that the direction vectors are descent directions for general functions. To overcome this difficulty while utilizing the advantageous properties of the block BFGS update, a block version of the limited-memory BNS method for large-scale unconstrained optimization is proposed. The algorithm is globally convergent for convex, sufficiently smooth functions, and our numerical experiments indicate its efficiency.
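For context, here is a minimal sketch of the standard L-BFGS two-loop recursion that the BNS and block BNS methods build on; this is the textbook baseline recursion, not the report's block algorithm:

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Textbook L-BFGS two-loop recursion: returns the direction -H_k g,
    where H_k is the limited-memory inverse Hessian approximation built
    from the stored difference pairs (s_i, y_i), ordered oldest first."""
    if not s_list:
        return -g
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    q = g.copy()
    alphas = []
    # First loop: run from the newest stored pair to the oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial scaling H_0 = gamma * I (a common heuristic choice).
    gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    r = gamma * q
    # Second loop: run from the oldest pair back to the newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r
```

The block methods above replace the rank-two updates implicit in this recursion with a single block update satisfying all stored quasi-Newton equations at once.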
Interior-Point Methods for Generalized Minimax Optimization
Lukšan, Ladislav ; Matonoha, Ctirad ; Vlček, Jan
A new class of primal interior-point methods for generalized minimax optimization is described. Besides the standard logarithmic barrier function, these methods also use barrier functions bounded from below, which have more favorable properties for the investigation of global convergence. They are descent-direction methods in which the approximation of the Hessian matrix is computed either from gradient differences or by quasi-Newton updates. Two-level optimization is used. The direction vector is computed via a Cholesky decomposition of a sparse matrix. Numerical experiments concerning two basic applications are reported: minimization of the pointwise maximum, and of the sum of absolute values, of smooth functions.
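As a minimal sketch, the standard logarithmic-barrier reformulation of the minimax problem min_x max_i f_i(x) introduces an extra variable z and minimizes, for a decreasing barrier parameter μ > 0 (the lower-bounded barrier functions studied in the report refine this construction and are not reproduced here),

\[
B_\mu(x, z) = z - \mu \sum_{i=1}^{m} \log\bigl(z - f_i(x)\bigr),
\qquad z > \max_i f_i(x),
\]

whose minimizers approach a solution of the minimax problem as μ → 0.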
