National Repository of Grey Literature: 35 records found, displaying records 1 - 10.
Mathematical Programming
Lukšan, Ladislav
Fulltext: content.csg (PDF)
Fulltext: v1043-08 (PDF)
Limited-memory variable metric methods based on invariant matrices
Vlček, Jan ; Lukšan, Ladislav
A new class of limited-memory variable metric methods for unconstrained minimization is described. Approximations of the inverse Hessian matrices are based on matrices that are invariant with respect to a linear transformation. Because these matrices are singular, they are adjusted for the computation of direction vectors. The methods have the quadratic termination property: with an exact choice of step length, they find the minimum of a strictly convex quadratic function after a finite number of steps. Numerical experiments demonstrate the efficiency of these methods.
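The quadratic termination property mentioned in the abstract can be illustrated with a standard limited-memory variable metric scheme. The sketch below uses the plain L-BFGS two-loop recursion with exact step lengths on a quadratic, not the invariant-matrix class of the paper; all names are illustrative.

```python
import numpy as np

def lbfgs_quadratic(A, b, x0, m=5, tol=1e-10, max_iter=50):
    """Limited-memory variable metric iteration on the quadratic
    f(x) = 0.5 x'Ax - b'x with exact step lengths.  Plain L-BFGS
    sketch, not the invariant-matrix update of the paper."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    s_list, y_list = [], []                 # limited-memory pairs
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # two-loop recursion: apply the inverse-Hessian approximation to g
        q = g.copy()
        alphas = []
        for s, y in reversed(list(zip(s_list, y_list))):
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        if y_list:
            s, y = s_list[-1], y_list[-1]
            q *= (s @ y) / (y @ y)          # initial scaling
        for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
            beta = (y @ q) / (y @ s)
            q += (a - beta) * s
        d = -q
        # exact step length for a quadratic: minimizes f(x + t d) over t
        t = -(g @ d) / (d @ A @ d)
        x_new = x + t * d
        g_new = A @ x_new - b
        s_list.append(x_new - x)
        y_list.append(g_new - g)
        if len(s_list) > m:                 # drop the oldest pair
            s_list.pop(0)
            y_list.pop(0)
        x, g = x_new, g_new
    return x
```

With memory at least the problem dimension, exact line searches make the directions conjugate, so the minimizer of a strictly convex quadratic is reached in at most n steps.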
On Lagrange multipliers in trust-region methods
Lukšan, Ladislav ; Matonoha, Ctirad ; Vlček, Jan
Trust-region methods are globally convergent techniques widely used, for example, in connection with Newton's method for unconstrained optimization. One of the most commonly used iterative approaches for solving trust-region subproblems is the Steihaug-Toint method, which is based on conjugate gradient iterations and seeks a solution on Krylov subspaces. The paper presents new theoretical results concerning the properties of the Lagrange multipliers obtained on these subspaces.
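The Steihaug-Toint method named in the abstract is a well-known truncated conjugate gradient scheme for the trust-region subproblem min q(p) = g'p + 0.5 p'Hp subject to ||p|| <= delta. A minimal textbook sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def _boundary_step(p, d, delta):
    # positive root tau of ||p + tau d|| = delta
    a = d @ d
    b = 2.0 * (p @ d)
    c = p @ p - delta ** 2
    return (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

def steihaug_toint(H, g, delta, tol=1e-8, max_iter=100):
    """Approximately minimize q(p) = g'p + 0.5 p'Hp s.t. ||p|| <= delta
    by truncated conjugate gradients (Steihaug-Toint sketch)."""
    p = np.zeros_like(g, dtype=float)
    r = g.astype(float).copy()      # residual = gradient of q at p
    if np.linalg.norm(r) < tol:
        return p
    d = -r
    for _ in range(max_iter):
        Hd = H @ d
        curv = d @ Hd
        if curv <= 0.0:
            # negative curvature: follow d to the trust-region boundary
            return p + _boundary_step(p, d, delta) * d
        alpha = (r @ r) / curv
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:
            # CG step leaves the region: stop on the boundary
            return p + _boundary_step(p, d, delta) * d
        r_next = r + alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next               # interior solution found
        beta = (r_next @ r_next) / (r @ r)
        d = -r_next + beta * d
        p, r = p_next, r_next
    return p
```

Each iterate lies in a growing Krylov subspace of H and g; the Lagrange multipliers studied in the paper arise when the iteration stops on the boundary.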
Interior point methods for generalized minimax optimization
Lukšan, Ladislav ; Matonoha, Ctirad ; Vlček, Jan
A new class of primal interior point methods for generalized minimax optimization is described. Besides the standard logarithmic barrier function, these methods also use barrier functions bounded from below, which have more favourable properties for the investigation of global convergence. The paper deals with descent direction methods in which an approximation of the Hessian matrix is computed from gradient differences or by quasi-Newton updates. Two-level optimization is used: a direction vector is computed by a Cholesky decomposition of a sparse matrix. Numerical experiments on two basic applications, minimization of a pointwise maximum and of a sum of absolute values of smooth functions, are presented.
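To make the primal barrier idea concrete: the minimax problem min_x max_i f_i(x) can be lifted to minimizing z subject to f_i(x) <= z, and the inequality handled with a logarithmic barrier. The toy sketch below uses Armijo gradient descent as the inner solver and a shrinking barrier parameter mu; the paper's methods instead use Newton-type directions with a sparse Cholesky factorization, and all names here are illustrative.

```python
import numpy as np

def minimax_barrier(fs, grads, x0, mu=1.0, mu_factor=0.2,
                    outer=8, inner=300):
    """Toy primal log-barrier sketch for min_x max_i f_i(x):
    minimize B(x, z) = z - mu * sum_i log(z - f_i(x)) over (x, z)
    by backtracking gradient descent, then shrink mu."""
    x = np.asarray(x0, dtype=float)
    z = max(f(x) for f in fs) + 1.0     # strictly feasible start

    def B(x, z, mu):
        slack = np.array([z - f(x) for f in fs])
        if np.any(slack <= 0.0):
            return np.inf               # outside the barrier's domain
        return z - mu * np.sum(np.log(slack))

    for _ in range(outer):
        for _ in range(inner):
            slack = np.array([z - f(x) for f in fs])
            gz = 1.0 - mu * np.sum(1.0 / slack)
            gx = mu * sum(g(x) / s for g, s in zip(grads, slack))
            sq = gx @ gx + gz * gz
            if sq < 1e-18:
                break
            # Armijo backtracking; infeasible trial points give B = inf
            t, B0 = 1.0, B(x, z, mu)
            while B(x - t * gx, z - t * gz, mu) > B0 - 1e-4 * t * sq:
                t *= 0.5
                if t < 1e-15:
                    break
            x, z = x - t * gx, z - t * gz
        mu *= mu_factor                 # tighten the barrier
    return x, z
```

For two quadratics f_1(x) = (x-1)^2 and f_2(x) = (x+1)^2 the minimax solution is x = 0 with value 1, which the barrier iterates approach as mu shrinks.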
