National Repository of Grey Literature: 133 records found
Hybrid Methods for Nonlinear Least Squares Problems
Lukšan, Ladislav ; Matonoha, Ctirad ; Vlček, Jan
This contribution contains a description and analysis of effective methods for minimization of the nonlinear least squares function F(x) = (1/2) f^T(x) f(x), where x ∈ R^n and f(x) ∈ R^m, together with extensive computational tests and comparisons of the introduced methods. All hybrid methods are described in detail and their global convergence is proved in a unified way. Some proofs concerning trust region methods, which are difficult to find in the literature, are also added. In particular, the report contains an analysis of a new simple hybrid method with Jacobian corrections (Section 8) and an investigation of the simple hybrid method for sparse least squares problems proposed previously in [33] (Section 14).
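For concreteness, the Gauss-Newton step that such hybrid methods combine with quasi-Newton corrections can be sketched in a few lines of Python. This is a minimal illustration; the residual function, data, and starting point below are invented for the example and are not taken from the report.

    import numpy as np

    def gauss_newton_step(f, J, x):
        # One Gauss-Newton step for F(x) = (1/2) f^T(x) f(x): the direction d
        # minimizes ||J(x) d + f(x)||, avoiding explicit normal equations.
        fx, Jx = f(x), J(x)
        d, *_ = np.linalg.lstsq(Jx, -fx, rcond=None)
        return x + d

    # Illustrative residuals: fit y ~ a * exp(b * t) to three data points.
    data = [(0.0, 1.0), (1.0, 2.0), (2.0, 4.1)]
    f = lambda x: np.array([x[0] * np.exp(x[1] * t) - y for t, y in data])
    J = lambda x: np.array([[np.exp(x[1] * t), x[0] * t * np.exp(x[1] * t)]
                            for t, _ in data])

    x = np.array([1.0, 0.5])
    for _ in range(10):
        x = gauss_newton_step(f, J, x)
    print(x)  # approaches the least squares fit (a, b)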
Application of the Infinitely Many Times Repeated BNS Update and Conjugate Directions to Limited-Memory Optimization Methods
Vlček, Jan ; Lukšan, Ladislav
To improve the performance of the L-BFGS method for large-scale unconstrained optimization, it has been proposed to repeat some of the BFGS updates. Since this can be time consuming, the extra updates need to be selected carefully. We show that, under some conditions, groups of these updates can be repeated infinitely many times without a noticeable increase in computational time; the limit update is a block BFGS update. It can be obtained by solving a Lyapunov matrix equation whose order can be decreased by applying vector corrections for conjugacy. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical results indicate the efficiency of the new method.
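As background, the baseline that these repeated updates extend is the standard two-loop recursion of the L-BFGS method. The sketch below is a textbook form, not the authors' code, and it does not include the repeated BNS updates themselves.

    import numpy as np

    def lbfgs_direction(grad, s_list, y_list):
        # Standard L-BFGS two-loop recursion: s_k = x_{k+1} - x_k,
        # y_k = g_{k+1} - g_k; returns the search direction -H g.
        q = grad.copy()
        alphas = []
        for s, y in zip(reversed(s_list), reversed(y_list)):
            rho = 1.0 / (y @ s)
            a = rho * (s @ q)
            q -= a * y
            alphas.append((rho, a))
        if s_list:
            s, y = s_list[-1], y_list[-1]
            q *= (s @ y) / (y @ y)   # standard scaling of the initial matrix
        for (rho, a), (s, y) in zip(reversed(alphas), zip(s_list, y_list)):
            b = rho * (y @ q)
            q += (a - b) * s
        return -q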
A Hybrid Method for Nonlinear Least Squares that Uses Quasi-Newton Updates Applied to an Approximation of the Jacobian Matrix
Lukšan, Ladislav ; Vlček, Jan
In this contribution, we propose a new hybrid method for minimization of nonlinear least squares. This method is based on quasi-Newton updates applied to an approximation A of the Jacobian matrix J such that A^T f = J^T f. This property allows us to solve a linear least squares problem, minimizing ∥Ad + f∥, instead of solving the normal equation A^T A d + J^T f = 0, where d ∈ R^n is the required direction vector. Computational experiments confirm the efficiency of the new method.
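The numerical point of minimizing ∥Ad + f∥ directly, rather than forming the normal equation, can be shown with a few lines of NumPy. Here A and f are random stand-ins, not the quasi-Newton approximation from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 3))   # stand-in for the Jacobian approximation
    f = rng.standard_normal(8)        # stand-in for the residual vector

    # QR-based least squares solve: accuracy governed by cond(A).
    d_lstsq = np.linalg.lstsq(A, -f, rcond=None)[0]
    # Equivalent normal equation solve: squares the condition number.
    d_normal = np.linalg.solve(A.T @ A, -(A.T @ f))
    print(np.allclose(d_lstsq, d_normal))  # True; lstsq stays accurate when A is ill-conditioned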
Problems for Nonlinear Least Squares and Nonlinear Equations
Lukšan, Ladislav ; Matonoha, Ctirad ; Vlček, Jan
This report contains a description of subroutines that can be used for testing large-scale optimization codes. These subroutines can easily be obtained from the web page http://www.cs.cas.cz/~luksan/test.html. Furthermore, all test problems contained in these subroutines are presented in analytic form.
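To give a flavor of what such collections contain, here is one classic least squares test problem written in residual form. This is an illustrative example only; the actual subroutines are those on the web page above.

    import numpy as np

    def chained_rosenbrock_residuals(x):
        # Classic sparse test problem: f has 2(n-1) components and
        # F(x) = (1/2) f^T(x) f(x) is minimized at x = (1, ..., 1).
        x = np.asarray(x, dtype=float)
        r1 = 10.0 * (x[1:] - x[:-1] ** 2)   # couples consecutive variables
        r2 = 1.0 - x[:-1]
        return np.concatenate([r1, r2])

    print(chained_rosenbrock_residuals(np.ones(5)))  # all zeros at the minimizer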
Sparse Test Problems for Nonlinear Least Squares
Lukšan, Ladislav ; Matonoha, Ctirad ; Vlček, Jan
This report contains a description of subroutines that can be used for testing large-scale optimization codes. These subroutines can easily be obtained from the web page http://www.cs.cas.cz/~luksan/test.html. Furthermore, all test problems contained in these subroutines are presented in analytic form.
UFO 2017. Interactive System for Universal Functional Optimization
Lukšan, Ladislav ; Tůma, Miroslav ; Matonoha, Ctirad ; Vlček, Jan ; Ramešová, Nina ; Šiška, M. ; Hartman, J.
This report contains a description of the interactive system for universal functional optimization UFO, version 2017. This version contains interfaces to the MATLAB and SCILAB graphics environments.
A Generalized Limited-Memory BNS Method Based on the Block BFGS Update
Vlček, Jan ; Lukšan, Ladislav
A block version of the BFGS variable metric update formula is investigated. It satisfies the quasi-Newton conditions with all used difference vectors and, in a certain sense, gives the best improvement of convergence for quadratic objective functions, but it does not guarantee that the direction vectors are descent directions for general functions. To overcome this difficulty and utilize the advantageous properties of the block BFGS update, a block version of the limited-memory BNS method for large-scale unconstrained optimization is proposed. The algorithm is globally convergent for convex and sufficiently smooth functions, and our numerical experiments indicate its efficiency.
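The quasi-Newton conditions mentioned above can be checked numerically: a single block update makes the matrix satisfy the secant equation for all stored difference vectors at once. The sketch below uses a classical textbook form of the block BFGS update; the BNS-based variant of the report differs in its details.

    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 6, 3
    G = rng.standard_normal((n, n))
    G = G @ G.T + n * np.eye(n)        # SPD stand-in for the Hessian
    S = rng.standard_normal((n, m))    # steps s_1, ..., s_m as columns
    Y = G @ S                          # y_i = G s_i, so Y^T S is symmetric PD

    # Block BFGS update: B_new = B - B S (S^T B S)^{-1} S^T B + Y (Y^T S)^{-1} Y^T
    B = np.eye(n)
    B_new = (B - B @ S @ np.linalg.solve(S.T @ B @ S, S.T @ B)
               + Y @ np.linalg.solve(Y.T @ S, Y.T))
    print(np.allclose(B_new @ S, Y))   # True: all m quasi-Newton conditions hold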
