Title: Preconditioning in Large-Scale Unconstrained Optimization Problems
Contributors: Sousedik, Bedrich; Galambos, Emoke
Date issued: 2019-01-01
Date made available: 2021-09-01
URI: http://hdl.handle.net/11603/22864
Format: application/pdf
Type: Text

Abstract: We investigate the effects of preconditioning on the convergence of the L-BFGS method. Our goal is to find a global minimum of a non-convex, differentiable function using non-preconditioned and linearly preconditioned L-BFGS algorithms. The objective function is a 6-dimensional variation of the Bohachevsky N.1 benchmark function with a dominant convex part. We discuss numerical instability issues caused by ill-conditioned systems and by the non-convexity of the objective function. We also introduce a new algorithm that combines the preconditioned and non-preconditioned L-BFGS algorithms with the Cat Swarm Optimization algorithm. The combined algorithm resolves the numerical instability issues and complements the local optimization with a randomized global search. The results show improved performance of the algorithm when preconditioning is used.
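As a rough illustration of the setup described in the abstract, the sketch below minimizes a simple 6-dimensional pairwise extension of the 2-D Bohachevsky N.1 function with SciPy's L-BFGS-B routine, once directly and once after a linear change of variables that plays the role of a preconditioner. The particular 6-D extension, the diagonal matrix P, and the starting point are assumptions chosen for demonstration only; they are not the thesis's exact objective, preconditioner, or combined Cat Swarm hybrid.

```python
# Illustrative sketch only: the 6-D objective, the diagonal preconditioner P,
# and the starting point are assumptions, not the thesis's exact formulation.
import numpy as np
from scipy.optimize import minimize

def bohachevsky6(x):
    """Assumed 6-D extension of the 2-D Bohachevsky N.1 function:
    the 2-D form summed over consecutive coordinate pairs."""
    f = 0.0
    for i in range(len(x) - 1):
        f += (x[i]**2 + 2.0 * x[i + 1]**2
              - 0.3 * np.cos(3.0 * np.pi * x[i])
              - 0.4 * np.cos(4.0 * np.pi * x[i + 1]) + 0.7)
    return f

x0 = np.full(6, 5.0)  # arbitrary starting point

# Non-preconditioned run (SciPy's L-BFGS-B with no bounds acts as L-BFGS).
plain = minimize(bohachevsky6, x0, method="L-BFGS-B")

# Linear preconditioning via the change of variables x = P @ y, i.e.
# minimize g(y) = f(P y); P is a hypothetical diagonal scaling meant to
# make the dominant quadratic part of f better conditioned.
P = np.diag([1.0, 1.0 / np.sqrt(2.0)] * 3)
g = lambda y: bohachevsky6(P @ y)
prec = minimize(g, np.linalg.solve(P, x0), method="L-BFGS-B")

print("plain:        ", plain.fun, plain.nit, "iterations")
print("preconditioned:", prec.fun, prec.nit, "iterations")
```

The change-of-variables form is one standard way to apply a fixed linear preconditioner to a quasi-Newton method; whether it helps depends on how well P captures the curvature of the convex part of the objective.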