|Computational Mathematics Seminar Series|
|An Adaptive Preconditioned Nonlinear Conjugate Gradient Method with Limited Memory|
|Hongchao Zhang, LSU|
|Assistant Professor, Department of Mathematics|
|Johnston Hall 338|
|November 29, 2011 - 03:30 pm|
Nonlinear conjugate gradient methods are an important class of methods for solving large-scale unconstrained nonlinear optimization problems. However, their performance is often severely degraded when the problem is very ill-conditioned. This talk will discuss efficient techniques for adaptively preconditioning the nonlinear conjugate gradient method in the subspace spanned by a small number of previous search directions. The new method combines advantages of both nonlinear conjugate gradient methods and limited-memory BFGS quasi-Newton methods, and achieves significant performance improvements over the CG_DESCENT conjugate gradient method and the L-BFGS quasi-Newton method.
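For context, a generic preconditioned nonlinear conjugate gradient iteration can be sketched as below. This is a textbook Polak-Ribiere+ variant with a user-supplied preconditioner `M_inv` and a backtracking Armijo line search; it is only an illustrative assumption, not the adaptive limited-memory subspace preconditioner presented in the talk.

```python
import numpy as np

def precond_ncg(f, grad, x0, M_inv, tol=1e-8, max_iter=200):
    """Minimize f by preconditioned nonlinear CG (Polak-Ribiere+ sketch).

    M_inv: callable applying an approximate inverse-Hessian preconditioner
    to a gradient vector. A generic illustration, not the talk's method.
    """
    x = x0.copy()
    g = grad(x)
    z = M_inv(g)          # preconditioned gradient
    d = -z                # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            d = -z        # safeguard: restart if not a descent direction
        # Backtracking line search satisfying the Armijo condition
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        z_new = M_inv(g_new)
        # Preconditioned PR+ beta; beta = 0 acts as an automatic restart
        beta = max(0.0, g_new.dot(z_new - z) / g.dot(z))
        d = -z_new + beta * d
        x, g, z = x_new, g_new, z_new
    return x
```

For an ill-conditioned diagonal quadratic, a Jacobi (diagonal) preconditioner already illustrates the benefit: with `M_inv` scaling each gradient component by the inverse diagonal of the Hessian, the iteration converges far faster than unpreconditioned CG.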
Hongchao Zhang is an Assistant Professor jointly in the Department of Mathematics and the Center for Computation & Technology (CCT) at Louisiana State University. He received a Ph.D. in mathematics from the University of Florida. His research interests are nonlinear optimization theory, algorithms, and applications.
|Refreshments will be served.|
|This lecture has a reception.|