We also derive recurrence relations that facilitate the efficient implementation of a class of inertia-controlling methods that maintain the factorization of a nonsingular matrix associated with the Karush-Kuhn-Tucker conditions.
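For orientation only (this equation is not taken from the abstract itself), the nonsingular matrix referred to here is typically of the Karush-Kuhn-Tucker (KKT) form that arises in quadratic programming,

\[
K \;=\; \begin{bmatrix} H & A^{T} \\ A & 0 \end{bmatrix},
\]

where H stands for the Hessian of the objective and A for the Jacobian of the constraints in the current working set; inertia-controlling methods update a factorization of such a matrix as the working set changes, keeping its inertia, and hence its nonsingularity, under control. The symbols K, H, and A are generic placeholders, not notation from the paper being described.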
Optimization is an important tool in decision science and in the analysis of physical systems in engineering. Its roots can be traced to the calculus of variations and the work of Euler and Lagrange. This text covers numerical methods for finite-dimensional optimization problems, beginning with very simple ideas and progressing through more complicated concepts, and it concentrates on methods for both unconstrained and constrained optimization.
Many engineering, operations, and scientific applications involve a mixture of discrete and continuous decision variables together with nonlinear relationships among those variables that have a pronounced effect on the set of feasible and optimal solutions. Mixed-integer nonlinear programming (MINLP) problems combine the numerical difficulties of handling nonlinear functions with the challenge of optimizing over nonconvex functions and discrete variables. MINLP is one of the most flexible modeling paradigms available for optimization, but because its scope is so broad, the most general cases are hopelessly intractable. Nonetheless, an expanding body of researchers and practitioners, including chemical engineers, operations researchers, industrial engineers, mechanical engineers, economists, statisticians, computer scientists, operations managers, and mathematical programmers, is interested in solving large-scale MINLP instances.
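For concreteness, a generic MINLP can be written in the following standard form (the symbols f, g, x, and y are illustrative placeholders, not notation taken from the text above):

\[
\min_{x,\,y} \; f(x, y) \quad \text{subject to} \quad g(x, y) \le 0, \qquad x \in \mathbb{R}^{n}, \quad y \in \mathbb{Z}^{p},
\]

where f and g may be nonlinear and nonconvex. The integrality of y and the possible nonconvexity of f and g are precisely the two sources of difficulty described in the paragraph above.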
Computational Issues in High Performance Software for Nonlinear Research brings together in one place important contributions and up-to-date research results in this area. The volume serves as an excellent reference, providing insight into some of the most significant research issues in the field.
After a review of historical developments in convergence analysis for Newton's and Newton-like methods, 18 papers deal in depth with various classical or neoclassical approaches, as well as newer ideas on optimization and solving linear equations. A sampling of topics: truncated Newton methods, sequential quadratic programming for large-scale nonlinear optimization, and automatic differentiation of algorithms. This monograph, one of seven volumes in the set, is also published as the Journal of Computational and Applied Mathematics, vol. 124 (2000). Indexed only by author. © Book News Inc.