Recent interest in interior point methods generated by Karmarkar's Projective Scaling Algorithm has created a new demand for this book, because the methods that have followed from Karmarkar's bear a close resemblance to those described here. There is no other source for the theoretical background of the logarithmic barrier function and other classical penalty functions. The book analyzes in detail the "central" or "dual" trajectory used by modern path-following and primal/dual methods for convex and general linear programming. As researchers begin to extend these methods to convex and general nonlinear programming problems, this book will become indispensable to them.
In its thousands of years of history, mathematics has made an extraordinary career. It started from rules for bookkeeping and computation of areas to become the language of science. Its potential for decision support was fully recognized only in the twentieth century, vitally aided by the evolution of computing and communication technology. Mathematical optimization, in particular, has developed into a powerful machinery to help planners. Whether costs are to be reduced, profits to be maximized, or scarce resources to be used wisely, optimization methods are available to guide decision making. Optimization is particularly strong if precise models of real phenomena and data of high quality are at hand, often yielding reliable automated control and decision procedures. But what if the models are soft and not all data are available? Can mathematics help as well? This book addresses such issues, e.g., problems of the following type:
- An elevator cannot know all transportation requests in advance. In which order should it serve the passengers?
- Wing profiles of aircraft influence fuel consumption. Is it possible to continuously adapt the shape of a wing during flight under rapidly changing conditions?
- Robots are designed to accomplish specific tasks as efficiently as possible. But what if a robot must navigate in an unknown environment?
- Energy demand changes quickly and is not easily predictable over time, yet some types of power plants can only react slowly.
Optimization is one of the most important areas of modern applied mathematics, with applications in fields from engineering and economics to finance, statistics, management science, and medicine. While many books have addressed its various aspects, Nonlinear Optimization is the first comprehensive treatment that will allow graduate students and researchers to understand its modern ideas, principles, and methods within a reasonable time, but without sacrificing mathematical precision. Andrzej Ruszczynski, a leading expert in the optimization of nonlinear stochastic systems, integrates the theory and the methods of nonlinear optimization in a unified, clear, and mathematically rigorous fashion, with detailed and easy-to-follow proofs illustrated by numerous examples and figures. The book covers convex analysis, the theory of optimality conditions, duality theory, and numerical methods for solving unconstrained and constrained optimization problems. It addresses not only classical material but also modern topics such as optimality conditions and numerical methods for problems involving nondifferentiable functions, semidefinite programming, metric regularity and stability theory of set-constrained systems, and sensitivity analysis of optimization problems. Based on a decade's worth of notes the author compiled while successfully teaching the subject, this book will help readers to understand the mathematical foundations of the modern theory and methods of nonlinear optimization, to analyze new problems, to develop optimality theory for them, and to choose or construct numerical solution methods. It is a must for anyone seriously interested in optimization.
This introductory textbook adopts a practical and intuitive approach, rather than emphasizing mathematical rigor. Computationally oriented books in this area generally present algorithms alone, expect readers to perform computations by hand, and rely on programs written in traditional computer languages such as Basic, Fortran, or Pascal. This book, on the other hand, is the first text to use Mathematica to develop a thorough understanding of optimization algorithms, fully exploiting Mathematica's symbolic, numerical, and graphical capabilities.
Optimality and stability are two important notions in applied mathematics. This book is a study of these notions and their relationship in linear and convex parametric programming models. It begins with a survey of basic optimality conditions in nonlinear programming. Then new results in convex programming, using LFS functions, are introduced for single-objective, multi-objective, differentiable, and non-smooth programs. Parametric programming models are studied using basic tools of point-to-set topology. Stability of the models is introduced, essentially, as continuity of the feasible set of decision variables under continuous perturbations of the parameters. Perturbations that preserve this continuity are regions of stability. It is shown how these regions can be identified. The main results on stability are characterizations of locally and globally optimal parameters for stable and also for unstable perturbations. The results are strengthened for linear models and bi-level programs. Some of the results are extended to abstract spaces after considering parameters as 'controls'. Illustrations from diverse fields, such as data envelopment analysis, management, von Stackelberg games of market economy, and navigation problems, are given, and several case studies are solved by finding optimal parameters. The book has been written in an analytic spirit. Many results appear here for the first time in book form. Audience: The book is written at the level of a first-year graduate course in optimization for students with varied backgrounds interested in modeling of real-life problems. It is expected that the reader has been exposed to a prior elementary course in optimization, such as linear or nonlinear programming. The last section of the book requires some knowledge of functional analysis.
The goal of the Encyclopedia of Optimization is to introduce the reader to a complete set of topics that show the spectrum of research, the richness of ideas, and the breadth of applications that have come from this field. The second edition builds on the success of the former edition with more than 150 completely new entries, designed to ensure that the reference addresses recent areas where optimization theories and techniques have advanced. Particular attention has been given to health science and transportation, with entries such as "Algorithms for Genomics", "Optimization and Radiotherapy Treatment Design", and "Crew Scheduling".
This is a book about infrastructure networks that are intrinsically nonlinear. The networks considered range from vehicular networks to electric power networks to data networks. The main point of view taken is that of mathematical programming in concert with finite-dimensional variational inequality theory. The principal modeling perspectives are network optimization, the theory of Nash games, and mathematical programming with equilibrium constraints. Computational methods and novel mathematical formulations are emphasized. Among the numerical methods explored are network simplex, gradient projection, fixed-point, gap function, Lagrangian relaxation, Dantzig-Wolfe decomposition, simplicial decomposition, and computational intelligence algorithms. Many solved example problems are included, ranging from simple to quite challenging. Theoretical analyses of several models and algorithms, to uncover existence, uniqueness, and convergence properties, are undertaken. The book is meant for use in advanced undergraduate as well as doctoral courses taught in civil engineering, industrial engineering, systems engineering, and operations research degree programs. At the same time, the book should be a useful resource for industrial and university researchers engaged in the mathematical modeling and numerical analysis of infrastructure networks.
A large number of mathematical models in many diverse areas of science and engineering have led to the formulation of optimization problems where the best (globally optimal) solution is needed. This book covers a small subset of important topics in global optimization, with emphasis on theoretical developments and scientific applications.