Mathematical models in many diverse areas of science and engineering have led to the formulation of optimization problems in which the best (globally optimal) solution is needed. This book covers a small subset of important topics in global optimization, with emphasis on theoretical developments and scientific applications.
Significant research activity has occurred in the area of global optimization in recent years. Many new theoretical, algorithmic, and computational contributions have resulted. Despite the major importance of test problems for researchers, there has been a lack of representative nonconvex test problems for constrained global optimization algorithms. This book is motivated by the scarcity of global optimization test problems and represents the first systematic collection of test problems for evaluating and testing constrained global optimization algorithms. This collection includes problems arising in a variety of engineering applications, and test problems from published computational reports.
This book provides a comprehensive, modern introduction to convex optimization, a field that is becoming increasingly important in applied mathematics, economics and finance, engineering, and computer science, notably in data science and machine learning. Written by a leading expert in the field, this book includes recent advances in the algorithmic theory of convex optimization, naturally complementing the existing literature. It contains a unified and rigorous presentation of acceleration techniques for first- and second-order minimization schemes. It provides readers with a full treatment of the smoothing technique, which has greatly extended the reach of gradient-type methods. Several powerful approaches in structural optimization, including optimization in relative scale and polynomial-time interior-point methods, are also discussed in detail. Researchers in theoretical optimization as well as professionals working on optimization problems will find this book very useful. It presents many successful examples of how to develop very fast specialized minimization algorithms. Based on the author’s lectures, it can naturally serve as the basis for introductory and advanced courses in convex optimization for students in engineering, economics, computer science, and mathematics.
This book provides a unified and insightful treatment of deterministic global optimization. It introduces theoretical and algorithmic advances that address the computation and characterization of global optima, determine valid lower and upper bounds on the global minima and maxima, and enclose all solutions of nonlinear constrained systems of equations. Among its special features, the book: Introduces the fundamentals of deterministic global optimization; Provides a thorough treatment of decomposition-based global optimization approaches for biconvex and bilinear problems; Covers global optimization methods for generalized geometric programming problems; Presents in-depth global optimization algorithms for general twice continuously differentiable nonlinear problems; Provides a detailed treatment of global optimization methods for mixed-integer nonlinear problems; Develops global optimization approaches for the enclosure of all solutions of nonlinear constrained systems of equations; Includes many important applications from process design, synthesis, control, and operations, phase equilibrium, design under uncertainty, parameter estimation, azeotrope prediction, structure prediction in clusters and molecules, protein folding, and peptide docking. Audience: This book can be used as a textbook in graduate-level courses and as a desk reference for researchers in all branches of engineering and applied science, applied mathematics, industrial engineering, operations research, computer science, economics, computational chemistry, and molecular biology.
Due to the general complementary convex structure underlying most nonconvex optimization problems encountered in applications, convex analysis plays an essential role in the development of global optimization methods. This book develops a coherent and rigorous theory of deterministic global optimization from this point of view. Part I constitutes an introduction to convex analysis, with an emphasis on the concepts, properties, and results particularly needed for global optimization, including those pertaining to the complementary convex structure. Part II presents the foundations and application of global search principles such as partitioning and cutting, outer and inner approximation, and decomposition, applied to general global optimization problems as well as to problems with a low-rank nonconvex structure and to quadratic problems. In addition to a rigorous mathematical development, much new material is offered. Audience: The book is written as a text for graduate students in engineering, mathematics, operations research, computer science, and other disciplines dealing with optimization theory. It is also addressed to all scientists in various fields who are interested in mathematical optimization.
Here is a book devoted to well-structured and thus efficiently solvable convex optimization problems, with emphasis on conic quadratic and semidefinite programming. The authors present the basic theory underlying these problems as well as their numerous applications in engineering, including synthesis of filters, Lyapunov stability analysis, and structural design. The authors also discuss complexity issues and provide an overview of the basic theory of state-of-the-art polynomial-time interior-point methods for linear, conic quadratic, and semidefinite programming. The book's focus on well-structured convex problems in conic form allows for a unified theoretical and algorithmic treatment of a wide spectrum of important optimization problems arising in applications.
This book constitutes the thoroughly refereed post-conference proceedings of the 12th International Conference on Learning and Intelligent Optimization, LION 12, held in Kalamata, Greece, in June 2018. The 28 full papers and 12 short papers presented have been carefully reviewed and selected from 62 submissions. The papers explore the advanced research developments in such interconnected fields as mathematical programming, global optimization, machine learning, and artificial intelligence. Special focus is given to advanced ideas, technologies, methods, and applications in optimization and machine learning.
Stochastic Global Optimization Methods and Applications to Chemical, Biochemical, Pharmaceutical and Environmental Processes presents various algorithms, including the genetic algorithm, simulated annealing, differential evolution, ant colony optimization, tabu search, particle swarm optimization, artificial bee colony optimization, and the cuckoo search algorithm. The design and analysis of these algorithms are studied by applying them to a range of base-case and complex optimization problems in chemical, biochemical, pharmaceutical, and environmental engineering processes. The book focuses on stochastic, evolutionary, and artificial intelligence optimization algorithms, with special emphasis on their design, analysis, and implementation for solving complex optimization problems, and it includes a number of real applications from these process industries. The design and implementation of both classical and advanced optimization strategies for a wide variety of problems make the book useful to graduate students, researchers, and practicing engineers working in multiple domains.
It was in the middle of the 1980s when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization. The importance of this paper, containing a new polynomial-time algorithm for linear optimization problems, was not only in its complexity bound. At that time, the most surprising feature of this algorithm was that the theoretical prediction of its high efficiency was supported by excellent computational results. This unusual fact dramatically changed the style and directions of research in nonlinear optimization. Thereafter it became more and more common that new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments. In a new rapidly developing field, which got the name "polynomial-time interior-point methods", such a justification was obligatory. After almost fifteen years of intensive research, the main results of this development started to appear in monographs [12, 14, 16, 17, 18, 19]. Approximately at that time the author was asked to prepare a new course on nonlinear optimization for graduate students. The idea was to create a course which would reflect the new developments in the field. Actually, this was a major challenge. At the time only the theory of interior-point methods for linear optimization was polished enough to be explained to students. The general theory of self-concordant functions had appeared in print only once in the form of research monograph [12].