This volume covers both classical results and the most recent developments in the theory of randomized search heuristics, including runtime analysis, drift analysis, and convergence.
This edited book reports on recent developments in the theory of evolutionary computation, or more generally the domain of randomized search heuristics. It starts with two chapters on mathematical methods that are often used in the analysis of randomized search heuristics, followed by three chapters on how to measure the complexity of a search heuristic: black-box complexity, a counterpart of classical complexity theory in black-box optimization; parameterized complexity, aimed at a more fine-grained view of the difficulty of problems; and the fixed-budget perspective, which answers the question of how good a solution will be after investing a certain computational budget. The book then describes theoretical results on three important questions in evolutionary computation: how to profit from changing the parameters during the run of an algorithm; how evolutionary algorithms cope with dynamically changing or stochastic environments; and how population diversity influences performance. Finally, the book looks at three algorithm classes that have only recently become the focus of theoretical work: estimation-of-distribution algorithms; artificial immune systems; and genetic programming. Throughout the book, the contributing authors aim to develop an understanding of how these methods work and why they are so successful in many applications. The book will be useful for students and researchers in theoretical computer science and evolutionary computing.
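As a concrete illustration of the kind of mathematical method meant here (stated in its standard form, not quoted from the book), drift analysis turns a bound on the expected per-step progress into a bound on the expected runtime. The additive drift theorem reads: if $X_t \ge 0$ measures the distance of the current search point from the optimum, $T = \min\{t \ge 0 : X_t = 0\}$ is the optimization time, and there is a $\delta > 0$ such that
\[
\mathbb{E}[X_t - X_{t+1} \mid X_t = x] \ \ge\ \delta \quad \text{for all } x > 0 \text{ and all } t \ge 0,
\]
then $\mathbb{E}[T \mid X_0] \le X_0 / \delta$.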
This book constitutes the refereed proceedings of the 28th International Symposium on Mathematical Foundations of Computer Science, MFCS 2003, held in Bratislava, Slovakia, in August 2003. The 55 revised full papers presented together with 7 invited papers were carefully reviewed and selected from 137 submissions. All current aspects of theoretical computer science are addressed, ranging from discrete mathematics, combinatorial optimization, graph theory, networking, algorithms, and complexity to programming theory, formal methods, and mathematical logic.
Search has been vital to artificial intelligence from the very beginning as a core technique in problem solving. The authors present a thorough overview of heuristic search, balancing theoretical analysis with efficient implementation and application to real-world problems. Current developments are detailed, such as pattern databases and search that makes efficient use of external memory and of parallel processing units on mainboards and graphics cards. Heuristic search as a problem-solving tool is demonstrated in applications to puzzle solving, game playing, constraint satisfaction, and machine learning. While no previous familiarity with heuristic search is necessary, the reader should have a basic knowledge of algorithms, data structures, and calculus. Real-world case studies and chapter-ending exercises help create a complete picture of how search fits into the world of artificial intelligence and the world around us.
- Provides real-world success stories and case studies for heuristic search algorithms
- Includes many AI developments not yet covered in textbooks, such as pattern databases, symbolic search, and parallel processing units
Meta-heuristics have developed dramatically since their inception in the early 1980s. They have had widespread success in attacking a variety of practical and difficult combinatorial optimization problems. These families of approaches include, but are not limited to, greedy randomized adaptive search procedures, genetic algorithms, problem-space search, neural networks, simulated annealing, tabu search, threshold algorithms, and their hybrids. They incorporate concepts based on biological evolution, intelligent problem solving, the mathematical and physical sciences, nervous systems, and statistical mechanics. Since the 1980s, a great deal of effort has been invested in combinatorial optimization theory, in which heuristic algorithms have become an important area of research and application. This volume is drawn from the first conference on Meta-Heuristics and contains 41 papers on the state of the art in heuristic theory and applications. The book treats the following meta-heuristics and applications: Genetic Algorithms, Simulated Annealing, Tabu Search, Networks & Graphs, Scheduling and Control, TSP, and Vehicle Routing Problems. It represents research from the fields of Operations Research, Management Science, Artificial Intelligence, and Computer Science.
This book constitutes the refereed proceedings of the 28th International Colloquium on Automata, Languages and Programming, ICALP 2001, held in Crete, Greece, in July 2001. The 80 revised papers presented together with two keynote contributions and four invited papers were carefully reviewed and selected from a total of 208 submissions. The papers are organized in topical sections on algebraic and circuit complexity, algorithm analysis, approximation and optimization, complexity, concurrency, efficient data structures, graph algorithms, language theory, codes and automata, model checking and protocol analysis, networks and routing, reasoning and verification, scheduling, secure computation, specification and deduction, and structural complexity.
This book constitutes the proceedings of the 6th International Computer Science Symposium in Russia, CSR 2011, held in St. Petersburg, Russia, in June 2011. The 29 papers presented were carefully reviewed and selected from 76 submissions. The scope of the symposium was broad, covering essentially all areas of the foundations of theoretical computer science.
Interested in the Genetic Algorithm? Simulated Annealing? Ant Colony Optimization? Essentials of Metaheuristics covers these and other metaheuristic algorithms, and is intended for undergraduate students, programmers, and non-experts. The book covers a wide range of algorithms, representations, selection and modification operators, and related topics, and includes 71 figures and 135 algorithms great and small. Algorithms include: Gradient Ascent techniques, Hill-Climbing variants, Simulated Annealing, Tabu Search variants, Iterated Local Search, Evolution Strategies, the Genetic Algorithm, the Steady-State Genetic Algorithm, Differential Evolution, Particle Swarm Optimization, Genetic Programming variants, One- and Two-Population Competitive Coevolution, N-Population Cooperative Coevolution, Implicit Fitness Sharing, Deterministic Crowding, NSGA-II, SPEA2, GRASP, Ant Colony Optimization variants, Guided Local Search, LEM, PBIL, UMDA, cGA, BOA, SAMUEL, ZCS, XCS, and XCSF.
This is the first book to cover GRASP (Greedy Randomized Adaptive Search Procedures), a metaheuristic that has enjoyed wide success in practice, with a broad range of applications to real-world combinatorial optimization problems. The state-of-the-art coverage and carefully crafted pedagogical style make this book highly accessible as an introductory text, not only to GRASP but also to combinatorial optimization, greedy algorithms, local search, and path-relinking, as well as to heuristics and metaheuristics in general. The focus is on the algorithmic and computational aspects of applied optimization with GRASP, with emphasis given to the end user, providing sufficient information on the broad spectrum of advances in the field. For the more advanced reader, chapters on hybridization with path-relinking and on parallel and continuous GRASP present these topics in a clear and concise fashion. Additionally, the book offers a comprehensive annotated bibliography of GRASP and combinatorial optimization. For the practitioner who needs to solve combinatorial optimization problems, the book provides a chapter with four case studies and implementable templates for all algorithms covered in the text. This book, with its excellent overview of GRASP, will appeal to researchers and practitioners of combinatorial optimization who need to find optimal or near-optimal solutions to hard combinatorial optimization problems.
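For readers who want a sense of GRASP's basic structure before opening the book, the sketch below shows a generic GRASP loop, i.e. repeated greedy randomized construction followed by local search, keeping the best solution found. It is an illustrative outline only, applied to a made-up toy knapsack instance; the instance data, the parameter ALPHA, and the helpers construct and local_search are assumptions for this sketch and are not taken from the book's templates.

# Illustrative GRASP sketch: greedy randomized construction + local search,
# applied to a toy 0/1 knapsack instance (all data below is made up).
import random

VALUES  = [10, 7, 12, 4, 9, 6, 11, 3]
WEIGHTS = [ 5, 4,  7, 2, 5, 3,  6, 1]
CAPACITY = 15
ALPHA = 0.3  # greediness/randomness trade-off for the restricted candidate list

def construct(rng):
    # Greedy randomized construction: repeatedly pick a random item from the
    # restricted candidate list (RCL) of the best-looking feasible items.
    solution, weight = set(), 0
    while True:
        candidates = [i for i in range(len(VALUES))
                      if i not in solution and weight + WEIGHTS[i] <= CAPACITY]
        if not candidates:
            return solution
        ratios = {i: VALUES[i] / WEIGHTS[i] for i in candidates}
        best, worst = max(ratios.values()), min(ratios.values())
        rcl = [i for i in candidates if ratios[i] >= best - ALPHA * (best - worst)]
        chosen = rng.choice(rcl)
        solution.add(chosen)
        weight += WEIGHTS[chosen]

def local_search(solution):
    # First-improvement swap neighbourhood: exchange an item in the solution
    # for one outside it whenever that increases value and stays feasible.
    improved = True
    while improved:
        improved = False
        weight = sum(WEIGHTS[i] for i in solution)
        for i in list(solution):
            for j in range(len(VALUES)):
                if (j not in solution
                        and weight - WEIGHTS[i] + WEIGHTS[j] <= CAPACITY
                        and VALUES[j] > VALUES[i]):
                    solution.remove(i)
                    solution.add(j)
                    improved = True
                    break
            if improved:
                break
    return solution

def grasp(iterations=50, seed=0):
    # Multi-start loop: construct, improve, keep the best solution seen so far.
    rng = random.Random(seed)
    best, best_value = set(), 0
    for _ in range(iterations):
        candidate = local_search(construct(rng))
        value = sum(VALUES[i] for i in candidate)
        if value > best_value:
            best, best_value = candidate, value
    return best, best_value

if __name__ == "__main__":
    print(grasp())  # prints the best subset of item indices and its total value

The multi-start structure (construction, then local search, then keep the best) is the part of this toy sketch that carries over to the applications covered in the book.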
Evolutionary algorithms are a class of randomized heuristics inspired by natural evolution. They are applied in many different contexts, in particular in optimization, and the analysis of such algorithms has seen tremendous advances in recent years. In this book, the author provides an introduction to the methods used to analyze evolutionary algorithms and other randomized search heuristics. He starts with an algorithmic and modular perspective and gives guidelines for the design of evolutionary algorithms. He then places the approach in the broader research context with a chapter on theoretical perspectives. By adopting a complexity-theoretical perspective, he derives general limitations for black-box optimization, yielding lower bounds on the performance of evolutionary algorithms, and then develops general methods for deriving upper and lower bounds step by step. This main part is followed by a chapter covering practical applications of these methods. The notational and mathematical basics are covered in an appendix, the results presented are derived in detail, and each chapter ends with detailed comments and pointers to further reading. The book is thus a useful reference for both graduate students and researchers engaged in the theoretical analysis of such algorithms.