Proper treatment of structural behavior under severe loading - such as the performance of a high-rise building during an earthquake - relies heavily on the use of probability-based analysis and decision-making tools. Proper application of these tools is significantly enhanced by a thorough understanding of the underlying theoretical and computational foundations.
This review volume consists of a set of chapters written by leading scholars, scientists and researchers in the field of Randomness, many of them founders of their fields. It explores the connections of Randomness to other areas of scientific knowledge, especially its fruitful and strongly developed relationship to Computability, Recursion Theory and Complexity Theory, and also to areas such as Probability, Statistics, Information Theory, Biology, Physics, Quantum Mechanics, Learning Theory and Artificial Intelligence. The contributors cover these topics without neglecting important philosophical dimensions, sometimes going beyond the purely technical to formulate age-old questions relating to matters such as determinism and free will. The scope of Randomness Through Computation is novel: using a question-and-answer format, each contributor shares personal views and anecdotes about the reasons and motivations that led them to the study of Randomness, offering their visions from several distinctive vantage points. In summary, this is an opportunity to learn about the topic and its various angles from the leading thinkers in the field.
Computability and complexity theory are two central areas of research in theoretical computer science. This book provides a systematic, technical development of "algorithmic randomness" and complexity for scientists from diverse fields.
Monte Carlo simulation has become one of the most important tools in all fields of science. This book surveys the basic techniques and principles of the subject, as well as general techniques useful in more complicated models and in novel settings. The emphasis throughout is on practical methods that work well in current computing environments.
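As a minimal, hedged illustration of the kind of estimator such a survey covers (this sketch is not taken from the book), the Python snippet below estimates pi by uniform sampling in the unit square; the sample sizes are arbitrary, and the error shrinks roughly like one over the square root of the number of samples.

```python
import math
import random

def monte_carlo_pi(n_samples: int) -> float:
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that land inside the quarter circle."""
    inside = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

if __name__ == "__main__":
    for n in (1_000, 100_000, 1_000_000):
        est = monte_carlo_pi(n)
        print(f"n={n:>9}: estimate={est:.5f}, error={abs(est - math.pi):.5f}")
```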
Randomization and probabilistic techniques play an important role in modern computer science, with applications ranging from combinatorial optimization and machine learning to communication networks and secure protocols. This 2005 textbook is designed to accompany a one- or two-semester course for advanced undergraduates or beginning graduate students in computer science and applied mathematics. It gives an excellent introduction to the probabilistic techniques and paradigms used in the development of probabilistic algorithms and analyses. It assumes only an elementary background in discrete mathematics and gives a rigorous yet accessible treatment of the material, with numerous examples and applications. The first half of the book covers core material, including random sampling, expectations, Markov's inequality, Chebyshev's inequality, Chernoff bounds, the probabilistic method and Markov chains. The second half covers more advanced topics such as continuous probability, applications of limited independence, entropy, Markov chain Monte Carlo methods and balanced allocations. With its comprehensive selection of topics, along with many examples and exercises, this book is an indispensable teaching tool.
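For a concrete, hedged illustration of one bound named above (not an example from the textbook itself), the Python sketch below compares the empirical tail probability of the number of heads in n fair coin flips against the standard Chernoff bound exp(-mu*delta^2/3), valid for 0 < delta <= 1; the parameters n, delta and the number of trials are arbitrary choices.

```python
import math
import random

def empirical_tail(n: int, delta: float, trials: int = 10_000) -> float:
    """Empirical estimate of P(X >= (1 + delta) * mu), where X is the number
    of heads in n fair coin flips and mu = n / 2."""
    mu = n / 2
    threshold = (1 + delta) * mu
    hits = sum(
        1 for _ in range(trials)
        if sum(random.randint(0, 1) for _ in range(n)) >= threshold
    )
    return hits / trials

def chernoff_bound(n: int, delta: float) -> float:
    """Chernoff bound exp(-mu * delta^2 / 3) for 0 < delta <= 1."""
    mu = n / 2
    return math.exp(-mu * delta * delta / 3)

if __name__ == "__main__":
    n, delta = 200, 0.2
    print("empirical tail :", empirical_tail(n, delta))
    print("Chernoff bound :", chernoff_bound(n, delta))
```

The empirical tail is far below the bound here, which is typical: Chernoff bounds trade tightness for generality and ease of use in analyses.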
The interplay between computability and randomness has been an active area of research in recent years, reflected by ample funding in the USA, numerous workshops, and publications on the subject. The complexity and randomness aspects of a set of natural numbers are closely related. Traditionally, computability theory is concerned with the complexity aspect. However, computability-theoretic tools can also be used to introduce mathematical counterparts for the intuitive notion of randomness of a set. Recent research shows that, conversely, concepts and methods originating from randomness enrich computability theory. The book covers topics such as lowness and highness properties, Kolmogorov complexity, betting strategies and higher computability. Both the basics and recent research results are described, providing a very readable introduction to the exciting interface of computability and randomness for graduates and researchers in computability theory, theoretical computer science, and measure theory.
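Kolmogorov complexity itself is uncomputable, but a common classroom proxy (not drawn from the book) approximates the intuition with a general-purpose compressor: highly structured strings compress well, while random-looking data does not. The Python sketch below uses zlib for this rough comparison.

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed representation: a crude stand-in for
    the (uncomputable) Kolmogorov complexity of the data."""
    return len(zlib.compress(data, 9))

if __name__ == "__main__":
    n = 10_000
    regular = b"01" * (n // 2)   # highly structured string of length n
    noise = os.urandom(n)        # incompressible with high probability
    print("structured string:", compressed_size(regular), "bytes")
    print("random bytes     :", compressed_size(noise), "bytes")
```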
The last two decades have seen a wave of exciting new developments in the theory of algorithmic randomness and its applications to other areas of mathematics. This volume surveys much of the recent work that has not been included in published volumes until now. It contains a range of articles on algorithmic randomness and its interactions with closely related topics such as computability theory and computational complexity, as well as wider applications in areas of mathematics including analysis, probability, and ergodic theory. In addition to being an indispensable reference for researchers in algorithmic randomness, the unified view of the theory presented here makes this an excellent entry point for graduate students and other newcomers to the field.
Recent findings in computer science, discrete mathematics, formal logic and metamathematics have opened up a royal road for the investigation of undecidability and randomness in physics. A translation of these formal concepts yields a fresh look at diverse features of physical modelling, such as quantum complementarity and the measurement problem, and also raises questions about the necessity of assuming continua. Conversely, any computer may be perceived as a physical system, and not only in the immediate sense of the physical properties of its hardware. Computers are a medium to virtual realities, and the foreseeable importance of such virtual realities stimulates the investigation of an "inner description", a "virtual physics", of these universes of computation. Indeed, one may consider our own universe as just one particular realisation of an enormous number of virtual realities, most of them awaiting discovery. One motive of this book is the recognition that what is often referred to as "randomness" in physics might actually be a signature of undecidability for systems whose evolution is computable on a step-by-step basis. To give a flavour of the type of questions envisaged: consider an arbitrary algorithmic system which is computable on a step-by-step basis. Then it is in general impossible to specify a second algorithmic procedure, including the system itself, which, by experimental input-output analysis, is capable of finding the deterministic law of the first system. But even if such a law is specified beforehand, it is in general impossible to predict the system's behaviour in the "distant future"; in other words, no "speedup" or "computational shortcut" is available. In this approach, classical paradoxes can be formally translated into no-go theorems concerning intrinsic physical perception. It is suggested that complementarity can be modelled by experiments on finite automata, where measurement of one observable of the automaton destroys the possibility of measuring another observable of the same automaton, and vice versa. Besides undecidability, a great part of the book is dedicated to a formal definition of randomness and entropy measures based on algorithmic information theory.
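To make the "no computational shortcut" intuition concrete, the sketch below (a standard illustration, not taken from the book) simulates the elementary cellular automaton Rule 30: a deterministic system that is computable step by step, but for which no general closed-form prediction of its distant-future state is known; the grid size and number of steps are arbitrary.

```python
def rule30_step(cells):
    """One synchronous update of elementary cellular automaton Rule 30
    on a cyclic array of 0/1 cells: new_center = left XOR (center OR right)."""
    n = len(cells)
    return [
        cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
        for i in range(n)
    ]

if __name__ == "__main__":
    # Start from a single live cell; the only known general way to obtain the
    # state after t steps is to simulate all t steps in order.
    state = [0] * 31
    state[15] = 1
    for _ in range(12):
        print("".join("#" if c else "." for c in state))
        state = rule30_step(state)
```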
From the ancients' first readings of the innards of birds to your neighbor's last bout with the state lottery, humankind has put itself into the hands of chance. Today life itself may be at stake when probability comes into play--in the chance of a false negative in a medical test, in the reliability of DNA findings as legal evidence, or in the likelihood of passing on a deadly congenital disease--yet as few people as ever understand the odds. This book is aimed at the trouble with trying to learn about probability. A story of the misconceptions and difficulties civilization overcame in progressing toward probabilistic thinking, Randomness is also a skillful account of what makes the science of probability so daunting in our own day. To acquire a (correct) intuition of chance is not easy to begin with, and moving from an intuitive sense to a formal notion of probability presents further problems. Author Deborah Bennett traces the path this process takes in an individual trying to come to grips with concepts of uncertainty and fairness, and also charts the parallel path by which societies have developed ideas about chance. Why, from ancient to modern times, have people resorted to chance in making decisions? Is a decision made by random choice fair? What role has gambling played in our understanding of chance? Why do some individuals and societies refuse to accept randomness at all? If understanding randomness is so important to probabilistic thinking, why do the experts disagree about what it really is? And why are our intuitions about chance almost always dead wrong? Anyone who has puzzled over a probability conundrum is struck by the paradoxes and counterintuitive results that occur at a relatively simple level. Why this should be, and how it has been the case through the ages, for bumblers and brilliant mathematicians alike, is the entertaining and enlightening lesson of Randomness.