This fascinating popular science journey explores key concepts in information theory in terms of Conway's "Game of Life" program. The author explains the application of natural law to a random system and demonstrates the necessity of limits. Other topics include the limits of knowledge, paradox of complexity, Maxwell's demon, Big Bang theory, and much more. 1985 edition.
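The book's central object is easy to experiment with. Below is a minimal sketch of one Game of Life generation under the standard rules (a live cell survives with two or three live neighbors; a dead cell is born with exactly three); it is an editor's illustration, not code from the book.

```python
from collections import Counter

def step(live):
    """Advance a set of live (x, y) cells by one Game of Life generation."""
    # Count how many live neighbors each candidate cell has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Standard rules: birth on exactly 3 live neighbors, survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "glider" reproduces its own shape one cell away every four generations,
# the kind of emergent order from simple rules that the book explores.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))  # the same glider, shifted one cell diagonally
```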
A masterful work of science writing that’s "both a fascinating biography of von Neumann, the Hungarian exile whose mathematical theories were building blocks for the A-bomb and the digital computer, and a brilliant social history of game theory and its role in the Cold War and nuclear arms race" (San Francisco Chronicle). Should you watch public television without pledging? Exceed the posted speed limit? Hop a subway turnstile without paying? These questions illustrate the so-called "prisoner's dilemma," a social puzzle that we all face every day. Though the answers may seem simple, their profound implications make the prisoner's dilemma one of the great unifying concepts of science. Watching players bluff in a poker game inspired John von Neumann—father of the modern computer and one of the sharpest minds of the century—to construct game theory, a mathematical study of conflict and deception. Game theory was readily embraced at the RAND Corporation, the archetypal think tank charged with formulating military strategy for the atomic age, and in 1950 two RAND scientists made a momentous discovery. Called the "prisoner's dilemma," it is a disturbing and mind-bending game where two or more people may betray the common good for individual gain. Introduced shortly after the Soviet Union acquired the atomic bomb, the prisoner's dilemma quickly became a popular allegory of the nuclear arms race. Intellectuals such as von Neumann and Bertrand Russell joined military and political leaders in rallying to the "preventive war" movement, which advocated a nuclear first strike against the Soviet Union. Though the Truman administration rejected preventive war, the United States entered into an arms race with the Soviets, and game theory developed into a controversial tool of public policy—alternately accused of justifying arms races and touted as the only hope of preventing them. Prisoner's Dilemma is the incisive story of a revolutionary idea that has been hailed as a landmark of twentieth-century thought.
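The dilemma's logic fits in a few lines. The sketch below uses the conventional textbook payoff values popularized by Axelrod (T=5, R=3, P=1, S=0), which are illustrative assumptions, not figures taken from the book.

```python
# One-shot prisoner's dilemma with the conventional payoffs:
# T=5 (temptation), R=3 (reward), P=1 (punishment), S=0 (sucker).
# These values are the standard textbook choice, not from the book.

PAYOFF = {  # (my move, their move) -> my payoff
    ("defect", "cooperate"): 5,
    ("cooperate", "cooperate"): 3,
    ("defect", "defect"): 1,
    ("cooperate", "defect"): 0,
}

for theirs in ("cooperate", "defect"):
    best = max(("cooperate", "defect"),
               key=lambda mine: PAYOFF[(mine, theirs)])
    print(f"If they {theirs}, my best reply is to {best}.")

# Defection dominates whatever the other player does, yet mutual
# defection (1, 1) leaves both worse off than mutual cooperation (3, 3).
```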
Paperback version of the 2002 paper published in the journal Progress in Complexity, Information, and Design (PCID). ABSTRACT: Inasmuch as science is observational or perceptual in nature, the goal of providing a scientific model and mechanism for the evolution of complex systems ultimately requires a supporting theory of reality of which perception itself is the model (or theory-to-universe mapping). Where information is the abstract currency of perception, such a theory must incorporate the theory of information while extending the information concept to incorporate reflexive self-processing in order to achieve an intrinsic (self-contained) description of reality. This extension is associated with a limiting formulation of model theory identifying mental and physical reality, resulting in a reflexively self-generating, self-modeling theory of reality identical to its universe on the syntactic level. By the nature of its derivation, this theory, the Cognitive-Theoretic Model of the Universe or CTMU, can be regarded as a supertautological reality-theoretic extension of logic. Uniting the theory of reality with an advanced form of computational language theory, the CTMU describes reality as a Self-Configuring Self-Processing Language or SCSPL, a reflexive intrinsic language characterized not only by self-reference and recursive self-definition, but also by full self-configuration and self-execution (reflexive read-write functionality). SCSPL reality embodies a dual-aspect monism consisting of infocognition, self-transducing information residing in self-recognizing SCSPL elements called syntactic operators. The CTMU identifies itself with the structure of these operators and thus with the distributive syntax of its self-modeling SCSPL universe, including the reflexive grammar by which the universe refines itself from unbound telesis or UBT, a primordial realm of infocognitive potential free of informational constraint. Under the guidance of a limiting (intrinsic) form of anthropic principle called the Telic Principle, SCSPL evolves by telic recursion, jointly configuring syntax and state while maximizing a generalized self-selection parameter and adjusting on the fly to freely-changing internal conditions. SCSPL relates space, time, and object by means of conspansive duality and conspansion, an SCSPL-grammatical process featuring an alternation between dual phases of existence associated with design and actualization and related to the familiar wave-particle duality of quantum mechanics. By distributing the design phase of reality over the actualization phase, conspansive spacetime also provides a distributed mechanism for Intelligent Design, adjoining to the restrictive principle of natural selection a basic means of generating information and complexity. Addressing physical evolution on not only the biological but also the cosmic level, the CTMU addresses the most evident deficiencies and paradoxes associated with conventional discrete and continuum models of reality, including temporal directionality and accelerating cosmic expansion, while preserving virtually all of the major benefits of current scientific and mathematical paradigms.
In Powers of Ten by Charles and Ray Eames, a view of two people enjoying a picnic zooms up and away to show their surroundings, moving progressively farther into space, then zooms back in for a close-up of the hand of the picnicker, travelling deep into the microscopic realm. This is one of the most iconic examples of the “cosmic zoom,” a trope that has influenced countless media forms over the past seventy years. Horton uses the cosmic zoom as a starting point to develop a cross-disciplinary theory of scale as mediated difference. He considers the origins of our notions of scale, how scalar mediation functions differently in analog and digital modes, and how cosmic zoom media has influenced scientific and popular views of the world. Analyzing literature, film, digital media, and database history, Horton establishes a much-needed framework for thinking about scale across multiple domains and disciplines.
* The first exposition on super-recursive algorithms, systematizing all main classes and providing an accessible, focused examination of the theory and its ramifications (a toy sketch of the underlying limit-computation idea follows below)
* Demonstrates how these algorithms are more appropriate as mathematical models for modern computers and how they present a better framework for computing methods
* Develops a new practically-oriented perspective on the theory of algorithms, computation, and automata as a whole
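As a toy illustration of one idea behind super-recursive algorithms (inductive Turing machines, whose result is the value their output stabilizes on rather than a value announced at a halt), consider the sketch below. It is an editor's illustration under simplifying assumptions, not code or an example from the book; with this particular enumeration the question is of course decidable by ordinary means.

```python
# Toy limit computation: the "answer" is whatever the stream of
# guesses eventually stabilizes on, even though the process never
# halts to announce it. (Illustrative only.)

def squares():
    """Enumerate 0, 1, 4, 9, 16, ... forever."""
    n = 0
    while True:
        yield n * n
        n += 1

def guesses(target, steps):
    """Emit a guess after each enumeration step; the limit of the
    guesses answers 'does target appear in the stream?'."""
    found = False
    gen = squares()
    for _ in range(steps):
        if next(gen) == target:
            found = True
        yield found

print(list(guesses(49, 10)))  # stabilizes on True once 49 is enumerated
print(list(guesses(50, 10)))  # stays False, but a classical machine
                              # could never be sure when to stop looking
```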
This work presents a series of dramatic discoveries never before made public. Starting from a collection of simple computer experiments, illustrated in the book by striking computer graphics, Wolfram shows how their unexpected results force a whole new way of looking at the operation of our universe. Wolfram uses his approach to tackle a remarkable array of fundamental problems in science: the origin of the Second Law of thermodynamics, the development of complexity in biology, the computational limitations of mathematics, the possibility of a truly fundamental theory of physics, and the interplay between free will and determinism.
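The "simple computer experiments" at the book's core include one-dimensional cellular automata. Below is a minimal sketch of rule 30, one of Wolfram's best-known examples; it is an editor's reconstruction of the standard construction, not code from the book.

```python
# Elementary cellular automaton, here rule 30, one of the simple
# programs studied in the book (editor's reconstruction).

def evolve(rule, cells, generations):
    """Yield successive rows of a 1-D binary cellular automaton."""
    # Bit k of `rule` (0..255) gives the new cell value when the
    # neighborhood (left, center, right) reads as the 3-bit number k.
    table = [(rule >> k) & 1 for k in range(8)]
    row = cells[:]
    for _ in range(generations):
        yield row
        # Cyclic boundary: indices wrap around at both ends.
        row = [table[(row[i - 1] << 2) | (row[i] << 1) | row[(i + 1) % len(row)]]
               for i in range(len(row))]

width = 63
start = [0] * width
start[width // 2] = 1  # a single black cell in the middle
for row in evolve(30, start, 16):
    print("".join(".#"[c] for c in row))  # the familiar chaotic triangle
```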
From the author of Are You Smart Enough to Work at Google?, a fascinating look at how an equation that foretells the future is transforming everything we know about life, business, and the universe. In the 18th century, the British minister and mathematician Thomas Bayes devised a theorem that allowed him to assign probabilities to events that had never happened before. It languished in obscurity for centuries until computers came along and made it easy to crunch the numbers. Now, as the foundation of big data, Bayes' formula has become a linchpin of the digital economy. But here's where things get really interesting: Bayes' theorem can also be used to lay odds on the existence of extraterrestrial intelligence; on whether we live in a Matrix-like counterfeit of reality; on the "many worlds" interpretation of quantum theory being correct; and on the biggest question of all: how long will humanity survive? The Doomsday Calculation tells how Silicon Valley's profitable formula became a controversial pivot of contemporary thought. Drawing on interviews with thought leaders around the globe, it's the story of a group of intellectual mavericks who are challenging what we thought we knew about our place in the universe. The Doomsday Calculation is compelling reading for anyone interested in our culture and its future.
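Bayes' theorem itself is one line: P(H | E) = P(E | H) · P(H) / P(E). The sketch below applies it to a standard textbook example (a 99%-accurate test for a condition with a 1% base rate); the numbers are illustrative assumptions, not an example taken from the book.

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E),
# applied to a standard textbook example (not from the book):
# a test 99% sensitive and 99% specific for a condition
# with a 1% base rate.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Probability of hypothesis H after observing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

p = posterior(prior=0.01, p_e_given_h=0.99, p_e_given_not_h=0.01)
print(f"P(condition | positive test) = {p:.3f}")  # 0.500, not 0.99
```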
This open access book constitutes the proceedings of the 24th International Conference on Foundations of Software Science and Computation Structures, FOSSACS 2021, held from March 27 to April 1, 2021, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2021. The conference was planned to take place in Luxembourg but was moved to an online format due to the COVID-19 pandemic. The 28 regular papers presented in this volume were carefully reviewed and selected from 88 submissions. They deal with research on theories and methods to support the analysis, integration, synthesis, transformation, and verification of programs and software systems.
This is an expansion of the author's 1991 work, which investigates the implications of Gödel's writings on Einstein's theory of relativity as they relate to fundamental questions about the nature of time and the possibility of time travel.