The logic of information flow has applications in both computer science and natural language processing, and it is a growing area within mathematical and philosophical logic.
Information is a central topic in computer science, cognitive science and philosophy. In spite of its importance in the 'information age', there is no consensus on what information is, what makes it possible, and what it means for one medium to carry information about another. Drawing on ideas from mathematics, computer science and philosophy, this book addresses the definition and place of information in society. The authors, observing that information flow is possible only within a connected distribution system, provide a mathematically rigorous, philosophically sound foundation for a science of information. They illustrate their theory by applying it to a wide range of phenomena, from file transfer to DNA, from quantum mechanics to speech act theory.
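For readers who want a glimpse of the formal apparatus behind the "connected distribution system" mentioned above, the authors' framework (channel theory) is built around classifications and infomorphisms. The following is a minimal sketch of those notions in standard notation, offered as an illustration rather than a quotation from the book.

```latex
% Minimal sketch of classifications and infomorphisms (standard
% channel-theoretic notation; an illustration, not text from the book).
% A classification A pairs tokens with types via an "is of type" relation:
\[
  \mathbf{A} = \bigl(\mathrm{tok}(\mathbf{A}),\ \mathrm{typ}(\mathbf{A}),\ \models_{\mathbf{A}}\bigr).
\]
% An infomorphism f : A <-> B is a contravariant pair of maps,
% \hat{f} on types and \check{f} on tokens,
\[
  \hat{f} : \mathrm{typ}(\mathbf{A}) \to \mathrm{typ}(\mathbf{B}), \qquad
  \check{f} : \mathrm{tok}(\mathbf{B}) \to \mathrm{tok}(\mathbf{A}),
\]
% satisfying, for all b in tok(B) and alpha in typ(A),
\[
  \check{f}(b) \models_{\mathbf{A}} \alpha
  \quad\Longleftrightarrow\quad
  b \models_{\mathbf{B}} \hat{f}(\alpha).
\]
% A channel is a family of infomorphisms into a common "core" classification;
% tokens connected through a core token are what let one part of a
% distributed system carry information about another.
```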
Luciano Floridi presents an innovative approach to philosophy, conceived as conceptual design. He explores how we make, transform, refine, and improve the objects of our knowledge. His starting point is that reality provides the data, to be understood as constraining affordances, and we transform them into information, like semantic engines. Such transformation or repurposing is not equivalent to portraying, picturing, photographing, or photocopying anything. It is more like cooking: the dish does not represent the ingredients; it uses them to make something else, yet the reality of the dish and its properties hugely depend on the reality and the properties of the ingredients. Models of systems are not representations understood as pictures, but interpretations understood as data elaborations. Thus, he articulates and defends the thesis that knowledge is design and philosophy is the ultimate form of conceptual design. Although entirely independent of Floridi's previous books, The Philosophy of Information (OUP 2011) and The Ethics of Information (OUP 2013), The Logic of Information both complements the existing volumes and presents new work on the foundations of the philosophy of information.
A comprehensive examination of the interfaces of logic, computer science, and game theory, drawing on twenty years of research on logic and games. This book draws on ideas from philosophical logic, computational logic, multi-agent systems, and game theory to offer a comprehensive account of logic and games viewed in two complementary ways. It examines the logic of games: the development of sophisticated modern dynamic logics that model information flow, communication, and interactive structures in games. It also examines logic as games: the idea that logical activities of reasoning and many related tasks can be viewed in the form of games. In doing so, the book takes up the “intelligent interaction” of agents engaging in competitive or cooperative activities and examines the patterns of strategic behavior that arise. It develops modern logical systems that can analyze information-driven changes in players' knowledge and beliefs, and introduces the “Theory of Play” that emerges from the combination of logic and game theory. This results in a new view of logic itself as an interactive rational activity based on reasoning, perception, and communication that has particular relevance for games. Logic in Games, based on a course taught by the author at Stanford University, the University of Amsterdam, and elsewhere, can be used in advanced seminars and as a resource for researchers.
Excerpt from the Foreword: The stated aim of the book series "Capturing Intelligence" is to publish books on research from all disciplines dealing with and affecting the issue of understanding and reproducing intelligence in artificial systems. Of course, much of the work done in the past decades in this area has been of a highly technical nature, ranging from hardware design for robots and software design for intelligent agents to formal logic for reasoning. It is therefore very refreshing to see Information Flow and Knowledge Sharing. This is a courageous book indeed. It is not afraid to tackle the Big Issues: notions such as information, knowledge, information system, information flow, collaborative problem solving, and ontological reasoning. All of these notions are crucial to our understanding of intelligence and our building of intelligent artificial systems, but all too often these Big Issues are hidden behind the curtains while the technical topics take center stage. AI has a rich history of philosophical books that have chosen a non-standard structure and narrative. It is nice to see that the authors have succeeded in combining a non-standard approach to deep questions with a non-standard format, resulting in a highly interesting volume. (Frank van Harmelen, Series Editor)

Excerpt from the Introduction: Our interest is to promote, through a better and deeper understanding of the notions of information and knowledge, a better and deeper critical understanding of information technology as situated in the full range of human activities, assuming as a principle that this range of activities cannot be properly appreciated when it is reduced to the simplified means-end schema proposed by Technology. We invite the reader to build his or her own points of view about these notions, taking our propositions as a starting point for critical analysis and discussion. With that, we believe we are contributing to a better understanding of the impact of technology, and particularly of information technology, in everyday life. (Flavio Soares Correa da Silva, Jaume Agusti-Cullell)

- Bridges the gap between the technological and philosophical aspects of information technology
- Analyzes essential notions of IT such as information, knowledge, information system, information flow, collaborative problem solving, and ontological reasoning
Situation Theory and situation semantics are recent approaches to language and information, approaches first formulated by Jon Barwise and John Perry in Situations and Attitudes (1983). The present volume collects some of Barwise's papers written since then, those directly concerned with relations among logic, situation theory, and situation semantics. Several papers appear here for the first time.
This book is conceived as an introduction to the theories of syntactic and semantic information and to the theory of information flow. Syntactic information theory is concerned with the information contained in the very fact that some signal has a non-random structure. Semantic information theory is concerned with the meaning or information content of messages and the like. The theory of information flow is concerned with deriving some piece of information from another. The main part takes us to situation semantics as a foundation for modern approaches to information theory. We give a brief overview of the background theory and then explain the concepts of information, information architecture, and information flow from that perspective.
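To give the syntactic notion a concrete, if crude, reading, the following sketch (our own illustration, not an example from the book) uses Shannon entropy: a signal with non-random structure has an entropy well below the maximum its alphabet allows, and that redundancy is one way to quantify its syntactic information.

```python
# A minimal, illustrative sketch (our own example, not taken from the book):
# Shannon entropy of a signal's symbol distribution, in bits per symbol.
from collections import Counter
from math import log2

def entropy(signal: str) -> float:
    """Shannon entropy of the empirical symbol distribution of `signal`."""
    counts = Counter(signal)
    total = len(signal)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A repetitive, highly structured signal has low entropy relative to the
# maximum (log2 of the alphabet size); the gap (redundancy) reflects the
# non-random structure that syntactic information theory is about.
print(entropy("abababababababab"))  # 1.0 bit/symbol (structured)
print(entropy("aqzwsxedcrfvtgby"))  # 4.0 bits/symbol (all symbols distinct)
```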
Intelligence can be characterised both as the ability to absorb and process information and as the ability to reason. Humans and other animals have both of these abilities to a greater or lesser degree, but the search for artificial intelligence has been hampered by our inability to create a theory that covers both of these characteristics. In this provocative and ground-breaking book, Professor Keith Devlin argues that to obtain a deeper understanding of the nature of intelligence and knowledge acquisition, we must broaden our concept of logic. For these purposes, Devlin introduces the concept of the infon, a quantum of information, and merges it with situations, a mathematical construct generalising the notion of a set, developed by Barwise and Perry at Stanford University to study the meaning of natural language. He develops and describes the theory here in general and intuitive terms, and discusses its relevance to a variety of concerns such as artificial intelligence, cognition, natural language and communication.
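For a flavour of the notation this line of work uses, the following is a sketch in standard situation-theoretic notation, given as an illustration rather than a quotation from the book: an infon packages a relation, its arguments, and a polarity, and a situation may or may not support it.

```latex
% Standard situation-theoretic notation (illustrative sketch only).
% An infon bundles a relation R, arguments a_1, ..., a_n, and a polarity i
% (1 if R is asserted to hold of the arguments, 0 if it is asserted not to):
\[
  \sigma \;=\; \langle\!\langle\, R,\ a_1, \ldots, a_n;\ i \,\rangle\!\rangle,
  \qquad i \in \{0, 1\}.
\]
% A situation s is a limited part of the world; that s supports sigma,
% i.e. makes that item of information factual, is written
\[
  s \models \sigma .
\]
```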
This book is an exploration of current trends in logical theories of information flow across various fields, such as belief revision in computer science or dynamic semantics in linguistics. It provides one mathematical perspective encompassing all of these. This framework generates a new agenda of questions concerning dynamic inference and dynamic operators. The result is a mathematical theory of process models, simulations between these, and modal languages over them, which is developed in quite some detail. New results include theorems on expressive completeness, representation of styles of inference, and new kinds of decidable remodeling for standard logics. This theory is also confronted with practice in computer science, linguistics and philosophy.