This book, written by one of philosophy's pre-eminent logicians, argues that many of the basic assumptions common to logic, philosophy of mathematics and metaphysics are in need of change. It is therefore a book of critical importance to logical theory. Jaakko Hintikka proposes a new basic first-order logic and uses it to explore the foundations of mathematics. This new logic enables logicians to express on the first-order level such concepts as equicardinality, infinity, and truth in the same language. The famous impossibility results by Gödel and Tarski that have dominated the field for the last sixty years turn out to be much less significant than has been thought. All of ordinary mathematics can in principle be done on this first-order level, thus dispensing with the existence of sets and other higher-order entities.
The aim of this handbook is to create, for the first time, a systematic account of the field of spatial logic. The book comprises a general introduction, followed by fourteen chapters by invited authors. Each chapter provides a self-contained overview of its topic, describing the principal results obtained to date, explaining the methods used to obtain them, and listing the most important open problems. Jointly, these contributions constitute a comprehensive survey of this rapidly expanding subject.
This book constitutes the refereed proceedings of the Third International Conference on Computability in Europe, CiE 2007, held in Siena, Italy, in June 2007. The 50 revised full papers presented together with 36 invited papers were carefully reviewed and selected from 167 submissions.
This book constitutes the refereed proceedings of the 13th International Conference on Algorithms and Complexity, CIAC 2023, which took place in Larnaca, Cyprus, during June 13–16, 2023. The 25 full papers included in this book were carefully reviewed and selected from 49 submissions. They cover all important areas of research on algorithms and complexity, such as algorithm design and analysis; sequential, parallel and distributed algorithms; data structures; computational and structural complexity; lower bounds and limitations of algorithms; randomized and approximation algorithms; parameterized algorithms and parameterized complexity classes; smoothed analysis of algorithms; alternatives to the worst-case analysis of algorithms (e.g., algorithms with predictions); on-line computation and competitive analysis; streaming algorithms; quantum algorithms and complexity; algorithms in algebra, geometry, number theory and combinatorics; computational geometry; algorithmic game theory and mechanism design; algorithmic economics (including auctions and contests); computational learning theory; computational biology and bioinformatics; algorithmic issues in communication networks; algorithms for discrete optimization (including convex optimization); and algorithm engineering.
A comprehensive treatment of systems and software testing using state-of-the-art methods and tools. This book provides valuable insights into state-of-the-art software testing methods and explains, with examples, the statistical and analytic methods used in this field. Numerous examples show how these methods apply to real-world problems. Leading authorities in applied statistics, computer science, and software engineering present state-of-the-art methods addressing challenges faced by practitioners and researchers involved in system and software testing. Methods include machine learning, Bayesian methods, graphical models, experimental design, generalized regression, and reliability modeling. Analytic Methods in Systems and Software Testing presents its comprehensive collection of methods in four parts: Part I: Testing Concepts and Methods; Part II: Statistical Models; Part III: Testing Infrastructures; and Part IV: Testing Applications. It maintains a focus on analytic methods while offering a contextual landscape of modern engineering, introducing the related statistical and probabilistic models used in this domain. This makes the book an incredibly useful tool, offering interesting insights on challenges in the field for researchers and practitioners alike. The book compiles cutting-edge methods and examples of analytical approaches to systems and software testing from leading authorities in applied statistics, computer science, and software engineering; combines methods and examples focused on the analytic aspects of systems and software testing; covers logistic regression, machine learning, Bayesian methods, graphical models, experimental design, generalized regression, and reliability models; is written by leading researchers and practitioners in the field, from diverse backgrounds including research, business, government, and consulting; and stimulates research at both the theoretical and the practical level. Analytic Methods in Systems and Software Testing is an excellent advanced reference directed toward industrial and academic readers whose work in systems and software development approaches or surpasses existing frontiers of testing and validation procedures. It will also be valuable to post-graduate students in computer science and mathematics.
Nelson Goodman's acceptance and critique of certain methods and tenets of positivism, his defence of nominalism and phenomenalism, his formulation of a new riddle of induction, his work on notational systems, and his analysis of the arts place him at the forefront of the history and development of American philosophy in the twentieth century. However, outside of America, Goodman has been a rather neglected figure. In this first book-length introduction to his work, Cohnitz and Rossberg assess Goodman's lasting contribution to philosophy and show that although some of his views may now be considered unfashionable or unorthodox, there is much in Goodman's work that is of significance today. The book begins with the "grue" paradox, which exemplifies Goodman's way of dealing with philosophical problems. After this, the unifying features of Goodman's philosophy are presented - his constructivism, conventionalism and relativism - followed by a discussion of his central work, The Structure of Appearance, and its significance in the analytic tradition. The following chapters present the technical apparatus that underlies his philosophy, his mereology and semiotics, which provide the background for a discussion of Goodman's aesthetics. The final chapter examines in greater depth the presuppositions underlying his philosophy.
This book constitutes the refereed proceedings of the First Annual International Frontiers of Algorithmics Workshop, FAW 2007, held in Lanzhou, China, in August 2007. Topics covered in the papers include bioinformatics, discrete structures, geometric information processing and communication, games and incentive analysis, graph algorithms, internet algorithms and protocols, and algorithms in medical applications.
Parameterized complexity is currently a thriving field in complexity theory and algorithm design. A significant part of the success of the field can be attributed to Michael R. Fellows. This Festschrift has been published in honor of Mike Fellows on the occasion of his 60th birthday. It contains 20 papers that showcase the important scientific contributions of this remarkable man, describe the history of the field of parameterized complexity, and reflect on other parts of Mike Fellows’s unique and broad range of interests, including his work on the popularization of discrete mathematics for young children. The volume contains several surveys that introduce the reader to the field of parameterized complexity and discuss important notions, results, and developments in this field.