This book constitutes the refereed proceedings of the 4th International Conference on Computability in Europe, CiE 2008, held in Athens, Greece, in June 2008. The 36 revised full papers presented together with 25 invited tutorials and lectures were carefully reviewed and selected from 108 submissions. Among them are papers from 6 special sessions on algorithms in the history of mathematics, formalising mathematics and extracting algorithms from proofs, higher-type recursion and applications, algorithmic game theory, quantum algorithms and complexity, and biology and computation.
This volume is dedicated to Dov Gabbay, who celebrated his 50th birthday in October 1995. Dov is one of the most outstanding and most productive researchers we have ever met. He has exerted a profound influence in major fields of logic, linguistics and computer science. His contributions in the areas of logic, language and reasoning are so numerous that a comprehensive survey would already fill half of this book. Instead of summarizing his work, we decided to let him speak for himself. Sitting in a car on the way to Amsterdam airport, he gave an interview to Jelle Gerbrandy and Anne-Marie Mineur. This recorded conversation, which is included here, gives a deep insight into his motivations and into his view of the world, the Almighty and, of course, the role of logic. In addition, this volume contains a partially annotated bibliography of his main papers and books. The length of the bibliography and the breadth of the topics covered there speak for themselves.
Process Algebra is a formal description technique for complex computer systems, especially those involving communicating, concurrently executing components. It is a subject that touches many topic areas of computer science and discrete mathematics, including system design notations, logic, concurrency theory, specification and verification, operational semantics, algorithms, complexity theory, and, of course, algebra. This Handbook documents the fate of process algebra from its inception in the late 1970s to the present. It is intended to serve as a reference source for researchers, students, and system designers and engineers interested in either the theory of process algebra or in learning what process algebra brings to the table as a formal system description and verification technique. The Handbook is divided into six parts spanning a total of 19 self-contained chapters. The organization is as follows. Part 1, consisting of four chapters, covers a broad swath of the basic theory of process algebra. Part 2 contains two chapters devoted to the sub-specialization of process algebra known as finite-state processes, while the three chapters of Part 3 look at infinite-state processes, in particular value-passing processes and mobile processes. Part 4, also three chapters in length, explores several extensions to process algebra, including real-time, probability and priority. The four chapters of Part 5 examine non-interleaving process algebras, while Part 6's three chapters address process-algebra tools and applications.
Model checking is a computer-assisted method for the analysis of dynamical systems that can be modeled by state-transition systems. Drawing from research traditions in mathematical logic, programming languages, hardware design, and theoretical computer science, model checking is now widely used for the verification of hardware and software in industry. The editors and authors of this handbook are among the world's leading researchers in this domain, and the 32 contributed chapters present a thorough view of the origin, theory, and application of model checking. In particular, the editors classify the advances in this domain and the chapters of the handbook in terms of two recurrent themes that have driven much of the research agenda: the algorithmic challenge, that is, designing model-checking algorithms that scale to real-life problems; and the modeling challenge, that is, extending the formalism beyond Kripke structures and temporal logic. The book will be valuable for researchers and graduate students engaged with the development of formal methods and verification tools.
This handbook volume covers fundamental topics of semantics in logic and computation. The chapters, some monographic in length, were written following years of coordination and follow a thematic point of view. The volume brings the reader up to the front line of research, and is indispensable to any serious worker in these areas.
This book constitutes the refereed proceedings of the Second International Conference on Computability in Europe, CiE 2006, held in Swansea, UK, in June/July 2006. The book presents 31 revised full papers together with 30 invited papers, including papers corresponding to 8 plenary talks and 6 special sessions on proofs and computation, computable analysis, challenges in complexity, foundations of programming, mathematical models of computers and hypercomputers, and the Gödel centenary: Gödel's legacy for computability.
ICGT 2002 was the first International Conference on Graph Transformation, following a series of six international workshops on graph grammars with applications in computer science, held in Bad Honnef (1978), Osnabrück (1982), Warrenton (1986), Bremen (1990), Williamsburg (1994), and Paderborn (1998). ICGT 2002 was held in Barcelona (Spain), October 7–12, 2002, under the auspices of the European Association for Theoretical Computer Science (EATCS), the European Association of Software Science and Technology (EASST), and the IFIP Working Group 1.3, Foundations of Systems Specification. The scope of the conference concerned graphical structures of various kinds (like graphs, diagrams, visual sentences and others) that are useful to describe complex structures and systems in a direct and intuitive way. These structures are often augmented by formalisms which add to the static description a further dimension, allowing for the modeling of the evolution of systems via all kinds of transformations of such graphical structures. The field of Graph Transformation is concerned with the theory, applications, and implementation issues of such formalisms. The theory is strongly related to areas such as graph theory and graph algorithms, formal language and parsing theory, the theory of concurrent and distributed systems, formal specification and verification, logic, and semantics.
This is a mathematics textbook with theorems and proofs. The choice of topics has been guided by the needs of computer science students. The method of semantic tableaux provides an elegant way to teach logic that is both theoretically sound and yet sufficiently elementary for undergraduates. In order to provide a balanced treatment of logic, tableaux are related to deductive proof systems. The book presents various logical systems and contains exercises. In addition, Prolog source code is available on an accompanying Web site. The author is an Associate Professor at the Department of Science Teaching, Weizmann Institute of Science.
This book provides a comprehensive state-of-the-art survey of conceptual modeling. It grew out of research papers presented at the 18th International Conference on Conceptual Modeling (ER '99) and arranged by the editors. The conference covers the whole spectrum of conceptual modeling as it relates to database and information systems design, offering complete coverage of data and process modeling, database technology, and database applications. The aim of the conference and of these proceedings is to present new insights related to each of these topics. This book contains both selected and invited papers. The 33 selected papers are organized in 11 sessions encompassing the major themes of the conference, especially:
- schema transformation, evolution, and integration
- temporal database design
- views and reuse in conceptual modeling
- advanced conceptual modeling
- business process modeling and workflows
- data warehouse design.
Besides the selected papers, 3 invited papers present the views of three keynote speakers, internationally known for their contributions to conceptual modeling and database research and for their active role in knowledge dissemination. Peter Chen presents the results of his ongoing research on the ER model, XML, and the Web. Georges Gardarin presents the first results of an ESPRIT project federating various data sources with XML and XML-QL. Finally, Matthias Jarke develops a way to capture and evaluate the experiences gained about process designs in so-called process data warehouses.