The interaction paradigm is a new conceptualization of computational phenomena that emphasizes interaction over algorithms, reflecting the shift in technology from mainframe number crunching to distributed intelligent networks with graphical user interfaces. The book is arranged in four sections: "Introduction," comprising three chapters that explore and summarize the fundamentals of interactive computation; "Theory," with six chapters, each discussing a specific aspect of interaction; "Applications," five chapters showing how this principle is applied in subdisciplines of computer science; and "New Directions," presenting four multidisciplinary applications. The book challenges traditional Turing machine-based answers to fundamental questions of problem solving and the scope of computation.
This book presents problems and methodologies related to the syntax, semantics, and ambiguities of visual languages. It defines and formalizes visual languages for interactive computing, as well as the interpretation of visual notations.
The book outlines selected projects conducted under the supervision of the author. It also discusses significant relations between Interactive Granular Computing (IGrC) and numerous dynamically developing scientific domains worldwide, along with features characteristic of the author's approach to IGrC. The results presented continue and elaborate various aspects of Wisdom Technology, initiated and developed in cooperation with Professor Andrzej Skowron. Based on the empirical findings from these projects, the author explores the following areas: (a) understanding the causes of the theory and practice gap problem (TPGP) in complex systems engineering (CSE); (b) generalizing computing models of complex adaptive systems (CAS), in particular natural computing models, by constructing an interactive granular computing (IGrC) model of networks of interrelated, interacting complex granules (c-granules) belonging to a single agent and/or to a group of agents; and (c) developing methodologies based on the IGrC model to minimize the negative consequences of the TPGP. The book introduces approaches to these issues using the proposed IGrC model. In particular, the IGrC model refers to the key mechanisms used to control the processes related to the implementation of CSE projects. One of the main aims was to develop a mechanism of IGrC control over computations that model a project's implementation processes, so as to maximize the chances of success while minimizing the emerging risks. In this regard, IGrC control is usually performed by means of properly selected project principles, enforced among project participants. These principles constitute examples of c-granules, expressed by complex vague concepts (themselves represented by c-granules). The c-granules evolve over time; in particular, the meaning of the concepts is also subject to change. This methodology is illustrated using project principles applied by the author during the implementation of the POLTAX, AlgoTradix, Merix, and Excavio projects outlined in the book.
This book constitutes the refereed proceedings of the 18th International Symposium Fundamentals of Computation Theory, FCT 2011, held in Oslo, Norway, in August 2011. The 28 revised full papers presented were carefully reviewed and selected from 78 submissions. FCT 2011 focused on algorithms, formal methods, and emerging fields, such as ad hoc, dynamic and evolving systems; algorithmic game theory; computational biology; foundations of cloud computing and ubiquitous systems; and quantum computation.
The book reports on the latest theories of artificial neural networks, with a special emphasis on bio-neuroinformatics methods. It includes twenty-three papers selected from among the best contributions on bio-neuroinformatics-related issues presented at the International Conference on Artificial Neural Networks (ICANN 2013), held in Sofia, Bulgaria, on September 10–13, 2013. The book covers a broad range of topics concerning the theory and applications of artificial neural networks, including recurrent neural networks, super-Turing computation and reservoir computing, double-layer vector perceptrons, nonnegative matrix factorization, bio-inspired models of cell communities, Gestalt laws, the embodied theory of language understanding, saccadic gaze shifts and memory formation, and new training algorithms for Deep Boltzmann Machines, as well as dynamic neural networks and kernel machines. It also reports on new approaches to reinforcement learning, optimal control of discrete time-delay systems, new algorithms for prototype selection, and group structure discovery. Moreover, the book discusses one-class support vector machines for pattern recognition, handwritten digit recognition, time series forecasting and classification, and anomaly identification in data analytics and automated data analysis. By presenting the state of the art and discussing the current challenges in the fields of artificial neural networks, bioinformatics, and neuroinformatics, the book is intended to promote the implementation of new methods and the improvement of existing ones, and to support advanced students, researchers, and professionals in their daily efforts to identify, understand, and solve a number of open questions in these fields.
This book constitutes the refereed proceedings of the first International Conference on Computability in Europe, CiE 2005, held in Amsterdam, The Netherlands, in June 2005. The 68 revised full papers presented were carefully reviewed and selected from 144 submissions. Among them are papers corresponding to two tutorials, six plenary talks, and six special sessions that bring together mathematical logic and computer science while offering methodological foundations for models of computation. The papers address many aspects of computability theory, with a special focus on new computational paradigms. These include, first of all, connections between computation and physical systems (e.g., quantum and analog computation, neural nets, molecular computation), but also new perspectives on models of computation arising from basic research in mathematical logic and theoretical computer science.
From the winner of the Turing Award and the Abel Prize, an introduction to computational complexity theory, its connections and interactions with mathematics, and its central role in the natural and social sciences, technology, and philosophy.

Mathematics and Computation provides a broad, conceptual overview of computational complexity theory, the mathematical study of efficient computation. With important practical applications to computer science and industry, computational complexity theory has evolved into a highly interdisciplinary field, with strong links to most mathematical areas and to a growing number of scientific endeavors. Avi Wigderson takes a sweeping survey of complexity theory, emphasizing the field's insights and challenges. He explains the ideas and motivations leading to key models, notions, and results. In particular, he looks at algorithms and complexity, computations and proofs, randomness and interaction, quantum and arithmetic computation, and cryptography and learning, all as parts of a cohesive whole with numerous cross-influences. Wigderson illustrates the immense breadth of the field, its beauty and richness, and its diverse and growing interactions with other areas of mathematics. He ends with a comprehensive look at the theory of computation, its methodology and aspirations, and the unique and fundamental ways in which it has shaped and will further shape science, technology, and society. For further reading, an extensive bibliography is provided for all topics covered. Mathematics and Computation is useful for undergraduate and graduate students in mathematics, computer science, and related fields, as well as researchers and teachers in these fields. Many parts require little background and serve as an invitation to newcomers seeking an introduction to the theory of computation.

- Comprehensive coverage of computational complexity theory, and beyond
- High-level, intuitive exposition that brings conceptual clarity to this central and dynamic scientific discipline
- Historical accounts of the evolution and motivations of central concepts and models
- A broad view of the theory of computation's influence on science, technology, and society
- An extensive bibliography
This book constitutes the refereed proceedings of the 33rd Conference on Current Trends in Theory and Practice of Computer Science, SOFSEM 2007, held in Harrachov, Czech Republic, in January 2007. The 69 revised full papers, presented together with 11 invited contributions, were carefully reviewed and selected from 283 submissions. The papers were organized in four topical tracks.
This volume addresses the emerging area of human computation. The chapters, written by leading international researchers, explore existing and future opportunities to combine the respective strengths of both humans and machines in order to create powerful problem-solving capabilities. The book bridges scientific communities, capturing and integrating the unique perspective and achievements of each. It coalesces contributions from industry and across related disciplines in order to motivate, define, and anticipate the future of this exciting new frontier in science and cultural evolution. Readers can expect to find valuable contributions covering Foundations; Application Domains; Techniques and Modalities; Infrastructure and Architecture; Algorithms; Participation; Analysis; Policy and Security; and the Impact of Human Computation. Researchers and professionals will find the Handbook of Human Computation a valuable reference tool. The breadth of content also provides a thorough foundation for students of the field.