The four-volume set LNAI 6881-LNAI 6884 constitutes the refereed proceedings of the 15th International Conference on Knowledge-Based Intelligent Information and Engineering Systems, KES 2011, held in Kaiserslautern, Germany, in September 2011. A total of 244 high-quality papers were carefully reviewed and selected from numerous submissions for the four volumes. The 46 papers of Part 4 are organized in topical sections on human activity support in the knowledge society, knowledge-based interface systems, model-based computing for innovative engineering, document analysis and knowledge science, immunity-based systems, natural language visualisation, and advances in theory and application of hybrid intelligent systems.
Findings from research on false memory have major implications for a number of fields central to human welfare, such as medicine and law. Although many important conclusions have been reached after a decade or so of intensive research, the majority of them are not well known outside the immediate field. To make this research accessible to a much wider audience, The Science of False Memory has been written to require little or no background knowledge of the theory and techniques used in memory research. Brainerd and Reyna introduce the volume by considering the progenitors of the modern science of false memory, and by noting the remarkable degree to which core themes of contemporary research were anticipated by historical figures such as Binet, Piaget, and Bartlett. They continue with an account of the varied methods that have been used to study false memory both inside and outside of the laboratory. The first part of the volume focuses on the basic science of false memory, revolving around three topics: old and new theoretical ideas that have been used to explain false memory and make predictions about it; research findings and predictions about false memory in normal adults; and research findings and predictions about age-related changes in false memory between early childhood and adulthood. Throughout Part I, Brainerd and Reyna emphasize how current opponent-processes conceptions of false memory act as a unifying influence by integrating predictions and data across disparate forms of false memory. The second part focuses on the applied science of false memory, revolving around four topics: the falsifiability of witnesses' and suspects' memories of crimes, including false confessions by suspects; the falsifiability of eyewitness identifications of suspects; false-memory reports in investigative interviews of child victims and witnesses, particularly in connection with sexual-abuse crimes; and false memory in psychotherapy, including recovered memories of childhood abuse, multiple-personality disorders, and recovered memories of previous lives. Although Part II is concerned with applied research, Brainerd and Reyna continue to emphasize the unifying influence of opponent-processes conceptions of false memory. The third part focuses on emerging trends, revolving around three expanding areas of false-memory research: mathematical models, aging effects, and cognitive neuroscience. The Science of False Memory will be an invaluable resource for professional researchers, practitioners, and students in the many fields for which false-memory research has implications, including child-protective services, clinical psychology, law, criminal justice, elementary and secondary education, general medicine, journalism, and psychiatry.
Winner of the 2016 De Groot Prize from the International Society for Bayesian Analysis. Now in its third edition, this classic book is widely considered the leading text on Bayesian methods, lauded for its accessible, practical approach to analyzing data and solving research problems. Bayesian Data Analysis, Third Edition continues to take an applied approach to data analysis.
Biometrics, the science of using physical traits to identify individuals, is playing an increasing role in our security-conscious society and across the globe. Biometric authentication, or bioauthentication, systems are being used to secure everything from amusement parks to bank accounts to military installations. Yet developments in this field have not been matched by an equivalent improvement in the statistical methods for evaluating these systems. To address this need, this unique text/reference provides a basic statistical methodology for practitioners and testers of bioauthentication devices, supplying a set of rigorous statistical methods for evaluating biometric authentication systems. This framework of methods can be extended and generalized for a wide range of applications and tests. This is the first single resource on statistical methods for estimating and comparing the performance of biometric authentication systems. The book focuses on six common performance metrics; for each metric, statistical methods are derived for a single system, incorporating confidence intervals, hypothesis tests, sample size calculations, power calculations, and prediction intervals. These methods are also extended to allow for the statistical comparison and evaluation of multiple systems for both independent and paired data. Topics and features:
* Provides a statistical methodology for the most common biometric performance metrics: failure to enroll (FTE), failure to acquire (FTA), false non-match rate (FNMR), false match rate (FMR), and receiver operating characteristic (ROC) curves
* Presents methods for the comparison of two or more biometric performance metrics
* Introduces a new bootstrap methodology for FMR and ROC curve estimation
* Supplies more than 120 examples, using publicly available biometric data where possible
* Discusses the addition of prediction intervals to the bioauthentication statistical toolset
* Describes sample-size and power calculations for FTE, FTA, FNMR, and FMR
Researchers, managers, and decision makers needing to compare biometric systems across a variety of metrics will find in this reference an invaluable set of statistical tools. The book is written for an upper-level undergraduate or master's-level audience with a quantitative background; readers are also expected to have an understanding of the topics covered in a typical undergraduate statistics course. Dr. Michael E. Schuckers is Associate Professor of Statistics at St. Lawrence University, Canton, NY, and a member of the Center for Identification Technology Research.
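To make these performance metrics concrete, the following is a minimal Python sketch (the simulated scores, threshold, and the function names error_rates and bootstrap_fmr_ci are illustrative assumptions, not taken from the book) showing how FMR and FNMR can be estimated from impostor and genuine comparison scores at a decision threshold, and how a plain percentile bootstrap can attach a confidence interval to the FMR. This generic i.i.d. bootstrap is not the specific bootstrap methodology for correlated comparisons that the book introduces.

```python
import numpy as np

def error_rates(genuine_scores, impostor_scores, threshold):
    """Estimate FMR and FNMR at a given decision threshold.

    Assumes higher scores mean a closer match: an impostor score at or above
    the threshold is a false match; a genuine score below it is a false
    non-match.
    """
    fmr = np.mean(np.asarray(impostor_scores) >= threshold)
    fnmr = np.mean(np.asarray(genuine_scores) < threshold)
    return fmr, fnmr

def bootstrap_fmr_ci(impostor_scores, threshold, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the FMR.

    Plain resampling of impostor scores, ignoring correlation between
    comparisons that share a subject (hence only a rough interval).
    """
    rng = np.random.default_rng(seed)
    scores = np.asarray(impostor_scores)
    boot = [
        np.mean(rng.choice(scores, size=scores.size, replace=True) >= threshold)
        for _ in range(n_boot)
    ]
    return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

# Example with simulated similarity scores (hypothetical data).
rng = np.random.default_rng(42)
genuine = rng.normal(0.8, 0.1, 500)    # same-person comparison scores
impostor = rng.normal(0.4, 0.1, 5000)  # different-person comparison scores
fmr, fnmr = error_rates(genuine, impostor, threshold=0.65)
lo, hi = bootstrap_fmr_ci(impostor, threshold=0.65)
print(f"FMR={fmr:.4f} (95% CI {lo:.4f}-{hi:.4f}), FNMR={fnmr:.4f}")
```

Sweeping the threshold and plotting FNMR against FMR traces out the ROC curve discussed in the book; the same resampling idea extends, with the book's more careful methodology, to interval estimates for the whole curve.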
This volume contains selected and invited papers presented at the International Conference on Computing and Information, ICCI '90, held in Niagara Falls, Ontario, Canada, May 23-26, 1990. ICCI conferences provide an international forum for presenting new results in research, development, and applications in computing and information. Their primary goal is to promote an interchange of ideas and cooperation between practitioners and theorists in the interdisciplinary fields of computing, communication, and information theory. The four main topic areas of ICCI '90 are:
- Information and coding theory, statistics and probability,
- Foundations of computer science, theory of algorithms and programming,
- Concurrency, parallelism, communications, networking, computer architecture and VLSI,
- Data and software engineering, databases, expert systems, information systems, decision making, and AI methodologies.
Compelling and engagingly written, this book by former Attorney General of Ohio Jim Petro and his wife, writer Nancy Petro, takes the reader inside actual cases, summarizes extensive research on the causes and consequences of wrongful conviction, and exposes eight common myths that inspire false confidence in the justice system and undermine reform. Now published in paperback with an extensive list of web links to wrongful conviction sources internationally, False Justice is ideal for use in a wide array of criminal justice and criminology courses.
Myth 1: Everyone in prison claims innocence.
Myth 2: Our system almost never convicts an innocent person.
Myth 3: Only the guilty confess.
Myth 4: Wrongful conviction is the result of innocent human error.
Myth 5: An eyewitness is the best testimony.
Myth 6: Conviction errors get corrected on appeal.
Myth 7: It dishonors the victim to question a conviction.
Myth 8: If the justice system has problems, the pros will fix them.
The second edition of this comprehensive handbook of computer and information security provides the most complete view of computer security and privacy available. It offers in-depth coverage of security theory, technology, and practice as they relate to established technologies as well as recent advances, and it explores practical solutions to many security issues. Individual chapters are authored by leading experts in the field and address the immediate and long-term challenges in the authors' respective areas of expertise. The book is organized into 10 parts comprising 70 contributed chapters by leading experts in the areas of networking and systems security, information management, cyber warfare and security, encryption technology, privacy, data storage, physical security, and a host of advanced security topics. New to this edition are chapters on intrusion detection, securing the cloud, securing web apps, ethical hacking, cyber forensics, physical security, disaster recovery, cyber attack deterrence, and more.
- Chapters by leaders in the field on the theory and practice of computer and information security technology allow the reader to develop a new level of technical expertise
- Comprehensive and up-to-date coverage of security issues allows the reader to remain current and fully informed from multiple viewpoints
- Methods of analysis and problem-solving techniques enhance the reader's grasp of the material and ability to implement practical solutions