Since its founding in 1989 by Terrence Sejnowski, Neural Computation has become the leading journal in the field. Foundations of Neural Computation collects, by topic, the most significant papers that have appeared in the journal over the past nine years. The present volume focuses on neural codes and representations, topics of broad interest to neuroscientists and modelers. The topics addressed are: how neurons encode information through action potential firing patterns, how populations of neurons represent information, and how individual neurons use dendritic processing and biophysical properties of synapses to decode spike trains. The papers encompass a wide range of levels of investigation, from dendrites and neurons to networks and systems.
This eBook contains ten articles on the representation of abstract concepts, both simple and complex, at the neural level in the brain. Seven of the articles directly address the two main competing theories of mental representation, localist and distributed. Four of these argue, either on theoretical grounds or from neurophysiological evidence, that abstract concepts, simple or complex, exist (indeed must exist) either at the single-cell level or in an exclusive neural cell assembly; the other three argue for sparse distributed representation (population coding) of abstract concepts. Two further papers discuss neural implementations of symbolic models, and the remaining paper deals with learning motor skills from imagery versus actual execution. A summary of these papers is provided in the Editorial.
How visual content is represented in neuronal population codes and how to analyze such codes with multivariate techniques. Vision is a massively parallel computational process, in which the retinal image is transformed over a sequence of stages so as to emphasize behaviorally relevant information (such as object category and identity) and deemphasize other information (such as viewpoint and lighting). The processes behind vision operate by concurrent computation and message passing among neurons within a visual area and between different areas. The theoretical concept of "population code" encapsulates the idea that visual content is represented at each stage by the pattern of activity across the local population of neurons. Understanding visual population codes ultimately requires multichannel measurement and multivariate analysis of activity patterns. Over the past decade, the multivariate approach has gained significant momentum in vision research. Functional imaging and cell recording measure brain activity in fundamentally different ways, but they now use similar theoretical concepts and mathematical tools in their modeling and analyses. With a focus on the ventral processing stream thought to underlie object recognition, this book presents recent advances in our understanding of visual population codes, novel multivariate pattern-information analysis techniques, and the beginnings of a unified perspective for cell recording and functional imaging. It serves as an introduction, overview, and reference for scientists and students across disciplines who are interested in human and primate vision and, more generally, in understanding how the brain represents and processes information.
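The population-code idea described above can be made concrete with a small decoding sketch. This is a minimal illustration, not from the book: the cosine tuning curves, the neuron count, and the gain are all illustrative assumptions, and the population-vector decoder is just one of the multivariate techniques the book surveys.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of orientation-tuned neurons (illustrative, not
# from the book): each neuron fires most for its preferred direction, with
# half-wave-rectified cosine tuning.
n_neurons = 64
preferred = np.linspace(0, 2 * np.pi, n_neurons, endpoint=False)

def population_response(theta, gain=20.0):
    """Noisy spike counts of the population for stimulus direction theta."""
    rates = gain * np.maximum(np.cos(preferred - theta), 0.0)
    return rng.poisson(rates)  # Poisson spiking noise

def population_vector_decode(counts):
    """Estimate the stimulus as the direction of the population vector."""
    x = np.sum(counts * np.cos(preferred))
    y = np.sum(counts * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

stimulus = 1.0  # radians
counts = population_response(stimulus)
estimate = population_vector_decode(counts)
print(f"true {stimulus:.2f} rad, decoded {estimate:.2f} rad")
```

The point of the sketch is that no single neuron identifies the stimulus, yet the pattern of activity across the population does, and a simple linear read-out recovers it.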
This second edition presents the enormous progress made in recent years in the many subfields bearing on the two great questions: How does the brain work? and How can we build intelligent machines? It greatly increases the coverage of models of fundamental neurobiology, cognitive neuroscience, and neural network approaches to language.
Understanding how populations of neurons encode information is the challenge faced by researchers in the field of neural coding. Principles of Neural Coding brings together a prominent team of experts to address the complexities of this discipline. It centers on some of the major developments in the area and presents a complete assessment of how neurons in the brain encode information. Contributors describe results in different systems (visual, auditory, and somatosensory perception, among others) and different species (monkeys, rats, humans, and others). The introductory chapters concentrate on the recording and analysis of the firing of single and multiple neurons, and on other integrative measures of network activity and network states, such as local field potentials and current source densities. The book then describes the principles of neural coding for different functions and in different species, and concludes with theoretical and modeling work on how information-processing functions are implemented. The text not only contains the most important experimental findings but also gives an overview of the main methodological aspects of studying neural coding, and it describes alternative approaches based on simulations with neural networks and in silico modeling of this highly interdisciplinary topic. Providing a comprehensive, interdisciplinary treatment of topics of interest to a wide range of researchers, it can serve as an important reference for students and professionals.
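A flavor of the single-neuron spike-count analyses covered in such introductory chapters is the Fano factor, the variance-to-mean ratio of spike counts across repeated trials; Poisson-like firing gives a value near 1. The simulated data below are an illustrative assumption, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated spike counts over 1000 repeated trials of a neuron firing at a
# mean of 5 spikes per trial (Poisson assumption, for illustration only).
counts = rng.poisson(lam=5.0, size=1000)

# Fano factor: variance / mean of the trial-by-trial spike counts.
fano = counts.var() / counts.mean()
print(f"Fano factor: {fano:.2f}")
```

Departures from 1 in real recordings are one of the basic diagnostics of trial-to-trial variability discussed in the neural coding literature.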
A textbook for students with limited background in mathematics and computer coding, emphasizing computer tutorials that guide readers in producing models of neural behavior. This introductory text teaches students to understand, simulate, and analyze the complex behaviors of individual neurons and brain circuits. It is built around computer tutorials that guide students in producing models of neural behavior, with the associated Matlab code freely available online. From these models students learn how individual neurons function and how, when connected, neurons cooperate in a circuit. The book demonstrates through simulated models how oscillations, multistability, post-stimulus rebounds, and chaos can arise within either single neurons or circuits, and it explores their roles in the brain. The book first presents essential background in neuroscience, physics, mathematics, and Matlab, with explanations illustrated by many example problems. Subsequent chapters cover the neuron and spike production; single spike trains and the underlying cognitive processes; conductance-based models; the simulation of synaptic connections; firing-rate models of large-scale circuit operation; dynamical systems and their components; synaptic plasticity; and techniques for analysis of neuron population datasets, including principal components analysis, hidden Markov modeling, and Bayesian decoding. Accessible to undergraduates in life sciences with limited background in mathematics and computer coding, the book can be used in a “flipped” or “inverted” teaching approach, with class time devoted to hands-on work on the computer tutorials. It can also be a resource for graduate students in the life sciences who wish to gain computing skills and a deeper knowledge of neural function and neural circuits.
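The book's computer tutorials use Matlab; as a rough sense of what such a tutorial model looks like, here is a minimal leaky integrate-and-fire simulation in Python. The parameter values are illustrative assumptions, not the book's.

```python
import numpy as np

# Leaky integrate-and-fire neuron, forward-Euler integration.
# All parameter values below are illustrative assumptions.
dt = 1e-4          # time step (s)
t_max = 0.5        # simulation length (s)
tau = 0.01         # membrane time constant (s)
E_L = -0.070       # leak reversal potential (V)
V_th = -0.050      # spike threshold (V)
V_reset = -0.080   # reset potential (V)
R_m = 1e7          # membrane resistance (ohm)
I_app = 2.5e-9     # constant applied current (A)

t = np.arange(0.0, t_max, dt)
V = np.full_like(t, E_L)
spikes = []

for i in range(1, len(t)):
    # Euler update of dV/dt = (E_L - V + R_m * I) / tau
    V[i] = V[i - 1] + dt * (E_L - V[i - 1] + R_m * I_app) / tau
    if V[i] >= V_th:        # threshold crossing: record a spike and reset
        spikes.append(t[i])
        V[i] = V_reset

print(f"{len(spikes)} spikes, rate {len(spikes) / t_max:.1f} Hz")
```

With these parameters the steady-state voltage (E_L + R_m * I_app = -45 mV) sits above threshold, so the model fires regularly; this is the kind of single-neuron behavior the book's early tutorials have students build and then extend into circuits.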
A comprehensive, integrated, and accessible textbook presenting core neuroscientific topics from a computational perspective, tracing a path from cells and circuits to behavior and cognition. This textbook presents a wide range of subjects in neuroscience from a computational perspective. It offers a comprehensive, integrated introduction to core topics, using computational tools to trace a path from neurons and circuits to behavior and cognition. Moreover, the chapters show how computational neuroscience—methods for modeling the causal interactions underlying neural systems—complements empirical research in advancing the understanding of brain and behavior. The chapters—all by leaders in the field, and carefully integrated by the editors—cover such subjects as action and motor control; neuroplasticity, neuromodulation, and reinforcement learning; vision; and language—the core of human cognition. The book can be used for advanced undergraduate or graduate level courses. It presents all necessary background in neuroscience beyond basic facts about neurons and synapses and general ideas about the structure and function of the human brain. Students should be familiar with differential equations and probability theory, and be able to pick up the basics of programming in MATLAB and/or Python. Slides, exercises, and other ancillary materials are freely available online, and many of the models described in the chapters are documented in the brain operation database, BODB (which is also described in a book chapter). Contributors Michael A. Arbib, Joseph Ayers, James Bednar, Andrej Bicanski, James J. Bonaiuto, Nicolas Brunel, Jean-Marie Cabelguen, Carmen Canavier, Angelo Cangelosi, Richard P. Cooper, Carlos R. Cortes, Nathaniel Daw, Paul Dean, Peter Ford Dominey, Pierre Enel, Jean-Marc Fellous, Stefano Fusi, Wulfram Gerstner, Frank Grasso, Jacqueline A. Griego, Ziad M. Hafed, Michael E. 
Hasselmo, Auke Ijspeert, Stephanie Jones, Daniel Kersten, Jeremie Knuesel, Owen Lewis, William W. Lytton, Tomaso Poggio, John Porrill, Tony J. Prescott, John Rinzel, Edmund Rolls, Jonathan Rubin, Nicolas Schweighofer, Mohamed A. Sherif, Malle A. Tagamets, Paul F. M. J. Verschure, Nathan Vierling-Claasen, Xiao-Jing Wang, Christopher Williams, Ransom Winder, Alan L. Yuille
Intended for neurobiologists with an interest in mathematical analysis of neural data as well as the growing number of physicists and mathematicians interested in information processing by "real" nervous systems, Spikes provides a self-contained review of relevant concepts in information theory and statistical decision theory.
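One information-theoretic idea from this literature is estimating the entropy of a spike train by discretizing it into binary "words" and counting word frequencies. The toy version below uses simulated Bernoulli spiking; the bin probability and word length are illustrative assumptions, and real applications must correct for sampling bias, which this sketch does not.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

def word_entropy(spike_train, word_len=8):
    """Naive entropy estimate (bits per bin) from non-overlapping binary words."""
    words = [tuple(spike_train[i:i + word_len])
             for i in range(0, len(spike_train) - word_len + 1, word_len)]
    counts = Counter(words)
    n = sum(counts.values())
    probs = np.array([c / n for c in counts.values()])
    return float(-np.sum(probs * np.log2(probs)) / word_len)

# Simulated spike train: 8000 time bins, spike probability 0.1 per bin.
train = rng.random(8000) < 0.1
print(f"{word_entropy(train):.3f} bits per bin")
```

For independent Bernoulli spiking at p = 0.1 the true entropy is about 0.47 bits per bin, so the estimate lands nearby (slightly low, because finite data bias the naive estimator downward).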
Experimental and theoretical neuroscientists use Bayesian approaches to analyze the brain mechanisms of perception, decision-making, and motor control.
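A standard textbook example of the Bayesian approach to perception is cue combination: two noisy Gaussian estimates of the same quantity are combined by precision weighting, and the posterior is more reliable than either cue alone. The numbers below are illustrative assumptions.

```python
def combine(mu1, var1, mu2, var2):
    """Posterior mean and variance for two Gaussian cues under a flat prior."""
    w1 = (1 / var1) / (1 / var1 + 1 / var2)   # precision weight of cue 1
    mu = w1 * mu1 + (1 - w1) * mu2            # precision-weighted mean
    var = 1 / (1 / var1 + 1 / var2)           # combined precision adds
    return mu, var

# E.g. a visual and a haptic estimate of an object's size (made-up values):
mu, var = combine(10.0, 1.0, 12.0, 4.0)
print(f"posterior mean {mu:.2f}, variance {var:.2f}")
```

Note that the posterior variance (0.8) is smaller than that of either cue, which is the signature prediction that Bayesian analyses of perception test against behavior.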