This richly illustrated book shows how Shannon's mathematical theory of information defines absolute limits on neural efficiency, limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, it is an ideal introduction to cutting-edge research in neural information theory.
Understanding how populations of neurons encode information is the central challenge in the field of neural coding. The many mysteries and marvels of the mind prompted a prominent team of experts in the field to put their heads together and produce a book on the subject, simply titled Principles of Neural Coding.
A comprehensive introduction to the neural network models currently under intensive study for computational applications. It also covers neural network applications in a variety of problems of both theoretical and practical interest.
Neuroscience research has exploded, with more than fifty thousand neuroscientists applying increasingly advanced methods. A mountain of new facts and mechanisms has emerged. And yet a principled framework to organize this knowledge has been missing. In this book, Peter Sterling and Simon Laughlin, two leading neuroscientists, strive to fill this gap, outlining a set of organizing principles that explain why neural design allows the brain to compute so efficiently. Setting out to "reverse engineer" the brain (disassembling it to understand it), Sterling and Laughlin first consider why an animal should need a brain, tracing computational abilities from bacterium to protozoan to worm. They examine bigger brains and the advantages of "anticipatory regulation"; identify constraints on neural design and the need to "nanofy"; and demonstrate the routes to efficiency in an integrated molecular system, phototransduction. They show that the principles of neural design at finer scales and lower levels apply at larger scales and higher levels; describe neural wiring efficiency; and discuss learning as a principle of biological design that includes "save only what is needed." Sterling and Laughlin avoid speculation about how the brain might work and endeavor to make sense of what is already known. Their distinctive contribution is to gather a coherent set of basic rules and exemplify them across spatial and functional scales.
Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes.

Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation.

Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.
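As a taste of the book's subject matter, here is a minimal Python sketch (my own illustration, not code from the book) of the steady-state solution of the linear cable equation mentioned above: voltage injected into a passive dendrite decays exponentially with distance, with a length constant lam = sqrt(r_m / r_a). The parameter values are assumed, textbook-plausible ones.

```python
import math

# Illustrative sketch of steady-state passive cable attenuation.
# For a uniform passive cable, V(x) = V0 * exp(-x / lam), with
# length constant lam = sqrt(r_m / r_a). Parameter values below
# are assumed for illustration, not taken from the book.
a   = 1e-4                       # dendrite radius, cm (1 micron)
R_m = 2.0e4                      # specific membrane resistance, ohm*cm^2
R_a = 1.0e2                      # specific axial resistivity, ohm*cm

r_m = R_m / (2 * math.pi * a)    # membrane resistance per unit length, ohm*cm
r_a = R_a / (math.pi * a ** 2)   # axial resistance per unit length, ohm/cm
lam = math.sqrt(r_m / r_a)       # length constant, cm (0.1 cm here)

V0 = 10.0                        # voltage at the injection site, mV
for x in (0.0, 0.5 * lam, lam, 2.0 * lam):
    v = V0 * math.exp(-x / lam)
    print(f"x = {x:6.3f} cm  ->  V = {v:5.2f} mV")
```

One length constant away from the injection site, the signal has already fallen to about 37 percent of its original amplitude, which is why passive dendritic geometry matters so much for single-neuron computation.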
Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
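To make the data-compression thread concrete, here is a toy Python sketch (my own example, not code from the book) of the idea behind arithmetic coding: each symbol narrows an interval in proportion to its probability, and naming the final interval takes roughly the message's Shannon information content in bits.

```python
import math

# Toy arithmetic-coding sketch (illustrative only). Each symbol
# narrows [low, high) in proportion to its probability, so the final
# interval width equals the product of the symbol probabilities and
# about -log2(width) bits suffice to identify the message.
probs = {'a': 0.8, 'b': 0.2}
cum   = {'a': 0.0, 'b': 0.8}     # cumulative probability before each symbol

def encode_interval(message):
    low, high = 0.0, 1.0
    for s in message:
        width = high - low
        high = low + width * (cum[s] + probs[s])
        low  = low + width * cum[s]
    return low, high

low, high = encode_interval("aaabaa")
bits = -math.log2(high - low)    # ideal code length for this message
print(f"interval [{low:.6f}, {high:.6f}), about {bits:.2f} bits")
```

For this six-symbol message the ideal code length is under four bits, well below one bit per symbol, which is the payoff of matching code lengths to probabilities.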
Originally developed by Claude Shannon in the 1940s, information theory laid the foundations for the digital revolution, and is now an essential tool in telecommunications, genetics, linguistics, brain sciences, and deep space communication. In this richly illustrated book, accessible examples are used to introduce information theory in terms of everyday games like ‘20 questions’ before more advanced topics are explored. Online MATLAB and Python computer programs provide hands-on experience of information theory in action, and PowerPoint slides give support for teaching. Written in an informal style, with a comprehensive glossary and tutorial appendices, this text is an ideal primer for novices who wish to learn the essential principles and applications of information theory.
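As a hint of the '20 questions' approach, here is a short Python sketch (my own, not one of the book's online programs): each well-posed yes/no question can halve the set of possibilities, so pinning down one of N equally likely items takes about log2(N) questions, which is exactly the entropy of the uniform distribution.

```python
import math

# Toy illustration of the '20 questions' view of entropy
# (my sketch, not one of the book's online programs).
def entropy_bits(p):
    """Shannon entropy, in bits, of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# 2**20 equally likely possibilities: twenty yes/no questions suffice.
n = 2 ** 20
print(entropy_bits([1.0 / n] * n))   # -> 20.0 bits

# A biased source is more predictable, so it carries less information.
print(entropy_bits([0.9, 0.1]))      # -> ~0.47 bits per outcome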