Spiking Neuron Models

Author: Wulfram Gerstner

Publisher: Cambridge University Press

Published: 2002-08-15

Total Pages: 498

ISBN-13: 9780521890793

Neurons in the brain communicate by short electrical pulses, the so-called action potentials or spikes. How can we understand the process of spike generation? How can we understand information transmission by neurons? What happens if thousands of neurons are coupled together in a seemingly random network? How does the network connectivity determine the activity patterns? And, vice versa, how does the spike activity influence the connectivity pattern? These questions are addressed in this 2002 introduction to spiking neurons aimed at those taking courses in computational neuroscience, theoretical biology, biophysics, or neural networks. The approach will suit students of physics, mathematics, or computer science; it will also be useful for biologists who are interested in mathematical modelling. The text is enhanced by many worked examples and illustrations. There are no mathematical prerequisites beyond what the audience would meet as undergraduates: more advanced techniques are introduced in an elementary, concrete fashion when needed.
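
As a minimal illustration of the kind of model the book develops, the sketch below simulates a leaky integrate-and-fire neuron in Python. The parameter values and the constant input current are arbitrary choices for the example, not taken from the book.

    import numpy as np

    # Minimal leaky integrate-and-fire (LIF) neuron.  The membrane potential v
    # decays toward the resting value and integrates the injected current; when
    # v crosses the firing threshold, a spike is recorded and v is reset.
    def simulate_lif(i_input, dt=0.1, tau_m=10.0, v_rest=-65.0,
                     v_reset=-70.0, v_thresh=-50.0, r_m=10.0):
        """i_input: injected current per time step (nA); times in ms, voltages in mV."""
        v = v_rest
        spike_times = []
        for step, i_t in enumerate(i_input):
            v += (-(v - v_rest) + r_m * i_t) * dt / tau_m
            if v >= v_thresh:
                spike_times.append(step * dt)
                v = v_reset
        return spike_times

    # A constant 2 nA current applied for 100 ms yields a regular spike train.
    print(simulate_lif(np.full(1000, 2.0)))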


Neuronal Dynamics

Author: Wulfram Gerstner

Publisher: Cambridge University Press

Published: 2014-07-24

Total Pages: 591

ISBN-13: 9781107060838

This solid introduction uses the principles of physics and the tools of mathematics to approach fundamental questions of neuroscience.


Pulsed Neural Networks

Author: Wolfgang Maass

Publisher: MIT Press

Published: 2001-01-26

Total Pages: 414

ISBN-13: 9780262632218

Most practical applications of artificial neural networks are based on a computational model involving the propagation of continuous variables from one processing unit to the next. In recent years, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of the pulses to transmit information and perform computation. This realization has stimulated significant research on pulsed neural networks, including theoretical analyses and model development, neurobiological modeling, and hardware implementation. This book presents the complete spectrum of current research in pulsed neural networks and includes the most important work from many of the key scientists in the field. Terrence J. Sejnowski's foreword, "Neural Pulse Coding," presents an overview of the topic. The first half of the book consists of longer tutorial articles spanning neurobiology, theory, algorithms, and hardware. The second half contains a larger number of shorter research chapters that present more advanced concepts. The contributors use consistent notation and terminology throughout the book. Contributors Peter S. Burge, Stephen R. Deiss, Rodney J. Douglas, John G. Elias, Wulfram Gerstner, Alister Hamilton, David Horn, Axel Jahnke, Richard Kempter, Wolfgang Maass, Alessandro Mortara, Alan F. Murray, David P. M. Northmore, Irit Opher, Kostas A. Papathanasiou, Michael Recce, Barry J. P. Rising, Ulrich Roth, Tim Schönauer, Terrence J. Sejnowski, John Shawe-Taylor, Max R. van Daalen, J. Leo van Hemmen, Philippe Venier, Hermann Wagner, Adrian M. Whatley, Anthony M. Zador
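
As a toy illustration, not drawn from the book, of how pulse timing rather than a continuous activation value can carry information, the following Python sketch shows first-spike latency coding: a stronger stimulus drives an integrator to threshold sooner, so the stimulus intensity is read off the time of the first pulse. The threshold and intensity values are arbitrary.

    # First-spike latency coding: a linear integrator charges at a rate set by
    # the stimulus intensity, and the time at which it first crosses threshold
    # encodes that intensity.
    def first_spike_latency(intensity, threshold=1.0, dt=0.001):
        """Return the time (s) of the first threshold crossing, or None."""
        if intensity <= 0:
            return None
        charge, t = 0.0, 0.0
        while charge < threshold:
            charge += intensity * dt
            t += dt
        return t

    # Stronger stimuli produce earlier first spikes.
    for stimulus in (0.5, 1.0, 2.0, 4.0):
        print(stimulus, first_spike_latency(stimulus))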


Principles of Neural Design

Author: Peter Sterling

Publisher: MIT Press

Published: 2015-05-22

Total Pages: 567

ISBN-13: 9780262028707

Neuroscience research has exploded, with more than fifty thousand neuroscientists applying increasingly advanced methods. A mountain of new facts and mechanisms has emerged. And yet a principled framework to organize this knowledge has been missing. In this book, Peter Sterling and Simon Laughlin, two leading neuroscientists, strive to fill this gap, outlining a set of organizing principles to explain the whys of neural design that allow the brain to compute so efficiently. Setting out to "reverse engineer" the brain -- disassembling it to understand it -- Sterling and Laughlin first consider why an animal should need a brain, tracing computational abilities from bacterium to protozoan to worm. They examine bigger brains and the advantages of "anticipatory regulation"; identify constraints on neural design and the need to "nanofy"; and demonstrate the routes to efficiency in an integrated molecular system, phototransduction. They show that the principles of neural design at finer scales and lower levels apply at larger scales and higher levels; describe neural wiring efficiency; and discuss learning as a principle of biological design that includes "save only what is needed." Sterling and Laughlin avoid speculation about how the brain might work and endeavor to make sense of what is already known. Their distinctive contribution is to gather a coherent set of basic rules and exemplify them across spatial and functional scales.


How to Build a Brain

Author: Chris Eliasmith

Publisher: Oxford University Press

Published: 2013-04-16

Total Pages: 475

ISBN-13: 9780199794690

How to Build a Brain provides a detailed exploration of a new cognitive architecture - the Semantic Pointer Architecture - that takes biological detail seriously while addressing cognitive phenomena. Topics ranging from semantics and syntax to neural coding and spike-timing-dependent plasticity are integrated to develop the world's largest functional brain model.


Spike-timing dependent plasticity

Author: Henry Markram

Publisher: Frontiers E-books

Published:

Total Pages: 575

ISBN-13: 9782889190430

Hebb's postulate provided a crucial framework for understanding the synaptic alterations underlying learning and memory. Hebb's theory proposed that neurons that fire together also wire together, which gave a logical account of how synapses are strengthened. Weakening of synapses, however, was addressed only as the absence of strengthening; an active decrease of synaptic strength was introduced later, with the discovery of long-term depression induced by low-frequency stimulation of the presynaptic neuron. In 1994, it was found that the precise relative timing of pre- and postsynaptic spikes determines not only the magnitude but also the direction of synaptic change when two neurons are active together. Neurons that fire together may therefore not wire together if the timing of their spikes is not tightly correlated. In the subsequent 15 years, spike-timing-dependent plasticity (STDP) has been found in multiple brain regions and in many different species. The size and shape of the time windows in which positive and negative changes occur vary across brain regions, but the core principle of spike-timing-dependent change remains. A large number of theoretical studies conducted over this period explore the computational function of this principle, and STDP has become the main learning algorithm when modeling spiking neural networks. This Research Topic brings together key experimental and theoretical research on STDP.
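
A minimal sketch of a pair-based STDP update rule in Python, assuming exponential time windows; the amplitudes and time constants are illustrative values, not taken from any particular study in the collection.

    import math

    # Pair-based STDP: if the presynaptic spike precedes the postsynaptic spike
    # (dt > 0) the synapse is potentiated; if it follows (dt < 0) the synapse is
    # depressed.  The change decays exponentially with the timing difference.
    def stdp_weight_change(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                           tau_plus=20.0, tau_minus=20.0):
        dt = t_post - t_pre          # ms; positive means pre fired before post
        if dt > 0:
            return a_plus * math.exp(-dt / tau_plus)    # potentiation
        elif dt < 0:
            return -a_minus * math.exp(dt / tau_minus)  # depression
        return 0.0

    # Pre 5 ms before post -> strengthening; pre 5 ms after post -> weakening.
    print(stdp_weight_change(t_pre=10.0, t_post=15.0))
    print(stdp_weight_change(t_pre=15.0, t_post=10.0))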


Advances in Computational Intelligence

Author: Joan Cabestany

Publisher: Springer

Published: 2011-05-30

Total Pages: 601

ISBN-13: 9783642215018

This two-volume set LNCS 6691 and 6692 constitutes the refereed proceedings of the 11th International Work-Conference on Artificial Neural Networks, IWANN 2011, held in Torremolinos-Málaga, Spain, in June 2011. The 154 revised papers were carefully reviewed and selected from 202 submissions for presentation in two volumes. The first volume includes 69 papers organized in topical sections on mathematical and theoretical methods in computational intelligence; learning and adaptation; bio-inspired systems and neuro-engineering; hybrid intelligent systems; applications of computational intelligence; new applications of brain-computer interfaces; optimization algorithms in graphic processing units; computing languages with bio-inspired devices and multi-agent systems; computational intelligence in multimedia processing; and biologically plausible spiking neural processing.


Single Neuron Computation

Author: Thomas M. McKenna

Publisher: Academic Press

Published: 2014-05-19

Total Pages: 663

ISBN-13: 9781483296067

This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding the structure and function of single neurons. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real neurons is essential to the design of enhanced processor elements for use in the next generation of ANNs. The book covers computation in dendrites and spines, computational aspects of ion channels, synapses, patterned discharge and multistate neurons, and stochastic models of neuron dynamics. It is the most up-to-date presentation of biophysical and computational methods.