Stochastic Methods in Neuroscience
Editors: Carlo Laing and Gabriel Lord
Below are abstracts of the chapters.
- Preface
Authors: Carlo Laing and Gabriel Lord
We give a brief introduction to modelling in mathematical neuroscience and to stochastic processes and stochastic differential equations, as well as an overview of the book.
- Ch 1: A brief introduction to some simple stochastic processes
Author: Benjamin Lindner
This chapter gives an overview of simple continuous, two-state, and point processes that play a role in theoretical neuroscience. First, various characteristics of these stochastic processes are introduced, such as probability densities, moments, correlation functions, the correlation time, and the noise intensity of a process. Then analytical and numerical methods to calculate or compute these statistics are explained and illustrated by means of simple examples (the Ornstein-Uhlenbeck process, random telegraph noise, Poissonian shot noise). Finally, useful relations among the different statistics (the Wiener-Khinchin theorem, relations between spectral and interval statistics of point processes) are discussed.
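As a minimal illustration of one of the examples above: the Ornstein-Uhlenbeck process admits an exact one-step update, so it can be simulated without discretisation error. The sketch below is ours, not the chapter's — parameter values and function names are illustrative — and checks the stationary variance, which for dx = -x/tau dt + sqrt(2D) dW is D*tau.

```python
import numpy as np

def simulate_ou(tau=1.0, D=0.5, dt=0.01, n_steps=200_000, seed=0):
    """Simulate the Ornstein-Uhlenbeck process dx = -x/tau dt + sqrt(2D) dW
    with the exact one-step update (no discretisation error for any dt)."""
    rng = np.random.default_rng(seed)
    decay = np.exp(-dt / tau)
    # the stationary variance is D*tau; the conditional one-step std follows
    step_std = np.sqrt(D * tau * (1.0 - decay**2))
    noise = step_std * rng.normal(size=n_steps)
    x = np.empty(n_steps)
    x[0] = rng.normal(0.0, np.sqrt(D * tau))  # start in the stationary state
    for n in range(1, n_steps):
        x[n] = decay * x[n - 1] + noise[n]
    return x

x = simulate_ou()
var_est = x.var()   # should be close to D*tau = 0.5 for these parameters
```

The sample autocorrelation of `x` decays as exp(-|s|/tau), consistent with the correlation-time discussion in the chapter.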
Keywords: Stochastic process, point process, dynamical noise, analytical methods
- Ch 2: Markov chain models of ion channels and calcium release sites
Authors: Gregory D. Smith with Hilary DeRemigio and Jeffrey R. Groff
This chapter is an introduction to modeling stochastically gating ion channels using continuous-time discrete-state Markov chains. Analytical and numerical methods are presented for determining steady-state statistics of single-channel gating, including the stationary distribution and the open and closed dwell times. Model reduction techniques such as fast-slow analysis and state lumping are discussed, as well as Gillespie's method for simulating stochastically gating ion channels. Techniques for the estimation of model parameters and the identification of model topology are briefly discussed, as are the thermodynamic requirements that constrain the selection of rate constants. Approaches for modeling clusters of interacting ion channels using Markov chains are also summarized. Our treatment of coupled channels is restricted to Markov chain models of intracellular calcium release sites, where clusters of calcium release channels are coupled via changes in the local calcium concentration and exhibit stochastic calcium excitability reminiscent of calcium puffs and sparks. Representative release site simulations are presented showing how phenomena such as allosteric coupling and calcium-dependent inactivation, in addition to calcium-dependent activation, affect the generation and termination of calcium puffs and sparks. The chapter concludes by considering the state space explosion that occurs as more channels are included in Markov chain models of calcium release sites. Techniques used to mitigate this state space explosion are discussed, including the use of Kronecker representations and mean-field approximations.
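Gillespie's method mentioned above can be sketched for the simplest possible case: a single two-state channel C <-> O. The rates and names below are illustrative choices of ours, not taken from the chapter; the check is the stationary open probability k_open/(k_open + k_close) implied by the two-state Markov chain.

```python
import numpy as np

def gillespie_two_state(k_open=2.0, k_close=1.0, t_end=5000.0, seed=1):
    """Gillespie simulation of a single two-state channel C <-> O.
    Dwell times in each state are exponential with the state's exit rate.
    Returns the fraction of time spent open."""
    rng = np.random.default_rng(seed)
    t, state = 0.0, 0          # state 0 = closed, 1 = open
    time_open = 0.0
    while t < t_end:
        rate = k_open if state == 0 else k_close
        dwell = rng.exponential(1.0 / rate)   # exponential dwell time
        dwell = min(dwell, t_end - t)         # clip at the end of the run
        if state == 1:
            time_open += dwell
        t += dwell
        state = 1 - state                     # transition to the other state
    return time_open / t_end

p_open = gillespie_two_state()
# stationary open probability is k_open/(k_open + k_close) = 2/3 here
```

For a single channel the method reduces to drawing exponential dwell times; with many channels one instead draws the next event from the total rate and then selects which channel gates, as in Gillespie's direct method.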
Keywords: Markov chains, ion channel gating, coupled gating, intracellular calcium release, inositol 1,4,5-trisphosphate receptors, ryanodine receptors, calcium coupling, allosteric coupling, mean-field coupling, puffs, sparks, stochastic automata.
- Ch 3: Stochastic dynamic bifurcations and excitability
Authors: Nils Berglund and Barbara Gentz
Some models of action potential generation in neurons, such as the FitzHugh--Nagumo and Morris--Lecar models, are given by slow--fast differential equations. We outline a general theory that allows one to quantify the effect of noise on such equations. The method combines local analyses around stable and unstable equilibria and around bifurcation points. In particular, we discuss two different mechanisms of excitability, which lead to different types of interspike statistics.
Keywords: Action potential generation, stochastic differential equations, slow--fast dynamical systems, dynamic bifurcations, excitability, interspike times, FitzHugh--Nagumo model, Morris--Lecar model.
- Ch 4: Neural coherence and stochastic resonance
Author: Andre Longtin
This chapter concerns the influence of noise and periodic rhythms on the firing patterns of neurons in their subthreshold regime. Such a regime conceals many computations that lead to successive decisions to fire or not fire, and noise and rhythms are important components of these decisions. We first consider a Type II neuron model, the FitzHugh-Nagumo model, characterized by a resonant frequency. In the subthreshold regime, noise induces firings with a regularity that increases with noise intensity. At a certain finite noise level the regularity may be maximized, although this depends on the numerical implementation of an absolute refractory period. We discuss measures of this coherence resonance based on the coefficient of variation (CV) of interspike intervals and on spike train power spectra. We then characterize its phase locking to periodic input, and how this locking is modified by noise. This lays the foundation for understanding how noise can express subthreshold signals in the spike train. We discuss measures and qualitative features of this stochastic resonance across all time scales of periodic forcing. We show how the resonance relates to firing once per forcing cycle on average, or at sub-multiples thereof at higher forcing frequencies where refractory effects come into play. For slow forcing the optimal noise level is independent of the forcing period. We then discuss coherence resonance and stochastic resonance in the quadratic integrate-and-fire model of Type I dynamics. The presence of a full coherence resonance depends on the interpretation of the model, particularly the boundaries for firing and reset. Our study is motivated by the observation of randomly phase-locked firing activity in a large number of neurons, especially those involved in transducing physical stimuli such as temperature, sound, pressure and electric fields, but also in central neurons involved in the generation of various rhythms.
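The CV of interspike intervals used above can be estimated from a simple simulation. The following sketch runs a noisy quadratic integrate-and-fire neuron in the subthreshold regime (I < 0); the parameter values and the firing/reset boundaries are arbitrary choices of ours, which, as the chapter notes, matter for the results.

```python
import numpy as np

def qif_isis(I=-0.1, sigma=0.5, v_th=10.0, v_reset=-10.0,
             dt=1e-3, t_end=1000.0, seed=2):
    """Euler-Maruyama simulation of a noisy quadratic integrate-and-fire
    neuron, dv = (v^2 + I) dt + sigma dW, in the subthreshold regime (I < 0).
    A spike is registered when v crosses v_th, with reset to v_reset.
    Returns the interspike intervals."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    noise = sigma * np.sqrt(dt) * rng.normal(size=n)
    v, last = v_reset, 0.0
    isis = []
    for k in range(n):
        v += (v * v + I) * dt + noise[k]
        if v >= v_th:
            t = (k + 1) * dt
            isis.append(t - last)
            last = t
            v = v_reset
    return np.asarray(isis)

isis = qif_isis()
cv = isis.std() / isis.mean()   # coefficient of variation of the ISIs
```

For these subthreshold parameters firing is noise-activated escape over the unstable fixed point, so the CV comes out close to one; sweeping `sigma` traces out the coherence-resonance curve discussed in the chapter.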
Keywords: Neuron models, noise, FitzHugh-Nagumo, quadratic integrate-and-fire model, stochastic resonance, coherence resonance, sensory processing, power spectrum, type I model, type II model.
- Ch 5: Noisy oscillators
Author: Bard Ermentrout
Synchronous oscillations occur throughout the nervous system. Coupling between rhythmic systems is known to induce synchrony. However, another way to produce synchrony is through correlated inputs to uncoupled oscillators. In this chapter, we explore the role of correlated noise in synchronizing neural oscillations, a phenomenon called stochastic synchronization. Our motivation is the olfactory bulb, and in this chapter experiment and theory are combined to illustrate how features of the noise and of the oscillators affect synchronization. We make extensive use of the phase-resetting curve (PRC) of the oscillator, and also illustrate how noise affects the shape and variance of the PRC.
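The phase-resetting curve enters through the standard phase reduction. In generic notation (ours, not necessarily the chapter's), two uncoupled oscillators with natural frequency \(\omega\) driven by a common weak noisy input \(\xi(t)\) obey

```latex
\frac{d\theta_i}{dt} = \omega + \Delta(\theta_i)\,\xi(t), \qquad i = 1, 2,
```

where \(\Delta\) is the PRC. Because both oscillators receive the same \(\xi(t)\), their phase difference contracts on average, which is the stochastic synchronization studied here.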
Keywords: oscillators, noise, stochastic synchronization, olfactory bulb, phase-resetting curve
- Ch 6: The role of noise in networks of noisy neurons
Author: Brent Doiron
In many brain areas, neural responses are significantly variable across repeated presentations of a stimulus. Typically, response variability limits coding performance; however, we discuss various examples where stimulus-independent fluctuations serve to enhance neural coding. We focus first on the impact of single-cell variability, and second on correlated variability across pairs of cells in a population. In both cases the threshold nonlinearity inherent in spike production yields unexpected relationships between input and output statistics, often with potential advantages for neural function.
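A toy calculation illustrates how a threshold nonlinearity reshapes correlations. In this minimal sketch, thresholded Gaussians stand in for spike generation and all numbers are arbitrary choices of ours; for this weakly driven regime the output correlation is attenuated relative to the input correlation.

```python
import numpy as np

rng = np.random.default_rng(3)
rho_in, n = 0.5, 200_000

# common-input construction: x_i = sqrt(rho)*c + sqrt(1-rho)*xi_i
c = rng.normal(size=n)
x1 = np.sqrt(rho_in) * c + np.sqrt(1 - rho_in) * rng.normal(size=n)
x2 = np.sqrt(rho_in) * c + np.sqrt(1 - rho_in) * rng.normal(size=n)

theta = 1.0                                  # "spike threshold", in input SDs
s1 = (x1 > theta).astype(float)              # binary "spike" outputs
s2 = (x2 > theta).astype(float)
rho_out = np.corrcoef(s1, s2)[0, 1]
# thresholding attenuates the correlation: 0 < rho_out < rho_in
```

Raising `theta` (sparser firing) attenuates `rho_out` further, a simple instance of the input-output correlation relationships discussed in the chapter.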
Keywords: neural variability, integrate-and-fire neuron, spike correlation, neural coding
- Ch 7: Population density methods in large-scale neural network modeling
Author: Daniel Tranchina
Population density methods have a rich history in theoretical and computational neuroscience. In earlier years, these methods were used in large part to study the statistics of spike trains \shortcite{TU2,wilburrinzel83}. Starting in the 1990s \shortcite{kuramoto,abbottvv}, population density function (PDF) methods have been used as an analytical and computational tool to study neural network dynamics. In this chapter, we discuss the motivation and theory underlying PDF methods, together with a few selected examples of computational and analytical applications in neural network modeling.
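For orientation, in the common leaky integrate-and-fire case with the diffusion approximation, the membrane-potential density \(\rho(v,t)\) obeys a Fokker-Planck equation; in generic notation (ours, not necessarily the chapter's),

```latex
\frac{\partial \rho}{\partial t}
 = -\frac{\partial}{\partial v}\!\left[\Bigl(\mu - \frac{v}{\tau}\Bigr)\rho\right]
 + \frac{\sigma^{2}}{2}\,\frac{\partial^{2}\rho}{\partial v^{2}},
\qquad
r(t) = -\frac{\sigma^{2}}{2}\,
       \left.\frac{\partial \rho}{\partial v}\right|_{v = v_{\mathrm{th}}},
```

where \(\mu\) and \(\sigma^2\) are the drift and diffusion set by the synaptic input, and the population firing rate \(r(t)\) is the probability flux through the threshold \(v_{\mathrm{th}}\) (with the absorbing condition \(\rho(v_{\mathrm{th}},t)=0\) and re-injection at reset).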
Keywords: Synaptic noise, stochastic spike trains, Poisson process, integrate-and-fire, random differential equation, state space, sparse connectivity, partial differential-integral equation, Fokker-Planck equation, visual cortex.
- Ch 8: A population density model of the driven LGN/PGN
Authors: Gregory D. Smith with Marco Huertas
The interaction of two populations of integrate-and-fire-or-burst neurons representing thalamocortical cells from the dorsal lateral geniculate nucleus (dLGN) and thalamic reticular cells from the perigeniculate nucleus (PGN) is studied using a population density approach. A two-dimensional probability density function that evolves according to a time-dependent advection-reaction equation gives the distribution of cells in each population over the membrane potential and de-inactivation level of a low-threshold calcium current. In the absence of retinal drive, the population density network model exhibits rhythmic bursting. In the presence of constant retinal input, the aroused LGN/PGN population density model displays a wide range of responses depending on cellular parameters and network connectivity.
Keywords: Population density model, dorsal lateral geniculate nucleus, perigeniculate nucleus, thalamocortical relay neuron, thalamic reticular neuron, burst, tonic, vision.
- Ch 9: Synaptic ``noise'': experiments, computational consequences and methods to analyze experimental data
Authors: Alain Destexhe with Michelle Rudolph-Lilith
In the cerebral cortex of awake animals, neurons are subject to intense fluctuating activity, mostly of synaptic origin, termed ``synaptic noise''. Synaptic noise is the dominant source of membrane potential fluctuations in neurons and can have a strong influence on their integrative properties. We review here the experimental measurements of synaptic noise and its modeling by conductance-based stochastic processes. We then review the consequences of synaptic noise for neuronal integrative properties, as predicted by computational models and investigated experimentally using the dynamic clamp. We also review analysis methods, such as spike-triggered averaging and conductance analysis, which are derived from the modeling of synaptic noise by stochastic processes. These different approaches aim at understanding the integrative properties of neocortical neurons in the intact brain.
Keywords: Conductances, dynamic clamp, computational models, cerebral cortex, in vivo, computational consequences of noise.
- Ch 10: Statistical models of spike trains
Authors: Liam Paninski, Emery Brown, Satish Iyengar, and Robert E. Kass
Spiking neurons make inviting targets for analytical methods based on stochastic processes: spike trains carry information in their temporal patterning, yet they are often highly irregular across time and across experimental replications. The bulk of this volume is devoted to mathematical and biophysical models useful in understanding neurophysiological processes. In this chapter we consider statistical models for analyzing spike train data. We focus on the stochastic integrate-and-fire (IF) neuron as a particularly useful model, which may be approached analytically in three distinct ways: via the language of (1) stochastic (diffusion) processes, (2) hidden Markov (state-space) models, and (3) point processes. Each of these viewpoints comes equipped with its own specialized tools and insights, and the power of the IF model is most evident when all of these tools may be brought to bear simultaneously.
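The diffusion view can be made concrete with the simplest case: for a perfect integrate-and-fire neuron, the membrane voltage is drifted Brownian motion from reset to threshold, and the interspike interval is inverse Gaussian (Wald) distributed with mean theta/mu. The sketch below uses our own parameter choices to simulate the diffusion and check the Wald mean.

```python
import numpy as np

def simulate_fpt(mu=1.0, sigma=0.5, theta=1.0, dt=1e-3,
                 n_trials=2000, max_steps=20_000, seed=4):
    """First-passage times of drifted Brownian motion dv = mu dt + sigma dW
    from v = 0 to the threshold theta. The exact distribution is inverse
    Gaussian with mean theta/mu and shape theta^2/sigma^2."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_trials)
    t_hit = np.full(n_trials, np.nan)
    alive = np.ones(n_trials, dtype=bool)      # trials not yet at threshold
    sq = sigma * np.sqrt(dt)
    for step in range(1, max_steps + 1):
        v[alive] += mu * dt + sq * rng.normal(size=alive.sum())
        crossed = alive & (v >= theta)
        t_hit[crossed] = step * dt
        alive &= ~crossed
        if not alive.any():
            break
    return t_hit[~np.isnan(t_hit)]             # drop any unfinished trials

fpt = simulate_fpt()
mean_fpt = fpt.mean()   # Wald mean is theta/mu = 1.0 here
```

The same first-passage-time density is what the Fokker-Planck, state-space, and point-process viewpoints in the chapter each recover in their own language.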
Keywords: Fokker-Planck equation; integrate-and-fire; state-space model; renewal process; diffusion model; inverse Gaussian; first passage time; spike-triggered average.
- Ch 11: Stochastic simulation of neurons, axons and action potentials
Author: A. Aldo Faisal
Variability is inherent in neurons. To account for this variability we have to use stochastic models. We take this biologically more rigorous approach by studying the fundamental signal of our brain's neurons: the action potential and the voltage-gated ion channels mediating it. We discuss how to model and simulate the action potential stochastically, review the methods, and show that classic stochastic approximation methods fail to capture important properties of the highly non-linear action potential mechanism, making the use of accurate models and simulation methods essential for understanding the neural code. We review what stochastic modelling has taught us about the function, structure and limits of action potential signalling in neurons; the most surprising insight is that stochastic effects of individual signalling molecules become relevant for whole-cell behaviour. We suggest that most of the experimentally observed neuronal variability can be explained from the bottom up as generated by molecular sources of thermodynamic noise.
Keywords: Action potential, spike, axon, nerve fiber, stochastic simulation, noise, voltage-gated ion channel, Na channel, limits, spike time reliability, neuronal variability.
- Ch 12: Numerical simulations of SDEs and SPDEs from neural systems using SDElab
Authors: Hasan Alzubaidi, Hagen Gilsing, and Tony Shardlow
Stochastic differential equations are an important class of models that allow for time-varying random forcing in standard deterministic differential equations. We introduce the Itô stochastic differential equation as a generalisation of the standard finite-dimensional initial value problem for ODEs, with the Hodgkin-Huxley model as an example. We also look at reaction-diffusion equations, in particular the FitzHugh-Nagumo model, under the influence of stochastic forcing. Examples are given in the computer environment MATLAB.
Keywords: Hodgkin-Huxley, FitzHugh-Nagumo, stochastic differential equation, Itô calculus, stochastic PDEs, numerical solution of initial value problems.
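As a language-agnostic counterpart to the chapter's MATLAB examples, here is a minimal Euler-Maruyama sketch in Python (function and parameter names are ours), tested on an Ornstein-Uhlenbeck equation whose stationary variance is known in closed form.

```python
import numpy as np

def euler_maruyama(f, g, x0, t_end, dt, seed=6):
    """Euler-Maruyama scheme for the scalar Ito SDE dX = f(X) dt + g(X) dW:
    one deterministic drift step plus a Brownian increment per time step."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    dW = rng.normal(0.0, np.sqrt(dt), size=n)   # Brownian increments
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] + f(x[k]) * dt + g(x[k]) * dW[k]
    return x

# test equation: Ornstein-Uhlenbeck, dX = -X dt + 0.5 dW
path = euler_maruyama(f=lambda x: -x, g=lambda x: 0.5,
                      x0=1.0, t_end=500.0, dt=1e-3)
tail = path[10_000:]     # discard the initial transient
var_est = tail.var()     # stationary variance should be 0.5**2 / 2 = 0.125
```

The scheme has strong order 1/2 in general (order 1 here, since the noise is additive); the same update, applied componentwise, handles systems such as the stochastically forced Hodgkin-Huxley equations mentioned above.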