Modeling Neural Circuits Made Simple with Python

About

An accessible undergraduate textbook in computational neuroscience that provides an introduction to the mathematical and computational modeling of neurons and networks of neurons.

Understanding the brain is a major frontier of modern science. Given the complexity of neural circuits, advancing that understanding requires mathematical and computational approaches. This accessible undergraduate textbook in computational neuroscience provides an introduction to the mathematical and computational modeling of neurons and networks of neurons. Starting with the biophysics of single neurons, Robert Rosenbaum incrementally builds to explanations of neural coding, learning, and the relationship between biological and artificial neural networks. Examples with real neural data demonstrate how computational models can be used to understand phenomena observed in neural recordings. Based on years of classroom experience, the material has been carefully streamlined to provide all the content needed to build a foundation for modeling neural circuits in a one-semester course.

  • Proven in the classroom
  • Example-rich, student-friendly approach
  • Includes Python code and a mathematical appendix reviewing the requisite background in calculus, linear algebra, and probability (see the illustrative sketch after this list)
  • Ideal for engineering, science, and mathematics majors and for self-study
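
For a flavor of the book's hands-on style, here is a minimal sketch, written for this listing rather than taken from the book, of a leaky integrator neuron simulated with the forward Euler method (topics of Section 1.1 and Appendix A.5). All parameter values are illustrative assumptions.

    import numpy as np

    # Leaky integrator: tau * dV/dt = -(V - EL) + I, simulated with
    # forward Euler. Parameter values are illustrative assumptions.
    tau = 10.0   # membrane time constant (ms)
    EL = -72.0   # leak (resting) potential (mV)
    I = 5.0      # constant input, in mV after scaling by the leak conductance
    dt = 0.1     # Euler time step (ms)
    T = 100.0    # total simulated time (ms)

    t = np.arange(0.0, T, dt)
    V = np.empty_like(t)
    V[0] = EL    # start at rest

    # Forward Euler update: V[n+1] = V[n] + dt * f(V[n])
    for n in range(len(t) - 1):
        V[n + 1] = V[n] + dt * (-(V[n] - EL) + I) / tau

    # V decays exponentially toward its fixed point at EL + I
    print(f"V(T) = {V[-1]:.2f} mV (fixed point at {EL + I:.1f} mV)")

The same forward Euler pattern extends naturally to spiking models such as the EIF model of Section 1.2, which adds an exponential spike-generating current to this equation.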

Author

Robert Rosenbaum is Associate Professor of Applied and Computational Mathematics and Statistics at the University of Notre Dame. His research in computational neuroscience focuses on using models of neural circuits to understand the dynamics and statistics of neural activity underlying sensory processing and learning.

Table of Contents

List of Figures ix
Preface xi

1 Modeling Single Neurons 1
1.1 The Leaky Integrator Model 1
1.2 The EIF Model 5
1.3 Modeling Synapses 10

2 Measuring and Modeling Neural Variability 15
2.1 Spike Train Variability, Firing Rates, and Tuning 15
2.2 Modeling Spike Train Variability with Poisson Processes 21
2.3 Modeling a Neuron with Noisy Synaptic Input 25

3 Modeling Networks of Neurons 33
3.1 Feedforward Spiking Networks and Their Mean-Field Approximation 33
3.2 Recurrent Spiking Networks and Their Mean-Field Approximation 37
3.3 Modeling Surround Suppression with Rate Network Models 43

4 Modeling Plasticity and Learning 49
4.1 Synaptic Plasticity 49
4.2 Feedforward Artificial Neural Networks 54

Appendix A: Mathematical Background 61
A.1 Introduction to ODEs 61
A.2 Exponential Decay as a Linear, Autonomous ODE 63
A.3 Convolutions 65
A.4 One-Dimensional Linear ODEs with Time-Dependent Forcing 69
A.5 The Forward Euler Method 71
A.6 Fixed Points, Stability, and Bifurcations in One-Dimensional ODEs 74
A.7 Dirac Delta Functions 78
A.8 Fixed Points, Stability, and Bifurcations in Systems of ODEs 81

Appendix B: Additional Models and Concepts 89
B.1 Ion Channel Currents and the HH Model 89
B.2 Other Simplified Models of Single Neurons 97
B.3 Conductance-Based Synapse Models 113
B.4 Neural Coding 115
B.5 Derivations and Alternative Formulations of Rate Network Models 124
B.6 Hopfield Networks 127
B.7 Training Readouts from Chaotic RNNs 131
B.8 DNNs and Backpropagation 136

References 141
Index 147
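
As a second illustrative sketch in the same spirit (again not code from the book), the following generates a homogeneous Poisson spike train, the model of spike train variability treated in Section 2.2; the rate and duration are assumed values.

    import numpy as np

    rng = np.random.default_rng(0)

    # Homogeneous Poisson process: in each small bin of width dt, a spike
    # occurs with probability r * dt. Rate and duration are assumptions.
    r = 10.0     # firing rate (Hz)
    T = 5.0      # duration (s)
    dt = 1e-3    # bin width (s); chosen so that r * dt << 1

    spikes = rng.random(int(T / dt)) < r * dt   # Boolean spike indicator per bin
    spike_times = np.flatnonzero(spikes) * dt   # spike times in seconds

    # The spike count over [0, T] is Poisson with mean r * T, so its
    # variance equals its mean (Fano factor 1), a standard benchmark
    # against which measured spike train variability is compared.
    print(f"{spike_times.size} spikes; expected mean count r*T = {r * T:.0f}")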