| Contents | Page |
| --- | --- |
| Preface | p. xiii |
| I Neural Encoding and Decoding | p. 1 |
| 1 Neural Encoding I: Firing Rates and Spike Statistics | p. 3 |
| 1.1 Introduction | p. 3 |
| 1.2 Spike Trains and Firing Rates | p. 8 |
| 1.3 What Makes a Neuron Fire? | p. 17 |
| 1.4 Spike-Train Statistics | p. 24 |
| 1.5 The Neural Code | p. 34 |
| 1.6 Chapter Summary | p. 39 |
| 1.7 Appendices | p. 40 |
| 1.8 Annotated Bibliography | p. 43 |
| 2 Neural Encoding II: Reverse Correlation and Visual Receptive Fields | p. 45 |
| 2.1 Introduction | p. 45 |
| 2.2 Estimating Firing Rates | p. 45 |
| 2.3 Introduction to the Early Visual System | p. 51 |
| 2.4 Reverse-Correlation Methods: Simple Cells | p. 60 |
| 2.5 Static Nonlinearities: Complex Cells | p. 74 |
| 2.6 Receptive Fields in the Retina and LGN | p. 77 |
| 2.7 Constructing V1 Receptive Fields | p. 79 |
| 2.8 Chapter Summary | p. 81 |
| 2.9 Appendices | p. 81 |
| 2.10 Annotated Bibliography | p. 84 |
| 3 Neural Decoding | p. 87 |
| 3.1 Encoding and Decoding | p. 87 |
| 3.2 Discrimination | p. 89 |
| 3.3 Population Decoding | p. 97 |
| 3.4 Spike-Train Decoding | p. 113 |
| 3.5 Chapter Summary | p. 118 |
| 3.6 Appendices | p. 119 |
| 3.7 Annotated Bibliography | p. 122 |
| 4 Information Theory | p. 123 |
| 4.1 Entropy and Mutual Information | p. 123 |
| 4.2 Information and Entropy Maximization | p. 130 |
| 4.3 Entropy and Information for Spike Trains | p. 145 |
| 4.4 Chapter Summary | p. 149 |
| 4.5 Appendix | p. 150 |
| 4.6 Annotated Bibliography | p. 150 |
| II Neurons and Neural Circuits | p. 151 |
| 5 Model Neurons I: Neuroelectronics | p. 153 |
| 5.1 Introduction | p. 153 |
| 5.2 Electrical Properties of Neurons | p. 153 |
| 5.3 Single-Compartment Models | p. 161 |
| 5.4 Integrate-and-Fire Models | p. 162 |
| 5.5 Voltage-Dependent Conductances | p. 166 |
| 5.6 The Hodgkin-Huxley Model | p. 173 |
| 5.7 Modeling Channels | p. 175 |
| 5.8 Synaptic Conductances | p. 178 |
| 5.9 Synapses on Integrate-and-Fire Neurons | p. 188 |
| 5.10 Chapter Summary | p. 191 |
| 5.11 Appendices | p. 191 |
| 5.12 Annotated Bibliography | p. 193 |
| 6 Model Neurons II: Conductances and Morphology | p. 195 |
| 6.1 Levels of Neuron Modeling | p. 195 |
| 6.2 Conductance-Based Models | p. 195 |
| 6.3 The Cable Equation | p. 203 |
| 6.4 Multi-compartment Models | p. 217 |
| 6.5 Chapter Summary | p. 224 |
| 6.6 Appendices | p. 224 |
| 6.7 Annotated Bibliography | p. 228 |
| 7 Network Models | p. 229 |
| 7.1 Introduction | p. 229 |
| 7.2 Firing-Rate Models | p. 231 |
| 7.3 Feedforward Networks | p. 241 |
| 7.4 Recurrent Networks | p. 244 |
| 7.5 Excitatory-Inhibitory Networks | p. 265 |
| 7.6 Stochastic Networks | p. 273 |
| 7.7 Chapter Summary | p. 276 |
| 7.8 Appendix | p. 276 |
| 7.9 Annotated Bibliography | p. 277 |
| III Adaptation and Learning | p. 279 |
| 8 Plasticity and Learning | p. 281 |
| 8.1 Introduction | p. 281 |
| 8.2 Synaptic Plasticity Rules | p. 284 |
| 8.3 Unsupervised Learning | p. 293 |
| 8.4 Supervised Learning | p. 313 |
| 8.5 Chapter Summary | p. 326 |
| 8.6 Appendix | p. 327 |
| 8.7 Annotated Bibliography | p. 328 |
| 9 Classical Conditioning and Reinforcement Learning | p. 331 |
| 9.1 Introduction | p. 331 |
| 9.2 Classical Conditioning | p. 332 |
| 9.3 Static Action Choice | p. 340 |
| 9.4 Sequential Action Choice | p. 346 |
| 9.5 Chapter Summary | p. 354 |
| 9.6 Appendix | p. 355 |
| 9.7 Annotated Bibliography | p. 357 |
| 10 Representational Learning | p. 359 |
| 10.1 Introduction | p. 359 |
| 10.2 Density Estimation | p. 368 |
| 10.3 Causal Models for Density Estimation | p. 373 |
| 10.4 Discussion | p. 389 |
| 10.5 Chapter Summary | p. 394 |
| 10.6 Appendix | p. 395 |
| 10.7 Annotated Bibliography | p. 396 |
| Mathematical Appendix | p. 399 |
| A.1 Linear Algebra | p. 399 |
| A.2 Finding Extrema and Lagrange Multipliers | p. 408 |
| A.3 Differential Equations | p. 410 |
| A.4 Electrical Circuits | p. 413 |
| A.5 Probability Theory | p. 415 |
| A.6 Annotated Bibliography | p. 418 |
| References | p. 419 |