Quantitative Methods in Neuroscience

The University of Texas at Austin
NEU 466M, Spring 2016
 
lecture: ETC 2.136
time: Tues/Thurs 12:30-2:00 pm
 
computer lab: WEL 2.128
time: Thurs 3:30-5:00 pm
 
instructor: Professor Ila Fiete (fiete [at] mail.clm.utexas.edu)
teaching assistant: Ryota Takaki (rt4.6692016 [at] gmail.com)

office hours:
Dr. Fiete: Thurs 2-3 and Wed 2-2.30 (NHB 3.354) and by appt.
Ryota: Tues 3.30-4.30 (WEL 2.128) and by appt. (NHB 3.350).
 
syllabus: PDF
MATLAB intros/tutorials:
Mathworks tutorial page
MATLAB plotting basics
Intro to summation notation:
Sum notation notes
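The course's numerical work is in MATLAB, but as a quick illustration of how summation notation maps onto code, here is a sketch in Python/NumPy (the array values are made up for the example):

```python
import numpy as np

# Sigma notation S = sum_{i=1}^{N} x_i is just a loop over the terms.
x = np.array([2.0, 4.0, 6.0, 8.0])   # example values (made up)

S = 0.0
for xi in x:         # explicit loop, mirroring the index i = 1..N
    S += xi

S_vec = np.sum(x)    # vectorized equivalent

print(S, S_vec)      # 20.0 20.0
```

The loop and the vectorized call compute the same quantity; in both MATLAB and NumPy, the vectorized form is preferred in practice.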
course schedule
Date   Topic   Reading/Supp Material   Homework
Tues 1.19 Preliminaries, overview.
Thurs 1.21 What is modeling? Compact and predictive descriptions of data. Classification example. Nonlinear regression example: modeling as curve-fitting. Slides Christopher Bishop: Neural Networks for Pattern Recognition, ch 1.
Tues 1.26 Overfitting and cross-validation. Slides
Sample statistics and linear regression. Slides
Thurs 1.28 Time-series: cross- and auto-correlation. Slides Problem set 1
Tues 2.02 Cross- and auto-correlation applied to spike train data. Slides
Thurs 2.04 Spike-triggered average (STA), relationship to cross-correlation. Slides Dayan & Abbott, chapter 1. Bialek et al., Reading a neural code Problem set 2
c1p8.mat
gridcell_halfmsbins.mat
Tues 2.09 STA continued. Slides
Thurs 2.11 Convolution and the retina. Slides Problem set 3
c1p8.mat
generate_STAdata.m
Tues 2.16 Edge-detection in the retina. Slides
Introduction to Wiener-Hopf filtering. Slides
Thurs 2.18 Finding the least-squares optimal kernel: Wiener-Hopf equations. Linear regression, STA kernels as special cases. Matrix introduction. Slides Problem set 4
plant.mat
generate_STAdata.m
Tues 2.23 Basics of linear algebra I: matrices, vectors, sums, products. Slides Nice set of introductory notes on linear algebra by Z. Kolter.
Thurs 2.25 Basics of linear algebra II: vector spaces, linear independence, basis. Problem set 5
Tues 3.01 Basics of linear algebra III: rank, over- and under-determined problems.
Thurs 3.03 Pseudoinverse for solving over- and under-determined problems (linear regression revisited). No homework (study for the midterm). Solutions to problem sets 1-5
Tues 3.08 Midterm exam (in-class, closed-book).
Thurs 3.10 Change-of-basis, eigenvalues and eigenvectors, PCA.
Tues 3.15 No class -- spring break
Thurs 3.17 No class -- spring break
Tues 3.22
Thurs 3.24
Tues 3.29
Thurs 3.31
Tues 4.05
Thurs 4.07
Tues 4.12
Thurs 4.14
Tues 4.19
Thurs 4.21
Tues 4.26
Thurs 4.28
Tues 5.03
Thurs 5.05 Final exam (in-class).
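Several of the lectures above center on the spike-triggered average (STA). As a minimal sketch of the idea, here is a toy version in Python/NumPy; all data and parameters below are synthetic and illustrative, not taken from the course's .mat files or problem sets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy white-noise stimulus and a known linear kernel (both made up)
T, K = 5000, 20
stim = rng.standard_normal(T)
kernel = np.exp(-np.arange(K) / 5.0)          # decaying temporal filter

# Spikes generated by thresholding the filtered stimulus
drive = np.convolve(stim, kernel, mode="full")[:T]
spikes = (drive > 1.5).astype(float)

# STA: the mean stimulus window preceding each spike
spike_times = np.nonzero(spikes)[0]
spike_times = spike_times[spike_times >= K]   # need a full window of history
sta = np.mean([stim[t - K + 1 : t + 1] for t in spike_times], axis=0)

print(sta.shape)  # (20,)
```

For this linear-threshold toy model the STA recovers (up to scale and time-reversal) the kernel used to generate the spikes, which is the point of the estimator.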
 
further reading:
  If you leave this class with a bigger appetite for computational neuroscience (and I hope you do!), there are a number of good next steps. I recommend Theoretical Neuroscience (Dayan & Abbott; brief surveys of a wide range of topics), Spikes (Rieke et al.; a monograph on information in neural activity and codes), Biophysics of Computation (Koch; a monograph focused on single-neuron biophysics), and Analysis of Neural Data (Kass, Eden & Brown; the title says it all: statistical analysis of neural data). These are generally upper-division undergraduate/graduate texts, but relatively accessible nevertheless.
 
 

page maintained by Ila Fiete and Kijung Yoon