VS265: Syllabus

Syllabus

Aug. 28: Introduction

  1. Theory and modeling in neuroscience
  2. Goals of AI/machine learning vs. theoretical neuroscience
  3. Turing vs. neural computation

Sept. 2,4: Neuron models

  1. Membrane equation, compartmental model of a neuron (a short integration sketch follows this list)
  2. Linear systems: vectors, matrices, linear neuron models
  3. Perceptron model and linear separability
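
As a concrete companion to the membrane equation, here is a minimal sketch that integrates the passive membrane ODE C_m dV/dt = -g_L (V - E_L) + I(t) with forward Euler; the parameter values and the current step are illustrative choices, not ones from the lecture.

```python
import numpy as np

# Forward-Euler integration of the passive membrane equation
#   C_m dV/dt = -g_L (V - E_L) + I(t)
# All parameter values are illustrative.
C_m = 1.0    # membrane capacitance (uF/cm^2)
g_L = 0.1    # leak conductance (mS/cm^2)
E_L = -65.0  # leak reversal potential (mV)
dt = 0.1     # integration time step (ms)
T = 200.0    # total simulated time (ms)

n = int(T / dt)
V = np.empty(n)
V[0] = E_L
I = np.zeros(n)
I[int(50 / dt):int(150 / dt)] = 1.0  # 1 uA/cm^2 current step, 50-150 ms

for t in range(n - 1):
    dV = (-g_L * (V[t] - E_L) + I[t]) / C_m
    V[t + 1] = V[t] + dt * dV

# The membrane charges exponentially toward E_L + I/g_L with time
# constant tau = C_m / g_L = 10 ms.
print("V near end of step: %.2f mV (predicted %.2f mV)"
      % (V[int(149 / dt)], E_L + 1.0 / g_L))
```

The same scheme extends to a compartmental model by coupling adjacent compartments through axial conductances.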

Sept. 9,11: Guest lectures

  1. TBD
  2. Paul Rhodes, Evolved Machines: Multi-compartment models; dendritic integration

Sept. 16,18: Supervised learning

  1. Perceptron learning rule (see the sketch after this list)
  2. Adaptation in linear neurons, Widrow-Hoff rule
  3. Objective functions and gradient descent
  4. Multilayer networks and backpropagation
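
A minimal sketch of the perceptron learning rule on an invented, linearly separable toy problem (the data, seed, and ground-truth hyperplane are all illustrative): each misclassified example pulls the weight vector toward its correct side, and by the perceptron convergence theorem the updates stop after finitely many mistakes on separable data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy problem: labels +/-1 from a known separating hyperplane.
X = rng.normal(size=(200, 2))
y = np.sign(X @ np.array([1.0, -2.0]) + 0.5)

# Fold the bias in as a constant extra input.
Xb = np.hstack([X, np.ones((len(X), 1))])

# Perceptron rule: on each mistake, w <- w + y * x.
w = np.zeros(3)
for epoch in range(100):
    mistakes = 0
    for x, t in zip(Xb, y):
        if np.sign(x @ w) != t:
            w += t * x
            mistakes += 1
    if mistakes == 0:
        break

print("epochs used:", epoch + 1, "| final mistakes:", mistakes, "| w:", w)
```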

Sept. 23,25: Unsupervised learning

  1. Linear Hebbian learning and PCA, decorrelation (see the Oja's-rule sketch after this list)
  2. Winner-take-all networks and clustering
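
A minimal sketch of linear Hebbian learning stabilized by Oja's decay term (the covariance matrix, seed, and learning rate are illustrative): the rule dw = eta * y * (x - y * w) drives w to a unit vector along the leading principal component of the inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated 2-D inputs; PC1 is the direction of maximum variance.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

# Oja's rule: Hebbian growth (y * x) plus normalizing decay (-y^2 * w).
w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

evals, evecs = np.linalg.eigh(C)
pc1 = evecs[:, np.argmax(evals)]
print("learned w:", w / np.linalg.norm(w))  # sign is arbitrary
print("top eigenvector:", pc1)
```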

Sept. 30: Guest lecture

  1. TBD

Oct. 2: Sparse, distributed coding

  1. Autoencoders (see the sketch after this list)
  2. Natural image statistics
  3. Projection pursuit
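
A minimal sketch of an autoencoder in the simplest setting: linear units with tied encoder/decoder weights, trained by gradient descent on invented data confined to a low-dimensional subspace (all sizes and the learning rate are illustrative). With a k-unit bottleneck, the optimum spans the top-k principal subspace of the data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented data: 10-D inputs that actually lie in a 3-D linear subspace.
A = rng.normal(size=(10, 3))
X = rng.normal(size=(500, 3)) @ A.T

# Tied-weight linear autoencoder: x_hat = W (W^T x).
k, eta = 3, 0.001
W = 0.1 * rng.normal(size=(10, k))
for step in range(2000):
    H = X @ W                       # hidden codes, shape (500, k)
    E = H @ W.T - X                 # reconstruction errors
    grad = (X.T @ E + E.T @ X) @ W  # gradient of 0.5 * ||E||^2 wrt W
    W -= eta * grad / len(X)

print("final reconstruction MSE:", np.mean(E ** 2))
```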

Oct. 7: Plasticity and cortical maps

  1. Cortical maps
  2. Self-organizing maps, Kohonen nets (see the sketch after this list)
  3. Models of experience-dependent learning and cortical reorganization
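
A minimal sketch of a 1-D Kohonen self-organizing map (all schedules and sizes are illustrative): a chain of units is fit to inputs from the unit square, with the winner and its chain neighbors pulled toward each sample while the learning rate and neighborhood width shrink, yielding a topology-preserving map.

```python
import numpy as np

rng = np.random.default_rng(3)

n_units = 20
W = rng.uniform(size=(n_units, 2))  # one 2-D weight vector per chain unit
idx = np.arange(n_units)

n_steps = 5000
for t in range(n_steps):
    x = rng.uniform(size=2)
    winner = np.argmin(np.sum((W - x) ** 2, axis=1))
    # Anneal learning rate and neighborhood width over time.
    frac = 1.0 - t / n_steps
    eta = 0.5 * frac + 0.01
    sigma = 3.0 * frac + 0.5
    # Gaussian neighborhood on the chain around the winning unit.
    h = np.exp(-(idx - winner) ** 2 / (2 * sigma ** 2))
    W += eta * h[:, None] * (x - W)

# After training, neighbors on the chain should have nearby weights.
print("mean neighbor distance:",
      np.mean(np.linalg.norm(np.diff(W, axis=0), axis=1)))
```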

Oct. 9: Guest lecture

  1. TBD

Oct. 14: Manifold learning

  1. Locally linear embedding (LLE), Isomap (see the Isomap sketch below)
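
A minimal Isomap sketch on a synthetic Swiss roll, assuming scikit-learn is available (the dataset, neighbor count, and seed are illustrative): Isomap builds a k-nearest-neighbor graph, approximates geodesic distances by graph shortest paths, and embeds those distances with classical MDS.

```python
# Requires scikit-learn.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, t = make_swiss_roll(n_samples=1000, random_state=0)  # 3-D points

# Two-dimensional embedding that "unrolls" the manifold.
Y = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(Y.shape)  # (1000, 2)
```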

Oct. 16: Guest lecture

  1. Tom Dean, Google: Connectomics

Oct. 21,23,28,30: Recurrent networks

  1. Hopfield networks (see the pattern-completion sketch after this list)
  2. Models of associative memory, pattern completion
  3. Line attractors and ‘bump circuits’
  4. Dynamical models
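
A minimal sketch of a Hopfield network performing pattern completion (network size, pattern count, and corruption level are illustrative): random patterns are stored with the Hebbian outer-product rule, and asynchronous threshold updates, which never increase the energy E = -1/2 s^T W s, settle a corrupted cue into the nearest stored attractor.

```python
import numpy as np

rng = np.random.default_rng(4)

# Store P random +/-1 patterns in an N-unit Hopfield network.
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)  # no self-connections

# Corrupt one stored pattern by flipping 20% of its bits.
s = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
s[flip] *= -1

# Asynchronous updates until the state settles into a fixed point.
for sweep in range(10):
    for i in rng.permutation(N):
        s[i] = 1 if W[i] @ s >= 0 else -1

print("overlap with stored pattern:", (s @ patterns[0]) / N)  # ~1.0 on recall
```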

Nov. 4,6,13,18,20,25: Probabilistic models and inference

  1. Probability theory and Bayes’ rule
  2. Learning and inference in generative models
  3. The mixture of Gaussians model
  4. Boltzmann machines
  5. Sparse coding and ‘ICA’
  6. Kalman filter model (see the sketch after this list)
  7. Energy-based models
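
A minimal sketch of a one-dimensional Kalman filter (dynamics and noise parameters invented for illustration): the latent state follows a noisy random walk observed through noisy measurements, and the predict/update recursion tracks the posterior mean and variance, beating the raw measurements in mean squared error.

```python
import numpy as np

rng = np.random.default_rng(5)

# Generative model: x_t = a x_{t-1} + process noise,  y_t = x_t + meas. noise.
a, q, r = 0.99, 0.1, 1.0
T = 200
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(scale=np.sqrt(q))
    y[t] = x[t] + rng.normal(scale=np.sqrt(r))

mu, P = 0.0, 1.0  # prior mean and variance of the latent state
est = np.zeros(T)
for t in range(T):
    # Predict one step forward.
    mu, P = a * mu, a * a * P + q
    # Update with the measurement y[t].
    K = P / (P + r)            # Kalman gain
    mu = mu + K * (y[t] - mu)
    P = (1 - K) * P
    est[t] = mu

print("filter MSE: %.3f | raw measurement MSE: %.3f"
      % (np.mean((est - x) ** 2), np.mean((y - x) ** 2)))
```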

Dec. 2,4: Neural implementations

  1. Integrate-and-fire model (see the sketch after this list)
  2. Neural encoding and decoding
  3. Limits of precision in neurons
  4. Neural synchrony and phase-based coding
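
A minimal sketch of the leaky integrate-and-fire model (parameter values illustrative): the passive membrane dynamics from earlier in the course, plus a spike threshold and reset; a constant suprathreshold current then produces regular firing.

```python
# All parameter values are illustrative.
tau = 10.0       # membrane time constant (ms)
V_rest = -65.0   # resting potential (mV)
V_th = -50.0     # spike threshold (mV)
V_reset = -70.0  # post-spike reset (mV)
R = 10.0         # membrane resistance (MOhm)
I = 2.0          # constant input current (nA)
dt = 0.1         # time step (ms)
T = 200.0        # simulated time (ms)

V = V_rest
spikes = []
for step in range(int(T / dt)):
    # Leaky integration toward V_rest + R*I.
    V += dt / tau * (-(V - V_rest) + R * I)
    if V >= V_th:
        spikes.append(step * dt)
        V = V_reset

print("%d spikes, rate ~ %.0f Hz" % (len(spikes), 1000 * len(spikes) / T))
```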

Dec. 9,11: Guest lectures

  1. TBD
  2. TBD