Schedule // Machine Learning and Statistics
- Week 1
  Recap of probability theory: conditional probabilities, Bayes’ theorem with several examples. Statistical inference.
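
  As a worked instance of the kind of Bayes’ theorem example treated here, a standard diagnostic-test calculation (the numbers are illustrative, not from the course):

  ```latex
  % Bayes' theorem for a diagnostic test. Illustrative numbers:
  % prevalence P(D) = 0.01, sensitivity P(+|D) = 0.99,
  % false-positive rate P(+|not D) = 0.05.
  \[
  P(D \mid +)
    = \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \neg D)\,P(\neg D)}
    = \frac{0.99 \cdot 0.01}{0.99 \cdot 0.01 + 0.05 \cdot 0.99}
    \approx 0.17
  \]
  ```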
- Week 2
  Supervised learning: the perceptron. AND, OR and XOR functions. The perceptron learning rule.
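
  A minimal sketch of the perceptron learning rule on the AND function (dataset encoding, learning rate, and epoch count are illustrative); since XOR is not linearly separable, the same loop would never converge on it:

  ```python
  import numpy as np

  # Perceptron learning rule on the AND function. A bias input is
  # appended as a third, always-on component. Hyperparameters illustrative.
  X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])  # last column: bias
  y = np.array([0, 0, 0, 1])                                  # AND targets

  w = np.zeros(3)
  eta = 0.1

  for epoch in range(20):
      for x, t in zip(X, y):
          out = 1 if w @ x > 0 else 0   # threshold unit
          w += eta * (t - out) * x      # update only on mistakes

  print([1 if w @ x > 0 else 0 for x in X])  # -> [0, 0, 0, 1]
  ```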
- Week 3
  Gradient descent learning. Learning as inference.
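
  A minimal sketch of gradient descent for a single linear unit under squared error (the data and hyperparameters are invented for illustration):

  ```python
  import numpy as np

  # Gradient descent on the squared error of a single linear unit,
  # y_hat = X @ w. Data and hyperparameters are illustrative.
  rng = np.random.default_rng(0)
  X = rng.normal(size=(100, 2))
  w_true = np.array([1.5, -0.5])
  y = X @ w_true

  w = np.zeros(2)
  eta = 0.05

  for step in range(500):
      err = X @ w - y            # prediction errors
      grad = X.T @ err / len(X)  # gradient of L(w) = ||X @ w - y||^2 / (2 * len(X))
      w -= eta * grad            # descent step

  print(w)  # -> approximately [1.5, -0.5]
  ```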
- Week 4
  Multi-layer networks: XOR function; error back-propagation. Deep learning.
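
  A minimal sketch of a multi-layer network trained on XOR by error back-propagation, assuming sigmoid units and full-batch updates (architecture and hyperparameters are illustrative):

  ```python
  import numpy as np

  # A 2-4-1 sigmoid network trained on XOR with full-batch error
  # back-propagation. Architecture and hyperparameters are illustrative.
  rng = np.random.default_rng(1)
  X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
  y = np.array([[0.], [1.], [1.], [0.]])

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input -> hidden
  W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output
  eta = 0.5

  for step in range(10000):
      h = sigmoid(X @ W1 + b1)              # forward pass
      out = sigmoid(h @ W2 + b2)
      d_out = (out - y) * out * (1 - out)   # delta at the output layer
      d_h = (d_out @ W2.T) * h * (1 - h)    # delta back-propagated to hidden
      W2 -= eta * h.T @ d_out;  b2 -= eta * d_out.sum(axis=0)
      W1 -= eta * X.T @ d_h;    b1 -= eta * d_h.sum(axis=0)

  print(out.round(2).ravel())  # -> typically close to [0, 1, 1, 0]
  ```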
- Week 5
  Unsupervised learning: Principal Component Analysis, Oja’s rule.
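
  A minimal sketch of Oja’s rule extracting the first principal component of a synthetic two-dimensional data cloud (data and learning rate are illustrative):

  ```python
  import numpy as np

  # Oja's rule extracting the first principal component of a synthetic
  # 2-D cloud whose variance is largest along the first axis.
  rng = np.random.default_rng(0)
  X = rng.normal(size=(2000, 2)) * np.array([2.0, 1.0])  # anisotropic data

  w = np.array([1.0, 1.0]) / np.sqrt(2.0)  # initial weight vector
  eta = 0.01

  for x in X:
      v = w @ x                    # unit's output
      w += eta * v * (x - v * w)   # Hebbian term with Oja's decay

  print(w)  # -> approximately (+/-1, 0): the leading principal direction
  ```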
- Week 6
  Recap of statistical mechanics: ensembles, statistical entropy, the Ising model with nearest-neighbor interactions.
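
  For reference, the standard definitions this week builds on:

  ```latex
  % Nearest-neighbor Ising model: spins s_i = +1 or -1, coupling J, field h.
  \[
  H(\mathbf{s}) = -J \sum_{\langle ij \rangle} s_i s_j - h \sum_i s_i ,
  \qquad
  P(\mathbf{s}) = \frac{e^{-\beta H(\mathbf{s})}}{Z} ,
  \qquad
  Z = \sum_{\mathbf{s}} e^{-\beta H(\mathbf{s})} ,
  \qquad
  \beta = \frac{1}{k_B T}
  \]
  ```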
- Week 7
  Auto-associative memory and attractor neural networks. Hebbian learning rule, synaptic plasticity. Memory as an Ising model: the Hopfield network. Statistical mechanics of the Hopfield network.
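
  A minimal sketch of a Hopfield network with the Hebbian learning rule: store a few random patterns, then retrieve one from a corrupted cue (network size and noise level are illustrative):

  ```python
  import numpy as np

  # Hopfield network: Hebbian storage of P random patterns in N binary
  # units, then retrieval from a corrupted cue. Sizes are illustrative.
  rng = np.random.default_rng(0)
  N, P = 200, 5
  patterns = rng.choice([-1, 1], size=(P, N))

  W = patterns.T @ patterns / N    # Hebbian learning rule
  np.fill_diagonal(W, 0)           # no self-coupling

  s = patterns[0].copy()
  flip = rng.choice(N, size=30, replace=False)
  s[flip] *= -1                    # corrupt 15% of the bits

  for sweep in range(10):          # zero-temperature asynchronous updates
      for i in rng.permutation(N):
          s[i] = 1 if W[i] @ s >= 0 else -1

  print(np.mean(s == patterns[0]))  # -> 1.0: the stored memory is recalled
  ```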
- Week 8
  Computation of the capacity of the Hopfield network using mean-field theory. Absence of spurious retrieval states; the phase diagram in the temperature-storage plane. The brain as an anticipating machine: Boltzmann machines and Helmholtz machines learn probability distributions. The wake-sleep learning rule.
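
  Two standard results behind this week’s calculation, stated here as orientation rather than as the course’s own derivation:

  ```latex
  % Low-loading limit (P/N -> 0): the overlap m of the network state with
  % one condensed pattern satisfies a Curie-Weiss-type mean-field equation,
  % with nonzero (retrieval) solutions below the critical temperature T = 1:
  \[
  m = \tanh(\beta m)
  \]
  % At finite loading alpha = P/N, the mean-field (replica) calculation of
  % Amit, Gutfreund and Sompolinsky gives the zero-temperature capacity:
  \[
  \alpha_c \approx 0.138
  \]
  ```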
- Week 9
  Backpropagation through time. Reinforcement learning.
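
  A minimal sketch of backpropagation through time for a one-parameter linear recurrence (the task and numbers are invented for illustration):

  ```python
  import numpy as np

  # Backpropagation through time for the one-parameter recurrence
  # h_t = w * h_{t-1} + x_t. The target sequence is generated by w* = 0.5;
  # task and hyperparameters are illustrative.
  x = np.array([1.0, 0.0, 0.0, 0.0])
  target = np.array([1.0, 0.5, 0.25, 0.125])
  T = len(x)

  w, eta = 0.0, 0.1
  for step in range(300):
      h = np.zeros(T + 1)                 # forward: unroll the recurrence
      for t in range(T):
          h[t + 1] = w * h[t] + x[t]
      dL_dh = np.zeros(T + 1)             # backward: sweep through time
      grad = 0.0
      for t in reversed(range(T)):
          dL_dh[t + 1] += h[t + 1] - target[t]  # local squared-error term
          grad += dL_dh[t + 1] * h[t]           # direct dh_{t+1}/dw = h_t
          dL_dh[t] = dL_dh[t + 1] * w           # propagate one step back
      w -= eta * grad

  print(w)  # -> approximately 0.5
  ```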
- Week 10
  The temporal credit assignment problem and its solution using temporal difference learning. The actor/critic model.
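
  A minimal sketch of tabular TD(0) on the classic five-state random-walk task (hyperparameters are illustrative); the critic in an actor/critic model learns state values with exactly this kind of update:

  ```python
  import numpy as np

  # Tabular TD(0) on the five-state random walk: start in the middle,
  # move left or right at random; exiting right pays reward 1, exiting
  # left pays 0. Hyperparameters are illustrative.
  rng = np.random.default_rng(0)
  n = 5
  V = np.zeros(n)
  alpha, gamma = 0.1, 1.0

  for episode in range(5000):
      s = n // 2
      while True:
          s2 = s + rng.choice([-1, 1])
          if s2 < 0:                                # terminal, reward 0
              V[s] += alpha * (0.0 - V[s])
              break
          if s2 >= n:                               # terminal, reward 1
              V[s] += alpha * (1.0 - V[s])
              break
          V[s] += alpha * (gamma * V[s2] - V[s])    # TD-error update
          s = s2

  print(V.round(2))  # -> roughly [0.17, 0.33, 0.5, 0.67, 0.83]
  ```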