syllabus // neural computation
See the schedule for topics by week and links to slides.
Applied Mathematics III: Neural Computation
This is a draft syllabus, subject to change.
Goals: Students completing this course should be able to deploy modern methods in computational inference and in feedforward and recurrent neural networks, and develop familiarity with the underlying theory and assumptions. Students will be able to design and efficiently train neural networks using supervised, unsupervised, and reinforcement learning algorithms.
Topics: information theory, statistical inference, generative models, mean field theory, feedforward and recurrent neural networks, supervised/unsupervised/reinforcement learning.
Prerequisites
Calculus, linear algebra, probability, and Python, as presented in the Fall and Winter courses. The course will include a recap of information theory and statistical mechanics.
Schedule overview:
- Week 1: Recap of probability theory. Statistical inference.
- Weeks 2-4: Supervised learning. Error backpropagation.
- Week 5: Unsupervised learning.
- Weeks 6-8: Statistical mechanics. Hopfield network. Boltzmann machines.
- Weeks 9-10: Backpropagation through time. Reinforcement learning.
Homework and assessment:
Weekly homework will consist of a mix of pen-and-paper problems and computational exercises. Computational homework will be submitted as Jupyter notebooks, which allow programming, simulation results, and LaTeX to be integrated into a single document. There will be a final exam, taken in class on paper; we will assess understanding of computational methods by, for instance, asking for written descriptions of algorithms. A minimal sketch of what a notebook cell might look like follows.
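For illustration, here is a minimal sketch of the kind of cell a computational homework notebook might contain: a toy supervised-learning simulation (gradient descent on a single linear neuron), which in a real submission would sit alongside a LaTeX derivation of the gradient. The specific problem and the numpy/matplotlib packages are illustrative assumptions, not course requirements.

```python
# Illustrative sketch only: the course does not prescribe this problem or
# these packages; it simply shows the notebook style (code + results).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Toy supervised-learning problem: fit a single linear neuron y = w . x
# by gradient descent on the mean squared error.
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(200, 2))
y = X @ w_true + 0.1 * rng.normal(size=200)

w = np.zeros(2)      # initial weights
lr = 0.05            # learning rate
losses = []
for _ in range(100):
    err = X @ w - y
    grad = X.T @ err / len(y)   # gradient of the mean squared error
    w -= lr * grad
    losses.append(np.mean(err ** 2))

plt.plot(losses)
plt.xlabel("iteration")
plt.ylabel("mean squared error")
plt.title("Gradient descent on a linear neuron")
plt.show()

print("estimated weights:", w)
```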
Software
Computational work will use Python and Jupyter notebooks, as described under Prerequisites and Homework above.
Books
We will not follow a single text, but students who would like additional reading might find these useful:
- Hertz, Krogh & Palmer, Introduction to the Theory of Neural Computation.
- David MacKay, Information Theory, Inference, and Learning Algorithms.
- Mézard, Parisi & Virasoro, Spin Glass Theory and Beyond. A classic statistical mechanics reference for the theory of spin glasses and the Hopfield network.
- Dayan & Abbott, Theoretical Neuroscience.
- Dalvit et al., Problems on Statistical Mechanics.
Lecture notes:
Exhaustive lecture notes for the class are available here.
Inclusion and accessibility
Please tell us your pronouns and/or preferred name, especially if they differ from the class roster. We take seriously our responsibility to create inclusive learning environments. Please notify us if there are aspects of the instruction or design of this course that result in barriers to your participation! You are also encouraged to contact the Accessible Education Center in 164 Oregon Hall at 541-346-1155 or uoaec@uoregon.edu.
We are committed to making our classroom an inclusive and respectful learning space. Being respectful includes using preferred pronouns for your classmates. Your classmates come from a diverse set of backgrounds and experiences; please avoid assumptions or stereotypes, and aim for inclusivity. Let us know if there are classroom dynamics that impede your (or someone else’s) full engagement.
Please see this page for more information on campus resources, academic integrity, discrimination, and harassment (and reporting of it).