UO Graduate Applied Mathematics
This is the website for the graduate series on applied mathematics at UO, with a focus on the mathematical foundations of machine learning.
Courses:
- Fall
Goals: proficiency with computational tools and methods; familiarity with the properties of common stochastic processes and their uses in modeling and computation; ability to simulate and visualize these processes (a brief simulation sketch follows the course listing below).
Topics: graphs and networks, experimental mathematics and computation, Brownian motion and Gaussian processes, point processes, diffusions and associated PDEs.
Instructors: Peter Ralph (plr@uoregon.edu) and Nicolae Istrate (nistrate@uoregon.edu)
- Winter
Goals: proficiency with foundational approaches in statistical and machine learning; ability to develop and implement these algorithms.
Topics: linear models, classification, kernel methods, mixture models and expectation maximization; inference for sequential data with hidden Markov models and linear dynamical systems.
Instructors: James Murray (jmurray9@uoregon.edu)
- Spring
Goals: ability to deploy modern methods in computational inference and in feedforward and recurrent neural networks, and familiarity with the underlying theory and assumptions.
Topics: information theory, statistical inference, generative models, mean field theory, feedforward and recurrent neural networks, supervised and unsupervised learning.
Instructors: Luca Mazzucato (lmazzuca@uoregon.edu)
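As a concrete taste of the simulate-and-visualize goal above, here is a minimal sketch in Python (illustrative only; the details are not part of the syllabus). It simulates a few sample paths of standard Brownian motion by cumulatively summing independent Gaussian increments, then plots them:

```python
# Illustrative sketch only (not course material): simulate and plot a few
# sample paths of standard Brownian motion on [0, 1].
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=0)
n_paths, n_steps, T = 5, 1000, 1.0
dt = T / n_steps

# Increments over disjoint time intervals are independent N(0, dt);
# each path is the cumulative sum of its increments, starting from 0.
increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=(n_paths, n_steps))
paths = np.hstack([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)])

t = np.linspace(0.0, T, n_steps + 1)
for path in paths:
    plt.plot(t, path)
plt.xlabel("time t")
plt.ylabel("B(t)")
plt.title("Sample paths of standard Brownian motion")
plt.show()
```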
Prerequisites
Familiarity with computer programming, equivalent to that gained in an introductory undergraduate programming course, is assumed. Students who do not enter with this background might take such a course in their first year, e.g., through CIS or as an independent study (reading) course, and take this sequence in their second year.
- Useful links to reference and background material.
- Preparatory classes that you might want to take if you’re not ready for this one.
Course overview
The main goals of this sequence are for students to gain (a) proficiency with some mathematical tools and theories important in applied work; (b) experience with skills and tools of applied work, including modeling, computational methods, programming, and translation/communication. The sequence will not cover an exhaustive list of topics in applied math, but will instead cover a smaller set in greater depth, with the overall goal that students can apply the tools and methods in real-world situations, and have a firm understanding of the underlying theory and associated assumptions.
Working in applied mathematics requires skill in modeling, communication, and computation. Modeling is the act of translating to and from mathematics, and requires a deep understanding of the mathematical objects involved and the ability to recognize and adapt their uses for domain-specific applications; it also requires knowledge of the statistical methods needed to assess agreement between models and empirical observations. Communicating effectively requires a similarly strong familiarity with the properties and underlying assumptions of these mathematical structures: both to translate important problems in the applied field into mathematics, and to explain models, predictions, and inferences in a way that maximizes the impact of the mathematical work. Computation is an essential part of most applied work, and requires skills in numerical computation, simulation of stochastic processes, statistical inference, and machine learning.
This sequence aims to teach these skills in the context of three general and interrelated topics of growing importance: combinatorics/computation; modeling/stochastic processes; and machine learning/statistics. Motivating examples will be drawn from various applied fields, with an emphasis on biological systems.
Assignments and evaluation
Weekly homeworks will cover theory (proofs and analytical calculations), application (e.g., modeling exercises), and computation (writing computer code for simulations or data analysis). Final exams will be in-class and will not involve programming, but may, for instance, ask students to outline algorithms. The qualifying exam will focus on theoretical and conceptual understanding; it will be structured similarly to the other exams and will not involve computer programming (though it may include questions about computational techniques).
Software
We will work extensively with Python, using Jupyter notebooks for demonstrations and hands-on assignments.
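For instance, a hands-on exercise might look something like the following minimal sketch (a made-up illustration, not an actual assignment from the course): simulate data from a simple linear model and recover its coefficients by least squares.

```python
# Minimal illustrative sketch of a notebook-style exercise (not an actual assignment):
# simulate data from a linear model and recover its coefficients by least squares.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 200
true_coefs = np.array([2.0, -1.0])   # intercept and slope, chosen arbitrarily

# Design matrix with an intercept column and one uniformly sampled predictor
X = np.column_stack([np.ones(n), rng.uniform(-1.0, 1.0, size=n)])
y = X @ true_coefs + rng.normal(scale=0.5, size=n)   # add Gaussian noise

# Ordinary least squares fit
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated coefficients:", coefs)   # should be close to [2.0, -1.0]
```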
Previous versions
A similar series was taught in 2019-2020; the schedule, along with some exercises and notes, can be found here.
Preparatory classes
If you’re not familiar with programming, we recommend spending some time on an introductory Python module (there are many available online; find one that suits you).
A good introductory class on these topics might be “Machine Learning and Statistics”, Math 410/510, taught by Luca Mazzucato. Here’s a short description of that class:
This class is strongly interdisciplinary and aimed at undergraduate and graduate students from Mathematics, Physics, Biology, Computer Science, Psychology, and Economics, but open to everybody on campus, including faculty and postdocs. Students taking this class will develop the ability to deploy modern methods in machine learning and computational inference, familiarity with the underlying theory and assumptions, and proficiency in practical applications. Covered topics include: information theory, the bias/variance tradeoff, statistical inference, neural networks, supervised and unsupervised learning, and deep learning. The course will be a mix of theoretical methods and practical simulations run in Python using Jupyter notebooks. Students will work out many examples in full detail, with more emphasis on problem-solving strategies than on formal constructions and proofs. Course prerequisites include familiarity with computer programming in Python, equivalent to that gained in an introductory undergraduate programming course. Basic knowledge of Calculus, Linear Algebra, and Probability is recommended.