APPM Special Topics

This is a list of syllabi, descriptions, and schedules for special topics courses offered in previous semesters:

Courses

Spring 2026 Courses

Data-driven discovery methods are revolutionizing the modeling, prediction, and control of complex systems. These methods reveal governing equations directly from data, and their use has exploded over the last six years. The class will illustrate methods that integrate the modeling and control of dynamical systems with modern methods in data science, machine learning, and computational and applied mathematics. It will also highlight many recent advances in scientific computing that enable data-driven methods to be applied to a diverse range of complex systems, such as cell migration, turbulence, the brain, climate, epidemiology, robotics, and autonomy.

Description


Prereqs: APPM 5600/5610 or instructor permission

Recommended prereqs: The course will touch on topics in numerical linear algebra, probability, PDEs, analysis, and scientific computing, so prior coursework on these topics, or an interest in learning more about them, is helpful. Some programming experience (e.g., MATLAB or Python) is also helpful.


In many areas of scientific computing, how the runtime and storage of our methods for simulation and data analysis "scale" with the number of unknowns n is an important limiting factor: if runtime scales like n^3, we will need either 1000 times more computing power or 1000 times more time (and energy) to handle 10 times more unknowns (e.g., to "zoom into" features we care about). In this course, we will learn a number of impactful "fast algorithms" that reduce the asymptotic costs of key operations to O(n) or O(n log n). We will discuss mathematical and scientific context, numerical analysis, efficient implementation, and applications to current research. Topics include: fast methods for structured linear algebra (sparse and dense), fast transforms (e.g., the FFT and its non-uniform cousins, wavelets), fast summation methods (Fast Multipole, Ewald), low-rank compression of matrices and tensors (with applications to reduced-order modeling and data analysis), and randomized numerical methods. Optional topics can be added in response to student interest.
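As a small illustration of the kind of speedup the course is about (a sketch, not course material): the discrete Fourier transform costs O(n^2) operations if computed directly from its definition, while the radix-2 Cooley-Tukey FFT computes the same result in O(n log n). The snippet below implements both in plain Python and checks that they agree.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform straight from the definition: O(n^2)."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
            for k in range(n)]

def fft(x):
    """Radix-2 Cooley-Tukey FFT: O(n log n); n must be a power of 2."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    # Combine half-size transforms using the "twiddle factors" exp(-2*pi*i*k/n).
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddle[k] for k in range(n // 2)] +
            [even[k] - twiddle[k] for k in range(n // 2)])

signal = [complex(v) for v in (1, 2, 3, 4, 5, 6, 7, 8)]
assert all(abs(a - b) < 1e-9 for a, b in zip(dft(signal), fft(signal)))
```

The asymptotic gap is exactly the scaling issue described above: doubling n quadruples the work in `dft` but only slightly more than doubles it in `fft`.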


This course will involve several homework assignments and a final project (*both* in small groups); the final project is encouraged to synergize with current or future research interests.


Prereqs: APPM 3570 or equivalent

Description: Random graphs, also called random networks, have been used to understand the robustness of the Internet, study food webs of predator-prey interactions, and predict unknown metabolic interactions, among countless other applications. This course introduces and analyzes several key random graph models, including the Erdős-Rényi and Stochastic Block models. It presents these and other topics related to discrete random structures in a coherent and self-contained manner to facilitate their use in modeling and analyzing more general random networks. The course should be especially appealing to undergraduate and graduate students who seek intuition as well as a mathematical exposition of random graph theory.

Note: This course is distinct from, but complementary to, Dynamics on Networks (APPM 4/5720), taught in the Fall 2019 semester by J. Restrepo.
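To give a flavor of the simplest model mentioned above (a sketch, not course material): in the Erdős-Rényi model G(n, p), each of the n(n-1)/2 possible edges is present independently with probability p, so the expected number of edges is p*n(n-1)/2. A minimal sampler:

```python
import random
from itertools import combinations

def erdos_renyi(n, p, rng):
    """Sample an Erdos-Renyi G(n, p) graph: each of the C(n, 2) possible
    edges between n labeled vertices is included independently with
    probability p. Returns the edge list."""
    return [(u, v) for u, v in combinations(range(n), 2) if rng.random() < p]

rng = random.Random(0)       # fixed seed for reproducibility
n, p = 200, 0.1
edges = erdos_renyi(n, p, rng)
expected = p * n * (n - 1) / 2   # expected edge count: 1990
```

For these parameters the realized edge count concentrates tightly around the expectation, one of the basic phenomena the course makes precise.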

COURSE PREREQUISITES
• Math: Multivariable Calculus, Linear Algebra (solid intermediate level, equivalent to APPM 3310), knowledge of regression methods and statistics.
• Coding: Python (solid intermediate level).
• This will be a fast-paced course geared towards graduate students and advanced undergraduates with a solid background in math and coding.


COURSE DESCRIPTION:
This course provides a hands-on introduction to modern Large Language Models (LLMs) using PyTorch and TensorFlow. Students will gain practical experience with the technical stack essential for developing and deploying large-scale machine learning models, equipping them with the skills required for modern data science and machine learning work.

The course is structured around two major projects: building an LLM from scratch and fine-tuning an existing LLM using pre-trained models from open sources such as Hugging Face. Through these projects, students will explore key deep learning concepts, including but not limited to transformer architectures, attention mechanisms, tokenization, optimization techniques, and soft fine-tuning.
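The attention mechanism named above reduces to one formula, softmax(Q K^T / sqrt(d_k)) V, applied in every transformer layer. The sketch below (an illustration assuming NumPy, not the course's actual code or framework) implements it directly:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V, weights                     # weighted mix of values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, with mixing weights determined by query-key similarity; multi-head attention repeats this in parallel over learned projections.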

Students will also develop expertise in model training, performance evaluation, and hyperparameter tuning. The fine-tuning project will introduce transfer learning techniques, allowing students to adapt powerful pre-trained models to specific applications while optimizing efficiency and performance.

By integrating industry-relevant tools with collaborative learning techniques, this course will prepare students to tackle real-world challenges in deep learning and natural language processing, setting them up for a successful career in modern data science and machine learning.

Prereqs: APPM 2360 and APPM 3310 or equivalent
Recommended prereqs: APPM 3570/STAT 3100 or equivalent

COURSE DESCRIPTION:

Students will learn mathematical and computational tools that reveal the principles behind biological systems from cells to ecosystems. Students will build and simulate models using dynamical and stochastic systems in order to study foraging, synchronization, pattern formation, and disease spread. Programming exercises in Python/MATLAB will connect theory to data and visualization. This course is ideal for advanced undergraduates and graduate students eager to bridge mathematics and the life sciences.
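A representative example of the kind of model the course builds (a sketch with illustrative parameter values, not course material): the classic SIR equations for disease spread, dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I, simulated here with a simple forward-Euler step.

```python
def simulate_sir(beta, gamma, s0, i0, r0, dt=0.1, steps=1000):
    """Forward-Euler integration of the SIR epidemic model:
        dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I,
    with S, I, R stored as population fractions (so S + I + R = 1)."""
    s, i, r = s0, i0, r0
    history = [(s, i, r)]
    for _ in range(steps):
        new_inf = beta * s * i * dt   # susceptibles newly infected this step
        new_rec = gamma * i * dt      # infecteds newly recovered this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        history.append((s, i, r))
    return history

# Illustrative parameters: basic reproduction number R0 = beta/gamma = 5.
traj = simulate_sir(beta=0.5, gamma=0.1, s0=0.99, i0=0.01, r0=0.0)
s_end, i_end, r_end = traj[-1]
```

Even this minimal model exhibits the qualitative behavior studied in class: an outbreak that grows, peaks, and burns out, with the final outbreak size controlled by beta/gamma.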