UCSD CSE 291: Differentiable Programming (Winter 2024)

Course Description

In this course, we will study an emerging field called differentiable programming: an interdisciplinary area that combines machine learning, programming languages, numerical computing, signal processing, robotics, computer vision, and computer graphics. Recent work on deep learning has revealed the power of derivative-based optimization. Instead of computing gradients only of typical feedforward deep models, differentiable programming attempts to differentiate general programs and use them as machine learning models. Doing so raises several challenges, which we will tackle throughout the course. The first half of the course focuses on building a C-like, SIMD-based differentiable programming language. You will step-by-step modify the compiler of a domain-specific language, loma (written in Python), and add support for automatic differentiation, while making sure it handles (limited) side effects, control flow, and parallelism (we will generate GPU code!). The second half of the course and the final project focus on applying the language to different kinds of tasks, including differentiable simulation, image and signal processing, 3D rendering, etc. Note that this course is different from, though related to, the seminar course CSE 290 on differentiable programming.

Required Knowledge

Python and machine learning. Ideally, you should also have basic compiler and parallel programming knowledge, but we do not assume it.


Instructor: Tzu-Mao Li
TA: Trevor Hedstrom (tjhedstr-at-ucsd.edu)
Lectures: Monday/Wednesday/Friday 10:00am-10:50am at WLH 2112 (From 4/15, we will move to MANDE B-104)
Instructor office hour: Monday 11am-noon at CSE 4116.
TA office hour: Friday 2pm-3pm at CSE B275.
We will do most of the online discussions on Piazza. All the announcements will also be made through Piazza, so make sure you sign up.


There will be 3 programming homeworks (20% each) and 1 final project (40%).
Late penalty: score * clamp(1 - (seconds passed after midnight of the deadline day) / 86400, 0, 1) (no late submission for the final project)
We will use the time on Canvas to determine how many seconds have passed.
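The late-penalty formula above can be read as a linear decay to zero over the 24 hours (86,400 seconds) after the deadline. A minimal sketch in Python (an illustrative helper, not an official course script):

```python
def late_penalized_score(score, seconds_late):
    """Apply the linear late penalty: full credit at 0 seconds late,
    zero credit at >= 86400 seconds (24 hours) late."""
    factor = 1.0 - seconds_late / 86400.0
    factor = max(0.0, min(1.0, factor))  # clamp(..., 0, 1)
    return score * factor

# Example: a 90-point homework submitted 6 hours (21600 s) late
# keeps a factor of 1 - 0.25 = 0.75 of its score: 67.5 points.
print(late_penalized_score(90.0, 21600))
```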

Homeworks and Projects

The homeworks involve quite a bit of programming and can be tough for the inexperienced. Start early and ask questions! Many of them will be based on the loma compiler.
Homework 0 (not graded): Introduction to the loma Programming Language
Homework 1 (20%): Forward mode automatic differentiation (due 4/15)
Homework 2 (20%): Reverse mode automatic differentiation (due 4/29)
Homework 3 (20%): Handling control flow, function calls, and parallelism (due 5/13)
Final Project (40%): proposal due 5/20, checkpoint due 5/29, final due 6/13.
Collaboration policy: you must complete the homeworks yourself (you are free to discuss them with your peers). For the final project, you may work in a team of at most 2 people.

Readings (optional)

The book Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation is the bible of this field.
The Art of Differentiating Computer Programs: An Introduction to Algorithmic Differentiation is also a great book focusing on how to differentiate C code.
The Elements of Differentiable Programming is a recent comprehensive introduction to the topic.

Schedule (very tentative)

4/1/2024 (Mon): Why differentiable programming? [slides]

4/3/2024 (Wed): Introduction to the loma language and compiler. [live demo]

4/5/2024 (Fri): Definition of derivatives. Total derivatives. Forward mode. Dual numbers. [slides]
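The dual-number view of forward mode from this lecture can be sketched in a few lines of Python (an illustrative sketch, not loma code): each value carries its tangent alongside it, and every arithmetic operation propagates both.

```python
# Minimal dual-number forward-mode AD: a Dual holds a value and the
# derivative of that value with respect to a chosen input.
class Dual:
    def __init__(self, val, dval):
        self.val = val    # primal value
        self.dval = dval  # tangent (derivative)

    def __add__(self, other):
        return Dual(self.val + other.val, self.dval + other.dval)

    def __mul__(self, other):
        # product rule: d(uv) = u dv + v du
        return Dual(self.val * other.val,
                    self.val * other.dval + self.dval * other.val)

def f(x):
    return x * x + x  # f(x) = x^2 + x, so f'(x) = 2x + 1

x = Dual(3.0, 1.0)    # seed dx/dx = 1
y = f(x)
print(y.val, y.dval)  # f(3) = 12.0, f'(3) = 7.0
```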

4/8/2024 (Mon): Implementing forward mode automatic differentiation in loma. [live demo]

4/10/2024 (Wed): Reverse mode automatic differentiation. Computational graph. Transposition of derivative operators. Cheap gradient principle. Backpropagation as a special case of reverse mode. [slides]
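A minimal tape-style reverse-mode sketch of the ideas in this lecture (illustrative only, not loma code): the forward pass records each operation and its inputs; the backward pass walks the computational graph in reverse, accumulating adjoints via the chain rule.

```python
# Tiny reverse-mode AD: each Var records how to push its adjoint
# (out.grad) back to its parents.
class Var:
    def __init__(self, val):
        self.val = val
        self.grad = 0.0
        self._backward = lambda: None
        self._parents = ()

    def __add__(self, other):
        out = Var(self.val + other.val)
        def backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward, out._parents = backward, (self, other)
        return out

    def __mul__(self, other):
        out = Var(self.val * other.val)
        def backward():
            self.grad += other.val * out.grad
            other.grad += self.val * out.grad
        out._backward, out._parents = backward, (self, other)
        return out

def backprop(out):
    # topological order via DFS, then accumulate adjoints in reverse
    order, seen = [], set()
    def visit(v):
        if id(v) in seen:
            return
        seen.add(id(v))
        for p in v._parents:
            visit(p)
        order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        v._backward()

x = Var(3.0)
y = x * x + x   # y = x^2 + x
backprop(y)
print(x.grad)   # dy/dx = 2x + 1 = 7.0
```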

4/12/2024 (Fri): Reverse mode (continued). Implementing reverse mode automatic differentiation in loma. [live demo]

4/15/2024 (Mon): Implementing reverse mode automatic differentiation in loma. (HW1 due) [live demo]

4/17/2024 (Wed): Discuss my HW1 implementation. Tracing vs source-to-source transformation. [slides]

4/19/2024 (Fri): Handling loops and function calls. [slides]

4/22/2024 (Mon): Implementing loops and function calls differentiation in loma.

4/24/2024 (Wed): Checkpointing.

4/26/2024 (Fri): Parallel automatic differentiation.

4/29/2024 (Mon): Higher-order derivatives. The Jacobian accumulation problem. Beyond forward/reverse modes. Randomized automatic differentiation. (HW2 due)

5/1/2024 (Wed): Discuss my HW2 implementation. Sparse Jacobian and Hessian.

5/3/2024 (Fri): Deep learning systems (PyTorch, TensorFlow, JAX).

5/6/2024 (Mon): Functional automatic differentiation.

5/8/2024 (Wed): More AD systems: CppAD, Adept, Enzyme, Halide, SLANG.D.

5/10/2024 (Fri): Reversible checkpointing.

5/13/2024 (Mon): Implicit differentiation and differentiable optimization. (HW3 due)
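The core idea of this lecture, differentiating through a solver via the implicit function theorem, fits in a short numeric sketch (illustrative, not course-provided code): if x*(theta) solves g(x, theta) = 0, then dx/dtheta = -(dg/dx)^(-1) (dg/dtheta), with no need to unroll the solver.

```python
import math

def solve(theta):
    # Solve g(x, theta) = x^2 - theta = 0 for x > 0.
    # (Any black-box solver would do; we use the closed form.)
    return math.sqrt(theta)

def implicit_dx_dtheta(theta):
    x = solve(theta)
    dg_dx = 2.0 * x   # partial of g = x^2 - theta w.r.t. x
    dg_dtheta = -1.0  # partial of g w.r.t. theta
    return -dg_dtheta / dg_dx

# Sanity check against the explicit derivative d(sqrt(theta))/dtheta
# = 1 / (2 sqrt(theta)): at theta = 4, both give 0.25.
print(implicit_dx_dtheta(4.0))
```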

5/15/2024 (Wed): Discuss my HW3 implementation. Differentiating sorting.

5/17/2024 (Fri): Differentiating stochastic programs (e.g., score vs pathwise estimators).
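The two estimators named in this lecture can be contrasted in a small Monte Carlo sketch (illustrative only; the target d/dmu E_{x~N(mu,1)}[x^2] = 2*mu is chosen so the true answer is known):

```python
import random

random.seed(0)
mu, n = 1.5, 200000
samples = [random.gauss(mu, 1.0) for _ in range(n)]

# Score-function (REINFORCE) estimator: E[f(x) * d log p(x)/dmu],
# where d log p / dmu = (x - mu) for a unit-variance Gaussian.
score = sum((x * x) * (x - mu) for x in samples) / n

# Pathwise (reparameterization) estimator: write x = mu + eps with
# eps ~ N(0, 1); then d f(x)/dmu = 2 * x.
pathwise = sum(2.0 * x for x in samples) / n

# Both approach the true gradient 2 * mu = 3.0; the pathwise
# estimator typically has much lower variance.
print(score, pathwise)
```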

5/20/2024 (Mon): Differentiating integrals with parametric discontinuities. (final project proposal due)

5/22/2024 (Wed): Smoothing and relaxation.

5/24/2024 (Fri): Optimal control and differentiable physics. Neural ODE. Differentiable DSP.

5/27/2024 (Mon): Memorial Day. No lecture.

5/29/2024 (Wed): Probabilistic programming languages and Hamiltonian Monte Carlo. (final project checkpoint due)

5/31/2024 (Fri): Equivariant neural networks.

6/3/2024 (Mon): Numerical accuracy.

6/5/2024 (Wed): Pitfalls of automatic differentiation.

6/7/2024 (Fri): Discussion: how do we make AD available everywhere?

6/13/2024 (Thu): Final project due.