Princeton-IAS
mh4067 @ princeton dot edu
Hi! I'm a postdoctoral associate in the Princeton theory group ('24-'25) and the CSDM program at the Institute for Advanced Study ('25-'27). Prior to this, I was fortunate to receive my PhD under Daniel Kane and Shachar Lovett at UCSD, a year of which I was equally fortunate to spend at the Weizmann Institute with Irit Dinur, along with a summer as a Research Fellow at the Simons Institute for the Theory of Computing (Summer '23). Even before that, I received a BA in mathematics from Harvard University, where I was lucky to work under Michael Mitzenmacher and Madhu Sudan. My research has been generously supported by NSF GRFP, ARCS, and JSOE fellowships.
I like singing, boardgames, squash, and sushi (in no particular order).
I am broadly interested in areas such as computational complexity, algorithms, Boolean function analysis, statistical and computational learning theory, algorithmic stability, and computational geometry. Currently, I mostly think about:
High Dimensional Expansion (HDX) is an emerging area in computer science and mathematics with a wide range of applications across complexity, approximate sampling, (quantum) error correction, and more. I study the analysis of Boolean functions and random walks on HDX, especially in the context of error correction and computational complexity. If you are here looking for introductory material on HDX, see:
I study a broad set of topics within learning theory, including algorithmic stability (e.g. notions like replicability and differential privacy), learning from non-traditional queries (e.g. comparison data), active learning, and learning beyond traditional uniform convergence guarantees. I am most interested in understanding how such notions relate to the underlying combinatorial and geometric structure of data.
From Generative to Episodic: Sample-Efficient Replicable RL
Max Hopkins, Sihan Liu, Christopher Ye, Yuichi Yoshida
Coming soon!
Hypercontractivity on HDX II: Symmetrization and q-Norms
Max Hopkins
STOC 2025 (Lecture Notes)
The Role of Randomness in Stability
Max Hopkins, Shay Moran
ICML 2025, Spotlight
Do PAC-Learners Learn the Marginal Distribution?
Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
ALT 2025
Replicability in High Dimensional Statistics
Max Hopkins, Russell Impagliazzo, Daniel Kane, Sihan Liu, Christopher Ye
FOCS 2024
Chernoff Bounds and Reverse Hypercontractivity on HDX
Yotam Dikstein, Max Hopkins
FOCS 2024 (Talk)
Stability is Stable: Connections between Replicability, Privacy, and Adaptive Generalization
Mark Bun, Marco Gaboardi, Max Hopkins, Russell Impagliazzo, Rex Lei, Toniann Pitassi, Satchit Sivakumar, Jess Sorrell
STOC 2023, TPDP 2023 Oral
Invited to JPC Special Issue (declined)
Robust Empirical Risk Minimization with Tolerance
Robi Bhattacharjee, Max Hopkins, Akash Kumar, Hantao Yu, Kamalika Chaudhuri
ALT 2023
Sampling Equilibria: Fast No-Regret Learning in Structured Games
Daniel Beaglehole, Max Hopkins, Daniel Kane, Sihan Liu, Shachar Lovett
SODA 2023
Active Learning Polynomial Threshold Functions
Omri Ben-Eliezer, Max Hopkins, Chutong Yang, Hantao Yu
NeurIPS 2022
Explicit Lower Bounds Against Ω(n)-Rounds of Sum-of-Squares
Max Hopkins, Ting-Chun Lin
FOCS 2022 (Talk)
Eigenstripping, Spectral Decay, and Edge-Expansion on Posets
Jason Gaitonde, Max Hopkins, Tali Kaufman, Shachar Lovett, Ruizhe Zhang
RANDOM 2022 (Talk)
Invited to TOC Special Issue
Realizable Learning is All You Need
Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
COLT 2022 (Long Talk, Short Talk), TheoretiCS 2024
Invited Paper to TheoretiCS
Hypercontractivity on High Dimensional Expanders: A Local-to-Global Method for Higher Moments
Mitali Bafna, Max Hopkins, Tali Kaufman, Shachar Lovett
STOC 2022 (Talk)
High Dimensional Expanders: Eigenstripping, Pseudorandomness, and Unique Games
Mitali Bafna, Max Hopkins, Tali Kaufman, Shachar Lovett
SODA 2022
Active Learning with Bounded Memory through Enriched Queries
Max Hopkins, Daniel Kane, Shachar Lovett, Michal Moshkovitz
COLT 2021 (Talk)
The Power of Comparisons for Actively Learning Linear Separators
Max Hopkins, Daniel Kane, Shachar Lovett
NeurIPS 2020
Point Location and Active Learning: Learning Halfspaces Almost Optimally
Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
FOCS 2020 (Long Talk, Short Talk)
Noise Tolerant, Reliable Active Classification with Comparison Queries
Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
COLT 2020 (Talk)
A Novel CMB Component Separation Method: Hierarchical Generalized Morphological Component Analysis
Sebastian Wagner-Carena, Max Hopkins, Ana Rivero, Cora Dvorkin
MNRAS, May 2020
Code
Doppelgangers: the Ur-Operation and Posets of Bounded Height
Thomas Browning, Max Hopkins, Zander Kelley
FPSAC 2018 (Extended Abstract)
Simulated Annealing for JPEG Quantization
Max Hopkins, Michael Mitzenmacher, Sebastian Wagner-Carena
DCC 2018 (Poster)
Code
How to Actively Learn in Bounded Memory
A basic exposition of new work towards characterizing bounded memory active learning with a few concrete examples.
The Power of Comparison in Active Learning
An introduction to active learning with comparison queries covering seminal work of KLMZ and recent extensions.
High Dimensional Expanders in Analysis and Computation
PhD Thesis
Representation-Theoretic Techniques for Independence Bounds of Cayley Graphs
Undergraduate Thesis
Notes on Expanders and High Dimensional Expansion
Lecture notes from my course with Shachar Lovett on Expanders and HDX.
A Few Tips for Getting into CS Theory Research as an Undergraduate
A few slides with tips for getting into theory research, plus links to summer program opportunities.
Bordism Homology
A survey of unoriented bordism homology.
STOC HDX Workshop: June 23
The Geometry of Stable Mean Estimation (MIT Combinatorics Seminar)
Chernoff Bounds on HDX and their Applications (UW Theory Seminar, Cornell Junior Theorists Day, Princeton Theory Lunch, JHU Theory Seminar, IAS CSDM Seminar, ICTS HDX & Codes, CanaDAM)
Explicit SoS Lower Bounds from High Dimensional Expanders (Simons Institute, UCSD Theory Seminar, UToronto Theory Seminar, CMU HDX Reading Group, IAS CSDM Seminar)
Boolean Analysis on HDX (Simons Institute Bootcamp Tutorial)
On High Dimensional Expanders and Hardness of Approximation (IISC Bangalore)
Hypercontractivity on High Dimensional Expanders (STOC '22, TTIC, UMich Theory Seminar, MIT A&C Seminar, Weizmann Guest Lecture, Ben-Gurion Theory Seminar)
Realizable Learning is All You Need (COLT '22, UWaterloo Student Seminar, GaTech Student Seminar, Harvard Student Seminar, Stanford Theory Lunch, Weizmann Theory Lunch, TTIC/Northwestern Junior Theorists Workshop)
High Dimensional Expanders and Unique Games (SODA '22, UW Theory Seminar, SoS+TCS Seminar, Cornell Theory Tea, Chicago/TTIC (Reading Group))
Active Learning in Bounded Memory (COLT '21)
Point Location and Active Learning (FOCS '20, Harvard CMSA)
Noise Tolerant Active Learning with Comparisons (COLT '20)
The Power of Comparisons for Active Learning (UCSD Theory Seminar)
Chernoff Bounds on HDX
Explicit SoS Lower Bounds from HDX
On the Equivalence of Realizable and Agnostic Learning
Active Learning with Bounded Memory
Point Location and Active Learning
An Open Problem in Noisy Sorting
Small Set Expansion in the Johnson Graph through High Dimensional Expansion
High Dimensional Expanders: The Basics
Reasoning in the Presence of Noise
On the Cohomology of Dihedral Groups
Understanding Doppelgangers and the Ur-Operation
Co-taught CSE 291, Expander Graphs and High-Dimensional Expanders, Spring 2021.
TA for CSE 200, Computability and Complexity, Winter 2020.
TA for APMTH 106, Applied Algebra, Fall 2017.