Photo credit: Cynthia Guo

Max Hopkins

EBU-3B, Office 4232
La Jolla, CA 92092

nmhopkin @ eng dot ucsd dot edu

About

I am a PhD student in the theory group at UCSD, where I am thankful to be advised by Daniel Kane and Shachar Lovett. I received my BA in Mathematics with a minor in Computer Science from Harvard University, where I was lucky enough to work with Michael Mitzenmacher and Madhu Sudan.

My research is generously supported by the NSF GRFP, ARCS, and JSOE fellowships.

I host the weekly UCSD Theory Lunch on Fridays. Past and upcoming talks can be found here.

I like singing, boardgames, squash, and sushi (in no particular order).

Interests

I am broadly interested in areas such as learning theory, computational geometry, combinatorics, approximation algorithms, and hardness of approximation. Currently, I mostly think about:

  1. High Dimensional Expanders
     High Dimensional Expansion (HDX) is an emerging area with applications across topics such as sampling, error correction, approximation algorithms, and more. I study the analysis of Boolean functions and random walks on HDX, especially in the context of approximation algorithms and unique games (the standard definition is sketched after this list). In Spring 2021, Shachar Lovett and I co-taught an introductory course on expanders and HDX. Detailed course notes can be found here, and corresponding lectures here.

  2. Learning beyond Uniform Convergence
     Many classical results in learning theory rely on the idea of uniform convergence: that empirical risk should converge to true risk across all hypotheses simultaneously (stated formally after this list). Unfortunately, this paradigm fails in more practical settings beyond basic worst-case analysis. I am interested in studying learnability in more realistic settings, like learning with arbitrary distributional assumptions, where these traditional techniques fail.

  3. Active Learning through Enriched Queries
     When does adaptivity help us learn? In standard models, it is well known that adaptivity often provides no asymptotic benefit. I study how asking better questions allows us to build adaptive algorithms that drastically speed up learning (a toy example is sketched after this list). If you're interested in this area, check out this blog post for a basic primer.
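
For orientation, the central object in item 1 is easy to define. The following is the standard notion of one-sided local spectral expansion from the HDX literature; the notation is the usual one, sketched here for context rather than taken from any single paper below:

    A pure $d$-dimensional simplicial complex $X$ is a $\gamma$-local spectral
    expander if for every face $s \in X$ with $\dim(s) \leq d-2$, the underlying
    graph of the link $X_s = \{\, t \setminus s : t \in X,\ t \supseteq s \,\}$ satisfies
    \[
        \lambda_2\big(A_{X_s}\big) \leq \gamma,
    \]
    where $A_{X_s}$ is the normalized adjacency matrix of that graph and
    $\lambda_2$ is its second largest eigenvalue.

Random walks on such complexes (e.g. the up-down walks between levels) inherit strong spectral control from this local condition, which is what drives the applications mentioned above.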
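
To make the uniform convergence paradigm in item 2 precise, here is the textbook statement in standard notation (included only for context):

    A class $H$ satisfies uniform convergence if for every $\varepsilon, \delta > 0$
    there is a sample size $m(\varepsilon, \delta)$ such that for every distribution $D$,
    \[
        \Pr_{S \sim D^m}\Big[\, \sup_{h \in H} \big|\mathrm{err}_S(h) - \mathrm{err}_D(h)\big| \leq \varepsilon \,\Big] \geq 1 - \delta,
    \]
    where $\mathrm{err}_S(h)$ is the empirical risk of $h$ on the sample $S$ and
    $\mathrm{err}_D(h)$ its true risk. Classical VC theory gives
    $m = O\big((\mathrm{VC}(H) + \log(1/\delta))/\varepsilon^2\big)$.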
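
As a toy example of how adaptivity helps in item 3, consider learning a one-dimensional threshold: a passive learner needs roughly 1/ε labels, while an active learner that picks its queries can binary search over the unlabeled pool with exponentially fewer. Below is a minimal Python sketch; it is my own illustration (with an assumed noiseless `label` oracle), not code from any of the papers below.

    import random

    def active_learn_threshold(points, label):
        """Learn a 1D threshold via binary search over a sorted unlabeled pool.

        `label(x)` is a query oracle returning +1 if x lies at or above the
        hidden threshold and -1 otherwise (assumed noiseless). Uses only
        O(log n) label queries rather than labeling all n points.
        """
        pts = sorted(points)
        if label(pts[0]) == 1:        # threshold below every point
            return pts[0]
        if label(pts[-1]) == -1:      # threshold above every point
            return pts[-1]
        lo, hi = 0, len(pts) - 1      # invariant: label is -1 at lo, +1 at hi
        while hi - lo > 1:
            mid = (lo + hi) // 2
            if label(pts[mid]) == 1:
                hi = mid
            else:
                lo = mid
        return (pts[lo] + pts[hi]) / 2  # hidden threshold sits in this gap

    # Example: hidden threshold 0.3, pool of 10,000 uniform points.
    hidden_t = 0.3
    pool = [random.random() for _ in range(10_000)]
    estimate = active_learn_threshold(pool, lambda x: 1 if x >= hidden_t else -1)
    print(f"estimated threshold: {estimate:.4f}")

The enriched-query results below push this "ask better questions" theme further, with comparison queries playing the role of the binary search.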

Papers*

Conference Papers

  1. High Dimensional Expanders: Eigenstripping, Pseudorandomness, and Unique Games
    Mitali Bafna, Max Hopkins, Tali Kaufman, Shachar Lovett
    SODA 2022

  2. Active Learning with Bounded Memory through Enriched Queries
    Max Hopkins, Daniel Kane, Shachar Lovett, Michal Moshkovitz
    COLT 2021

  3. The Power of Comparisons for Actively Learning Linear Separators
    Max Hopkins, Daniel Kane, Shachar Lovett
    NeurIPS 2020

  4. Point Location and Active Learning: Learning Halfspaces Almost Optimally
    Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
    FOCS 2020

  5. Noise Tolerant, Reliable Active Classification with Comparison Queries
    Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
    COLT 2020

  6. Doppelgangers: the Ur-Operation and Posets of Bounded Height
    Thomas Browning, Max Hopkins, Zander Kelley
    FPSAC 2018 (Extended Abstract)

  7. Simulated Annealing for JPEG Quantization
    Max Hopkins, Michael Mitzenmacher, Sebastian Wagner-Carena
    DCC 2018 (poster)
    Code

Journal Papers

  1. A Novel CMB Component Separation Method: Hierarchical Generalized Morphological Component Analysis
    Sebastian Wagner-Carena, Max Hopkins, Ana Rivero, Cora Dvorkin
    MNRAS, May 2020
    Code

Pre-prints

  1. Realizable Learning is All You Need
    Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
    In Submission

  2. Hypercontractivity on High Dimensional Expanders
    Mitali Bafna, Max Hopkins, Tali Kaufman, Shachar Lovett
    In Submission

Manuscripts

  1. On the ℓ2-Structure of Expanding Posets
    Max Hopkins, Tali Kaufman, Shachar Lovett, Ruizhe Zhang
In preparation. Companion paper to "HDX: Eigenstripping..."

  2. Active Learning Polynomial Threshold Functions
    Omri Ben-Eliezer, Max Hopkins, Chutong Yang, Hantao Yu
    In preparation.

Blog Posts

Miscellaneous Writing

* Author order is alphabetical (with one exception)

Talks

Upcoming Talks
  • High Dimensional Expanders and Unique Games (SODA 2022)
Invited Seminar Talks
  • Realizable Learning is All You Need (UWaterloo)

  • Point Location and Active Learning (Harvard CMSA)

  • High Dimensional Expanders and Unique Games (UW Seminar, SoS+TCS Seminar, Cornell Theory Tea, Chicago/TTIC (Reading Group))

Conference Talks
  • Active Learning in Bounded Memory (COLT '21)

  • Point Location and Active Learning Halfspaces (FOCS '20)

  • Noise Tolerant Active Learning with Comparisons (COLT '20)

UCSD Theory Seminar/Lunch
  • On the Equivalence of Realizable and Agnostic Learning

  • Active Learning with Bounded Memory

  • Point Location and Active Learning

  • An Open Problem in Noisy Sorting

  • Small Set Expansion in the Johnson Graph through High Dimensional Expansion

  • High Dimensional Expanders: The Basics

  • Reasoning in the Presence of Noise

  • The Power of Comparisons for Active Learning

Undergraduate Talks
  • On the Cohomology of Dihedral Groups

  • Understanding Doppelgangers and the Ur-Operation

Teaching

At UCSD
  1. Co-taught CSE 291, Expander Graphs and High Dimensional Expanders, Spring 2021.

  2. TA for CSE 200, Computability and Complexity, Winter 2020.

At Harvard
  1. TA for APMTH 106, Applied Algebra, Fall 2017.