Photo credit: Cynthia Guo

Max Hopkins

EBU-3B, Office 4232
La Jolla, CA 92092

nmhopkin @ eng dot ucsd dot edu

About

I am a third-year PhD student in the theory group at UCSD, where I am advised by Daniel Kane and Shachar Lovett and supported by NSF and Jacobs Fellowships. I received my BA in Mathematics with a minor in Computer Science from Harvard University, where I was lucky enough to work under Michael Mitzenmacher and Madhu Sudan.

I host the weekly UCSD Theory Lunch on Fridays. Past and upcoming talks can be found here.

I like singing, board games, squash, and sushi (in no particular order).

Interests

I am broadly interested in areas such as learning theory, computational geometry, combinatorics, approximation algorithms, and hardness of approximation. Currently, I mostly think about:

  1. High Dimensional Expanders and Approximation Algorithms
     High dimensional expansion is an emerging field with applications to sampling, error correction, approximation algorithms, and more. I study the spectral and combinatorial structure of random walks on high dimensional expanders, and I am especially interested in their applications to approximation algorithms and unique games (a toy numerical sketch of spectral expansion in the graph case follows this list). In Spring 2021 I'll be co-teaching a course on expanders and high-dimensional expansion with Shachar Lovett. If you would like to attend any of the lectures, please contact one of us! More info here.

  2. Active Learning through Enriched Queries
     When does adaptivity help us learn? In standard models, it is well known that adaptivity often provides no asymptotic benefit. I study how asking better questions allows us to build adaptive algorithms that break these standard lower bounds; a toy sketch of the comparison-query idea also appears after this list. If you're interested and/or new to this area, check out this recent blog post for a basic primer, which covers how the ability to compare data can drastically speed up and robustify learning!
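
As a quick illustration for readers new to expansion, here is a toy sketch of the one-dimensional (i.e. graph) case, using numpy and not taken from any of the papers below: the second eigenvalue of a graph's random walk matrix measures how well it expands, and the complete graph has a far larger spectral gap than the cycle.

```python
import numpy as np

def walk_matrix(A):
    """Row-normalize an adjacency matrix into the simple random walk matrix."""
    return A / A.sum(axis=1, keepdims=True)

def second_eigenvalue(A):
    """Second-largest eigenvalue of the random walk on the graph with adjacency matrix A."""
    eigs = np.sort(np.linalg.eigvals(walk_matrix(A)).real)
    return eigs[-2]

n = 20

# Complete graph K_n: an excellent expander; second eigenvalue is -1/(n-1).
K = np.ones((n, n)) - np.eye(n)

# Cycle C_n: a poor expander; second eigenvalue is cos(2*pi/n), barely below 1.
C = np.zeros((n, n))
for i in range(n):
    C[i, (i + 1) % n] = C[i, (i - 1) % n] = 1.0

print("second eigenvalue of K_n:", round(second_eigenvalue(K), 4))  # ~ -0.0526
print("second eigenvalue of C_n:", round(second_eigenvalue(C), 4))  # ~  0.9511
```

The smaller the second eigenvalue, the faster the walk mixes; high dimensional expansion generalizes this picture from graphs to walks on higher-dimensional complexes.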
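
And here is the toy comparison-query sketch referenced in item 2: if an oracle can compare two points' margins under a hidden linear separator, sorting the pool by comparisons and then binary searching over labels recovers every label with exponentially fewer label queries than labeling the whole pool. The `compare` and `label` oracles below are hypothetical stand-ins for illustration only, not an interface from the papers.

```python
import functools
import numpy as np

rng = np.random.default_rng(0)

# Hidden linear separator: label(x) = sign(<w, x> - b). Unknown to the learner.
w, b = np.array([0.6, 0.8]), 0.25

def label(x):
    """Label oracle: the expensive annotation we want to use sparingly."""
    return 1 if x @ w - b >= 0 else -1

def compare(x, y):
    """Comparison oracle (hypothetical): which point lies further toward the positive side?"""
    return 1 if x @ w >= y @ w else -1

# An unlabeled pool of n points in the plane.
n = 1000
pool = rng.uniform(-1, 1, size=(n, 2))

# Step 1: sort the pool by margin using only comparisons (no labels spent).
order = sorted(range(n), key=functools.cmp_to_key(lambda i, j: compare(pool[i], pool[j])))

# Step 2: binary search for the sign change using O(log n) label queries.
lo, hi, labels_used = 0, n - 1, 0
while lo < hi:
    mid = (lo + hi) // 2
    labels_used += 1
    if label(pool[order[mid]]) == -1:
        lo = mid + 1
    else:
        hi = mid

# Every remaining label is inferred from the sorted order: negatives before lo, positives after.
inferred = [-1 if rank < lo else 1 for rank in range(n)]
assert all(inferred[rank] == label(pool[order[rank]]) for rank in range(n))  # verification only
print("labels queried:", labels_used, "out of", n)  # roughly log2(n) ~ 10 instead of 1000
```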

Papers

Conference Papers

  1. The Power of Comparisons for Actively Learning Linear Separators
    Max Hopkins, Daniel Kane, Shachar Lovett
    NeurIPS 2020

  2. Point Location and Active Learning: Learning Halfspaces Almost Optimally
    Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
    FOCS 2020

  3. Noise Tolerant, Reliable Active Classification with Comparison Queries
    Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
    COLT 2020

  4. Doppelgangers: the Ur-Operation and Posets of Bounded Height
    Thomas Browning, Max Hopkins, Zander Kelley
    Extended Abstract appeared in Proceedings of FPSAC 2018

  5. Simulated Annealing for JPEG Quantization
    Max Hopkins, Michael Mitzenmacher, Sebastian Wagner-Carena
    DCC 2018 (poster)
    Code

Journal Papers

  1. A Novel CMB Component Separation Method: Hierarchical Generalized Morphological Component Analysis
    Sebastian Wagner-Carena, Max Hopkins, Ana Rivero, Cora Dvorkin
    MNRAS, May 2020

Pre-prints

  1. Active Learning with Bounded Memory through Enriched Queries
    Max Hopkins, Daniel Kane, Shachar Lovett, Michal Moshkovitz
    Submitted

  2. High Dimensional Expanders: Random Walks, Pseudorandomness, and Unique Games
    Max Hopkins, Tali Kaufman, Shachar Lovett
    New version with Mitali Bafna in preparation

Manuscripts

  1. Expanding Posets: Random Walks, Pseudorandomness, and Unique Games
    Max Hopkins, Tali Kaufman, Shachar Lovett
    In preparation. Companion paper to "HDX: Random Walks..."

Miscellaneous Writing

Talks

Upcoming Talks
Invited Seminar Talks
  • Point Location and Active Learning (Harvard CMSA)

  • High Dimensional Expanders and Unique Games (UW Seminar, SoS+TCS Seminar, Cornell Theory Tea, Chicago/TTIC (Reading Group))

Conference Talks
  • Point Location and Active Learning Halfspaces (FOCS '20)

  • Noise Tolerant Active Learning with Comparisons (COLT '20)

UCSD Theory Seminar or Lunch
  • Active Learning with Bounded Memory

  • Point Location and Active Learning

  • An Open Problem in Noisy Sorting

  • Small Set Expansion in the Johnson Graph through High Dimensional Expansion

  • High Dimensional Expanders: The Basics

  • Reasoning in the Presence of Noise

  • The Power of Comparisons for Active Learning

Undergraduate Talks
  • On the Cohomology of Dihedral Groups

  • Understanding Doppelgangers and the Ur-Operation

Teaching

At UCSD
  1. Co-teaching CSE 291, Expander graphs and high-dimensional expanders, Spring 2021.

  2. TA for CSE 200, Computability and Complexity, Winter 2020.

At Harvard
  1. TA for APMTH 106, Applied Algebra, Fall 2017.