Robustness in Learning and Statistics: Past and Future

August 12-14, 2019
University of California, San Diego

Location: CSE building (EBU3B), room 1242

Overview:

Robust statistics and related topics offer ways to stress-test estimators against the assumptions they make. They offer insight into why some estimators behave well in the face of model misspecification while others do not. In this summer school, we will revisit classic topics in robust statistics from an algorithmic perspective. We will cover recent progress on provably robust and computationally efficient parameter estimation in high dimensions. We will compare this to other popular models, such as agnostic learning and outlier detection. With the foundations in hand, we will explore modern topics like federated learning, semi-random models, and connections to decision theory, where robustness is formulated in alternative ways. We hope to have time for discussion about open questions like adversarial examples in deep learning, and we invite the audience to help us muse about the right definitions to adopt in the first place.
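As a toy illustration of the kind of question robust statistics asks (a minimal sketch, not taken from the school's program): under Huber's contamination model, a small fraction of samples are adversarially corrupted, and the empirical mean can be dragged arbitrarily far from the truth while the median barely moves.

```python
import random
import statistics

random.seed(0)

# Huber contamination: a (1 - eps) fraction of samples come from the true
# distribution (Gaussian with mean 0), and an eps fraction are outliers.
eps = 0.1
n = 10_000
samples = [
    random.gauss(0.0, 1.0) if random.random() > eps else 1000.0
    for _ in range(n)
]

# The empirical mean is pulled toward the outliers (roughly eps * 1000)...
mean_est = statistics.fmean(samples)
# ...while the median, a classically robust estimator, stays near 0.
median_est = statistics.median(samples)

print(f"mean:   {mean_est:.2f}")
print(f"median: {median_est:.2f}")
```

Designing estimators that match the median's robustness in high dimensions, while remaining computationally efficient, is one of the central themes of the recent algorithmic work the school will cover.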

Target audience:

The school is oriented towards graduate students, postdocs and faculty, who either work in robust statistics or want to understand the questions at the forefront of current research.

Schedule:

Full schedule (including slides)

Registration:

Registration is free but required. The deadline was July 31, 2019. Registration is now closed.

Speakers:




David Donoho (Stanford)
Ankur Moitra (MIT)
Gregory Valiant (Stanford)

Accommodations:

Rates below are per night, do not include tax, and are not guaranteed. When reserving at any of the following hotels, please ask for the UCSD rate unless otherwise specified. UCSD rates are usually granted upon request but are not guaranteed; it depends on the hotel's availability, so the sooner you book, the better.

Supported by:


Questions? Please email Shachar Lovett.