Computer Science Thesis Oral

— 12:00pm

In Person and Virtual (ET): Traffic21 Classroom, Gates Hillman 6501, and Zoom

CHEN DAN, Ph.D. Candidate
Computer Science Department
Carnegie Mellon University

Statistical Learning under Adversarial Distribution Shift

One of the most fundamental assumptions in statistical machine learning is that training and testing data are sampled independently from the same distribution. However, modern real-world applications require learning algorithms to perform robustly even when this assumption no longer holds. Specifically, the training and testing distributions may shift slightly, yet adversarially, within a small neighborhood of each other. This formulation encompasses many challenges in machine learning, including adversarial examples, outlier-contaminated data, group fairness, and label imbalance. In this thesis, we seek to understand statistical optimality and to provide better algorithms under such adversarial distribution shift. Our contributions include (1) the first near-optimal minimax lower bound on the sample complexity of adversarially robust classification in a Gaussian setting; (2) the framework of distributionally and outlier robust optimization, which allows distributionally robust optimization to be applied in large-scale experiments with deep neural networks and outperforms existing methods on sub-population shift tasks; and (3) margin-sensitive group risk, a principled way of improving distributionally robust generalization via group-asymmetric margin maximization.

Thesis Committee:
Pradeep Ravikumar (Chair)
Zico Kolter
Zachary Lipton
Avrim Blum (Toyota Technological Institute at Chicago)
Yuting Wei (University of Pennsylvania)

In Person and Zoom Participation. See announcement.
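To give a flavor of the sub-population shift setting mentioned above: an average risk can look small while one sub-population still suffers large loss, which is why distributionally robust methods optimize the worst group rather than the average. The sketch below is illustrative only; the function names and toy loss values are hypothetical and are not the algorithms developed in the thesis.

```python
# Illustrative sketch of average vs. worst-group risk under sub-population shift.
# Each inner list holds the per-example losses of one sub-population (group).

def average_risk(losses_by_group):
    """Empirical risk pooled over all examples, ignoring group structure."""
    all_losses = [loss for group in losses_by_group for loss in group]
    return sum(all_losses) / len(all_losses)

def worst_group_risk(losses_by_group):
    """Worst-case risk over groups: the quantity group-robust methods target."""
    return max(sum(group) / len(group) for group in losses_by_group)

# Toy data: a well-fit majority group and a poorly-fit minority group.
majority = [0.1, 0.2, 0.1, 0.2, 0.1, 0.1]
minority = [0.9, 1.1]

print(average_risk([majority, minority]))      # low: dominated by the majority
print(worst_group_risk([majority, minority]))  # high: exposes the minority group
```

If the test distribution re-weights toward the minority group, the worst-group risk, not the average, predicts test performance; this is the gap the distributionally robust formulation is designed to close.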
