Computer Science Thesis Proposal

Monday, December 6, 2021 - 3:00pm to 4:30pm ET

Location:

In Person and Virtual Presentation - Gates Hillman 8102 and Zoom

Speaker:

AINESH BAKSHI, Ph.D. Student http://aineshbakshi.com/

Analytic Techniques for Robust Algorithm Design

Modern machine learning relies on algorithms that fit expressive models to large datasets. While such tasks are easy in low dimensions, real-world datasets are inherently high-dimensional. Additionally, a prerequisite to deploying models in real-world systems is ensuring that their behavior degrades gracefully when the modeling assumptions no longer hold. Therefore, there is a growing need for efficient algorithms that fit reliable and robust models to data.

In this thesis proposal, we focus on designing such efficient, robust, and provable algorithms for fundamental tasks in machine learning and statistics. In particular, we investigate two complementary themes arising in this area: high-dimensional robust statistics and fast numerical linear algebra. The first addresses how to fit expressive models to high-dimensional noisy datasets, and the second develops fast algorithmic primitives to reduce dimensionality and de-noise large datasets. We resolve central open questions in robust statistics and randomized linear algebra, and introduce several new algorithmic ideas along the way. Finally, we make the case for analytic techniques, such as convex relaxations, being the natural choice for robust algorithm design.

Thesis Committee:
Pravesh K. Kothari (Co-Chair)
David P. Woodruff (Co-Chair)
Ryan O'Donnell
Boaz Barak (Harvard University)
Santosh S. Vempala (Georgia Institute of Technology)

Additional Information

In Person and Zoom Participation. See announcement.

Keywords:

Thesis Proposal