Computer Science Thesis Oral

— 4:00pm

In Person - Reddy Conference Room, Gates Hillman 4405

MICHAEL KUCHNIK, Ph.D. Candidate, Computer Science Department, Carnegie Mellon University

Beyond Model Efficiency: Data Optimizations for Machine Learning Systems

The field of machine learning, particularly deep learning, has witnessed tremendous recent advances due to improvements in algorithms, compute, and datasets. Systems built to support deep learning have primarily targeted computations used to produce the learned model. This thesis proposes to instead focus on the role of data in both training and validation.

In the first part of the thesis, we focus on training data, demonstrating that the data pipeline responsible for delivering training data is a prime target for performance optimization. To address these performance issues, we introduce a form of data subsampling in the space of data transformations, a reduced-fidelity I/O format, and a system for automatically tuning data pipeline performance knobs.

In the second part of the thesis, motivated by the trend toward increasingly large and expressive models, we turn to the validation setting, developing a system for automatically querying and validating a large language model’s behavior with off-the-shelf regular expressions. We conclude with future work in the space of data systems for machine learning.

Thesis Committee:

George Amvrosiadis (Co-chair)
Virginia Smith (Co-chair)
Greg Ganger
Tianqi Chen
Paul Barham (Google)
