Computer Science Thesis Proposal

Time:
— 5:30pm

Location:
In Person - Gates Hillman 7501

Speaker:
LUCIO DERY, Ph.D. Student, Computer Science Department, Carnegie Mellon University
https://ldery.github.io/


On Resource Efficient Transfer Learning via End Task Aware Training

Transfer learning is a machine learning paradigm in which performance on an end task is improved by exploiting knowledge from other tasks. In most settings of practical concern, machine learning practitioners know in advance which end task they wish to boost with transfer learning. However, widely used methods for leveraging transfer tasks, such as pre-training and its continued pre-training variant, are end task agnostic: they rarely, if ever, exploit knowledge of the target task.

This proposal takes a step towards resource efficient transfer learning by exploiting advance knowledge of the end task. To achieve data efficiency, we propose principled end task aware algorithms for constructing and optimizing over transfer tasks. Next, to produce memory and compute efficient models that still achieve high accuracy on the end task, we introduce new algorithms for end task aware structured pruning of large pre-trained models.
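To make the contrast with end task agnostic pre-training concrete, the following is a minimal sketch of end task aware training, assuming PyTorch. All names here (SharedEncoder, aux_weight, the toy reconstruction objective) are illustrative assumptions, not the algorithms proposed in the thesis: the key idea shown is only that the end task loss and an auxiliary transfer loss are optimized jointly, rather than in separate pre-train-then-fine-tune stages.

import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Toy encoder shared by the end task and an auxiliary (transfer) task."""
    def __init__(self, dim=32):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())

    def forward(self, x):
        return self.body(x)

encoder = SharedEncoder()
end_head = nn.Linear(32, 2)    # end task classifier head
aux_head = nn.Linear(32, 32)   # auxiliary objective head (e.g., denoising)
params = (list(encoder.parameters()) + list(end_head.parameters())
          + list(aux_head.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
aux_weight = 0.5  # hypothetical fixed weight; the thesis proposes principled schemes

for step in range(100):
    x = torch.randn(16, 32)            # stand-in batch of features
    y = torch.randint(0, 2, (16,))     # stand-in end task labels
    z = encoder(x)
    # End task aware training: both losses are computed and combined at
    # every step, so the auxiliary task is shaped by the end task signal
    # instead of being optimized in an end task agnostic pre-training phase.
    end_loss = nn.functional.cross_entropy(end_head(z), y)
    aux_loss = nn.functional.mse_loss(aux_head(z), x)  # toy transfer objective
    loss = end_loss + aux_weight * aux_loss
    opt.zero_grad()
    loss.backward()
    opt.step()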

Thesis Committee:

Graham Neubig (Co-Chair)
Ameet Talwalkar (Co-Chair)
Zico Kolter
Luke Zettlemoyer (University of Washington / Meta)
Marc’Aurelio Ranzato (Google DeepMind)
