Computer Science 5th Year Masters Thesis Presentation
Time: 3:00pm
Location:
Virtual Presentation - Remote Access Enabled (Zoom)
Speaker:
HANG LIAO, Masters Student
https://www.linkedin.com/in/hang-liao
Automatic Differentiation of Sketched Regression
In this work, we explore the possibility of applying sketching to the linear least squares (LLS) regression problem in differentiable programming settings. To motivate automatic differentiation (AD) for systems with a sketched regression component, we need to answer the following questions: Do we obtain similar derivatives (AD transformations) in differentiable programming systems with LLS and with sketched LLS? In practice, does a system containing sketched LLS converge faster in training than the same system with LLS? How close are the results after convergence? To answer these questions, we first provide a bound on the operator norm of a sketched inverse matrix product, which is useful when analyzing the derivatives of sketched regression. We then analyze the approximation errors of the derivatives for two proposed ways of sketching the regression. Finally, we run experiments on both synthetic and real-world datasets to test the performance of our sketching methods.
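The following is a minimal illustrative sketch, not the thesis implementation: it assumes a Gaussian sketching matrix S with m rows, solves both the exact and sketched LLS problems via the normal equations, and uses JAX to compare the gradients obtained by differentiating through each solve. All function names (lls, sketched_lls, loss_exact, loss_sketched) are hypothetical.

import jax
import jax.numpy as jnp

def lls(A, b):
    # Exact LLS solution via the normal equations (assumes A has full column rank).
    return jnp.linalg.solve(A.T @ A, A.T @ b)

def sketched_lls(A, b, key, m):
    # Gaussian sketch S of shape (m, n); solve the smaller problem min_x ||S A x - S b||_2.
    n = A.shape[0]
    S = jax.random.normal(key, (m, n)) / jnp.sqrt(m)
    SA, Sb = S @ A, S @ b
    return jnp.linalg.solve(SA.T @ SA, SA.T @ Sb)

def loss_exact(A, b, target):
    # Scalar loss with an LLS component, so AD can differentiate through the solve.
    x = lls(A, b)
    return jnp.sum((x - target) ** 2)

def loss_sketched(A, b, target, key, m):
    # Same loss, but with the sketched LLS component.
    x = sketched_lls(A, b, key, m)
    return jnp.sum((x - target) ** 2)

key = jax.random.PRNGKey(0)
kA, kb, kS = jax.random.split(key, 3)
A = jax.random.normal(kA, (500, 20))
b = jax.random.normal(kb, (500,))
target = jnp.zeros(20)

# Gradients with respect to A for the exact and sketched pipelines,
# and their relative difference.
g_exact = jax.grad(loss_exact)(A, b, target)
g_sketched = jax.grad(loss_sketched)(A, b, target, kS, 100)
print(jnp.linalg.norm(g_exact - g_sketched) / jnp.linalg.norm(g_exact))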
Thesis Committee:
David P. Woodruff (Chair)
J. Zico Kolter
Additional Masters Thesis Information
Zoom Participation Enabled. See announcement.
For More Information:
tracyf@cs.cmu.edu