Theory Lunch Seminar - Shyamal Patel
— 1:00pm
Location:
In Person, Gates Hillman 8102
Speaker:
SHYAMAL PATEL,
Ph.D. Student, Theory of Computation Group, Columbia University
https://sites.google.com/view/shyamal-patel/home
A fundamental task in learning theory is to understand the computational complexity of learning k-term DNFs in the distribution-free PAC learning model. Unfortunately, given only random examples, a variety of lower bounds suggest that learning DNFs with a superconstant number of terms requires superpolynomial time. That said, work by Blum and Rudich showed that this barrier can be circumvented if we have query access to the function, proving that k-term DNFs can be learned in time n · 2^k in this setting.
In this talk, we’ll describe an algorithm to learn k-term DNFs in time n · 2^√k. To do so, we construct an adaptive set of features that allow us to represent the function as the sign of a low-degree polynomial over these features. We then use tools from junta testing and attribute-efficient learning to effectively reduce the number of variables and achieve our result.
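To give a sense of the representation idea, here is a toy sketch (not the talk's algorithm): a k-term DNF is the sign of a polynomial that is only degree 1 when written over the term-indicator "features". The DNF, term encoding, and variable names below are illustrative choices, not taken from the paper.

```python
import itertools

# A k-term DNF f(x) = T1(x) OR ... OR Tk(x) equals the sign of a
# degree-1 polynomial in the features y_i = T_i(x):
#   f(x) = 1  iff  sum_i y_i - 1/2 > 0.
# A term is a list of literals; a literal (j, b) requires x[j] == b.
terms = [[(0, 1), (1, 1)], [(2, 0)]]  # (x0 AND x1) OR (NOT x2), k = 2

def term_value(term, x):
    # Indicator of one conjunctive term.
    return int(all(x[j] == b for (j, b) in term))

def dnf(x):
    # Direct evaluation of the DNF as an OR of terms.
    return int(any(term_value(t, x) for t in terms))

def sign_of_poly(x):
    # Same function as the sign of a linear polynomial over features.
    features = [term_value(t, x) for t in terms]
    return int(sum(features) - 0.5 > 0)

# The two representations agree on all 2^3 inputs.
assert all(dnf(x) == sign_of_poly(x)
           for x in itertools.product([0, 1], repeat=3))
```

The algorithm in the talk works with an adaptively constructed feature set rather than the true terms, which are unknown to the learner; this sketch only illustrates why a good feature set makes the target a low-degree sign representation.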
Based on joint work with Josh Alman, Shivam Nadimpalli, and Rocco Servedio.
For More Information:
hfleisch@andrew.cmu.edu