Wednesday, June 29, 2022 - 2:30pm to 4:00pm
Location: Virtual Presentation - ET Remote Access - Zoom
Speaker: Tian Li, Ph.D. Student, Computer Science Department, Carnegie Mellon University (https://www.cs.cmu.edu/~litian/)
Scalable and Trustworthy Learning in Heterogeneous Networks
Federated learning stands to power next-generation machine learning applications by aggregating a wealth of knowledge from distributed data sources. However, federated networks introduce a number of challenges beyond traditional distributed learning scenarios. In addition to being accurate, federated methods must scale to potentially massive and heterogeneous networks of devices, and must exhibit trustworthy behavior, addressing pragmatic concerns related to issues such as fairness, robustness, and user privacy. In this thesis, we aim to address the practical challenges of federated learning in a principled fashion. We study how heterogeneity lies at the center of the constraints of federated learning, not only affecting the accuracy of the models but also competing with other critical metrics such as fairness, robustness, and privacy. To address these metrics, we develop new, scalable federated learning objectives and algorithms that rigorously account for and address sources of heterogeneity. Although our work is grounded in the application of federated learning, we show that many of the techniques and fundamental tradeoffs extend well beyond this use case.

Thesis Committee:
Virginia Smith (Chair)
Tianqi Chen
Ameet Talwalkar
H. Brendan McMahan (Google Research)
Dawn Song (University of California, Berkeley)

Zoom Participation. See announcement.
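For readers unfamiliar with the setting, the heterogeneity-aware methods the abstract describes build on a standard baseline: federated averaging, where each client trains on its own local data and a server aggregates the resulting models, weighted by local dataset size. The sketch below is not from the thesis itself; it is a minimal illustration of that baseline on a toy linear-regression problem, with client sizes and data distributions deliberately made heterogeneous. All function names and parameters here are illustrative assumptions.

```python
# Illustrative sketch (not from the announcement): federated averaging,
# the baseline aggregation scheme that heterogeneity-aware federated
# methods extend. Each client updates a model on its own local data;
# the server averages the updates, weighted by local dataset size.
import numpy as np

def local_step(w, X, y, lr):
    """One gradient step of least-squares regression on a client's local data."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fedavg_round(w_global, clients, lr):
    """One communication round: local updates, then a size-weighted average."""
    updates = [local_step(w_global.copy(), X, y, lr) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    weights = sizes / sizes.sum()
    return sum(p * u for p, u in zip(weights, updates))

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])

# Heterogeneous clients: different dataset sizes and feature scales.
clients = []
for n, scale in [(50, 1.0), (200, 3.0)]:
    X = rng.normal(scale=scale, size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(200):
    w = fedavg_round(w, clients, lr=0.05)
```

Note that the size-weighted average implicitly favors the larger client; the fairness- and heterogeneity-aware objectives mentioned in the abstract are motivated by exactly this kind of imbalance.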