Monday, May 2, 2022 - 9:30am to 10:30am
Location: In Person, Blelloch-Skees Conference Room, Gates Hillman 8115
Speaker: LICHEN ZHANG, Master's Student, Computer Science Department, Carnegie Mellon University
https://www.andrew.cmu.edu/user/lichenz/
Speeding Up Optimizations via Data Structures: Faster Search, Sample and Maintenance
In this thesis, we present novel techniques for solving various fundamental discrete and continuous optimization problems through the use of highly efficient data structures.
- Sparsification: We obtain fast deterministic and randomized algorithms for spectral sparsification and its variants. We give a deterministic algorithm that constructs a linear-sized spectral sparsifier in time O(d^{ω+1}), where ω ≈ 2.37 is the exponent of matrix multiplication, breaking the Ω(d^4) barrier of all prior deterministic methods.
- Non-Convex Optimization: We present the first algorithm to train deep and over-parameterized neural networks in truly sub-quadratic time. It has a training cost of O(m^{2.25−α}), where m is the width of the network and α ≈ 0.31 is the dual exponent of matrix multiplication.
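As illustrative backdrop for the sparsification result above, here is a minimal sketch of the classical randomized approach, leverage-score row sampling. This is the standard textbook technique, not the deterministic algorithm from the thesis; the function name and sample counts are chosen for illustration only.

```python
import numpy as np

# Hedged sketch: sample rows of A with probability proportional to their
# leverage scores, reweighting so the sampled matrix B satisfies
# B^T B ≈ A^T A spectrally with far fewer rows.
def leverage_score_sparsify(A, num_samples, seed=None):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    # Leverage score of row i is a_i^T (A^T A)^+ a_i, computed via the
    # thin SVD: it equals the squared norm of row i of U.
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    scores = np.sum(U**2, axis=1)          # sums to rank(A) <= d
    probs = scores / scores.sum()
    idx = rng.choice(n, size=num_samples, p=probs)
    # Reweight each sampled row by 1/sqrt(num_samples * p_i) so that
    # E[B^T B] = A^T A.
    return A[idx] / np.sqrt(num_samples * probs[idx])[:, None]

rng = np.random.default_rng(0)
A = rng.standard_normal((2000, 20))
B = leverage_score_sparsify(A, 800, seed=1)
# B has 800 rows instead of 2000, yet B^T B approximates A^T A.
```

With m samples the spectral error scales roughly like sqrt(d log d / m), which is why linear-sized (in d) sparsifiers are the interesting regime.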
The main theme of these improvements is the novel adaptation of data structures within different iterative processes. We show that for various optimization problems, we can frame their iterations as solving certain data structure problems. We then design data structures that are efficient and adaptively robust to realize such speedups.

Thesis Committee:
Gary Miller (Chair)
Anil Ada
Zhao Song (Adobe Research)
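To give a flavor of framing an iteration's inner step as a data structure problem (a generic illustration, not an algorithm from the thesis): many iterative methods repeatedly (a) sample an index with probability proportional to a maintained weight and (b) update one weight. A naive rescan costs O(n) per iteration, while a Fenwick (binary indexed) tree supports both operations in O(log n), so the data structure, not the math, delivers the speedup.

```python
import random

# Hedged sketch: a Fenwick tree supporting O(log n) weight updates and
# O(log n) weighted sampling, the "sample and maintenance" pattern.
class SamplingTree:
    def __init__(self, weights):
        self.n = len(weights)
        self.tree = [0.0] * (self.n + 1)
        self.w = [0.0] * self.n
        for i, x in enumerate(weights):
            self.update(i, x)

    def update(self, i, new_weight):
        # Propagate the change to all Fenwick nodes covering index i.
        delta = new_weight - self.w[i]
        self.w[i] = new_weight
        j = i + 1
        while j <= self.n:
            self.tree[j] += delta
            j += j & (-j)

    def total(self):
        s, j = 0.0, self.n
        while j > 0:
            s += self.tree[j]
            j -= j & (-j)
        return s

    def sample(self, rng):
        # Walk down the implicit tree to find the item whose cumulative
        # weight range contains a uniform draw from [0, total).
        target = rng.random() * self.total()
        pos, bit = 0, 1 << self.n.bit_length()
        while bit:
            nxt = pos + bit
            if nxt <= self.n and self.tree[nxt] < target:
                target -= self.tree[nxt]
                pos = nxt
            bit >>= 1
        return pos  # 0-indexed item

rng = random.Random(0)
t = SamplingTree([1.0, 0.0, 3.0])
t.update(1, 6.0)  # weights are now [1, 6, 3]
counts = [0, 0, 0]
for _ in range(10000):
    counts[t.sample(rng)] += 1
# index 1 should be drawn about 60% of the time
```

Replacing a linear scan by such a structure inside every iteration is the kind of reframing the abstract alludes to, though the thesis's data structures must additionally be robust to adaptively chosen queries.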