Theory Lunch Seminar - David Wu

October 23, 2024, 12:00pm - 1:00pm
Location: In Person - ASA Conference Room, Gates Hillman 6115
Speaker: David Wu, Ph.D. student, Electrical Engineering and Computer Sciences, University of California, Berkeley
https://davidxwu.github.io/

Locally stationary distributions: inference and optimization beyond rapid mixing

Markov chain sampling algorithms such as MCMC are an important algorithmic primitive for inference, optimization, and statistical estimation in high-dimensional settings. To guarantee downstream performance, it suffices for MCMC to converge, or mix, to a target distribution. Unfortunately, in many relevant scenarios, the mixing time of canonical MCMC algorithms is exponential in the dimension, rendering this theoretical guarantee impractical.

In this talk, we study Markov chains with potentially slow mixing times, based on an analogy to stationary points in optimization. In particular, under mild assumptions, any reversible Markov chain converges to a “locally stationary” distribution in polynomially many steps. Such locally stationary distributions enjoy various properties that make them amenable to analysis in interesting inference and optimization settings. Using this framework, we prove that canonical MCMC algorithms with exponential mixing time can nevertheless recover (1) large independent sets in triangle-free graphs and (2) good estimates of the hidden communities in a 2-community stochastic block model. Our analysis reveals a connection between a certain MCMC algorithm and the ubiquitous power iteration for spiked matrix models.

Based on joint work with Kuikui Liu (MIT), Sidhanth Mohanty (MIT), Amit Rajaraman (MIT), and Prasad Raghavendra (UC Berkeley). (arXiv:2405.20849)

Event Website: https://www.cs.cmu.edu/~theorylunch/
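
As a rough illustration of the kind of Markov chain dynamics the abstract refers to, here is a minimal Python sketch of Glauber dynamics for the hardcore (independent set) model, run on a triangle-free graph. This is an illustrative toy, not the chain or analysis from the paper; the bipartite random graph, the fugacity lam, and the step count are all made-up choices.

    import numpy as np

    rng = np.random.default_rng(1)

    # A random bipartite graph is triangle-free (it has no odd cycles).
    n_left, n_right, p = 50, 50, 0.1
    n = n_left + n_right
    adj = [set() for _ in range(n)]
    for u in range(n_left):
        for v in range(n_left, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)

    in_set = np.zeros(n, dtype=bool)   # current independent set
    lam = 2.0                          # fugacity: larger values favor bigger sets

    for _ in range(20000):
        v = int(rng.integers(n))
        if any(in_set[u] for u in adj[v]):
            in_set[v] = False          # an occupied neighbor forces v out
        else:
            # Heat-bath update: occupy v with probability lam / (1 + lam).
            in_set[v] = rng.random() < lam / (1.0 + lam)

    print("independent set size:", int(in_set.sum()), "of", n, "vertices")

On hard instances, chains like this can take exponentially long to mix; the framework in the talk instead asks what the chain's distribution looks like after only polynomially many steps.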
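The abstract also mentions power iteration for spiked matrix models. Below is a minimal self-contained sketch of that primitive on a spiked Wigner matrix; the model Y = (lam/n) x x^T + W/sqrt(n), the signal-to-noise ratio lam, and the iteration count are standard textbook assumptions rather than details taken from the talk.

    import numpy as np

    rng = np.random.default_rng(0)
    n, lam = 2000, 2.0

    # Spiked Wigner model: rank-one signal plus symmetric Gaussian noise.
    x = rng.choice([-1.0, 1.0], size=n)              # hidden +/-1 spike
    G = rng.normal(size=(n, n))
    W = (G + G.T) / np.sqrt(2)                       # symmetric noise matrix
    Y = (lam / n) * np.outer(x, x) + W / np.sqrt(n)

    # Power iteration: repeatedly apply Y and renormalize.
    v = rng.normal(size=n)
    v /= np.linalg.norm(v)
    for _ in range(200):
        v = Y @ v
        v /= np.linalg.norm(v)

    # Overlap with the hidden spike (the sign of v is arbitrary).
    print(f"overlap with hidden spike: {abs(v @ x) / np.sqrt(n):.3f}")

For this normalization, power iteration recovers a vector correlated with the spike once lam exceeds the spectral threshold of 1 (the BBP transition).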