Computer Science Speaking Skills Talk

Tuesday, July 20, 2021 - 2:30pm to 3:30pm


Virtual Presentation (ET) - Remote Access via Zoom



Exponentially Improved Dimensionality Reduction for l1

Despite its many applications, dimensionality reduction in the l1 norm is much less understood than in the Euclidean norm. We design a distribution over r x n random matrices S, where r = exp(poly(d, 1/epsilon, 1/delta)), such that given any n x d matrix A, with probability at least 1 - delta, simultaneously for all x, ||SAx||_1 = (1 ± epsilon) ||Ax||_1. Note that S is linear, does not depend on A, and maps l1 into l1. Our distribution provides an exponential improvement on the previous best known map of Wang and Woodruff (SODA 2019), which required r = exp(exp(Omega(d))) even for constant epsilon and delta. Our bound is optimal up to a polynomial factor in the exponent, given a known exp(poly(d)) lower bound for constant epsilon and delta.
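The quantity being controlled is the distortion of the sketch: how far the ratio ||SAx||_1 / ||Ax||_1 can stray from 1 over all x in the column space of A. The sketch below illustrates measuring this empirically. It uses a dense Cauchy sketch, a classical baseline for l1 embeddings; this is an assumed stand-in for illustration only, not the construction from the talk, and the dimensions n, d, r and the scaling are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 100, 5, 50

# A fixed tall data matrix; the sketch S is drawn independently of A,
# mirroring the "oblivious" property of the map in the abstract.
A = rng.standard_normal((n, d))

# Dense Cauchy sketch (illustrative baseline, NOT the talk's construction).
# Cauchy entries are 1-stable, so S @ A @ x has entries distributed like
# ||Ax||_1 times a standard Cauchy; the 1/r scaling is a heuristic choice.
S = rng.standard_cauchy((r, n)) / r

def distortion(A, S, trials=200):
    """Empirically estimate min and max of ||SAx||_1 / ||Ax||_1
    over random directions x."""
    ratios = []
    for _ in range(trials):
        x = rng.standard_normal(A.shape[1])
        ratios.append(np.abs(S @ A @ x).sum() / np.abs(A @ x).sum())
    return min(ratios), max(ratios)

lo, hi = distortion(A, S)
print(f"observed ratio range: [{lo:.3f}, {hi:.3f}]")
```

A (1 ± epsilon) embedding of the kind in the abstract would force this observed range into [1 - epsilon, 1 + epsilon] for all x simultaneously; the heavy tails of the Cauchy baseline make its upper range much looser, which is part of why achieving (1 ± epsilon) with a linear l1-to-l1 map is difficult.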

Presented in Partial Fulfillment of the CSD Speaking Skills Requirement.

Zoom Participation. See announcement.

For More Information, Contact:
Speaking Skills