New Algorithms for Multidimensional Scaling (via Zoom)

Abstract: Multidimensional scaling (MDS) is a nonlinear dimension reduction technique, originating in data visualization. Given a set of positive dissimilarities d_ij (target distances), one for each pair of elements i,j \in [n], the goal is to find a set of points x_1, ..., x_n \in R^d minimizing \sum_{i<j} (1 - ||x_i - x_j|| / d_ij)^2. Heuristic methods to optimize this objective function are widely used in practice (with d = 2 or 3) to create visualizations of graphs and other similarity-based datasets, but the only efficient algorithms with provable guarantees run in time exponential in the aspect ratio Delta = max d_ij / min d_ij. We provide the first efficient algorithms with provable guarantees for large aspect ratios. Along the way, we design and analyze a new rounding scheme for a Sherali-Adams-style linear programming relaxation of the MDS objective.
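The objective and the aspect-ratio parameter from the abstract can be made concrete with a short sketch. This is an illustrative restatement in plain Python, not code from the paper; the function names are my own, and d is assumed to be a symmetric matrix of pairwise target distances.

```python
import math

def mds_objective(points, d):
    # The stress objective from the abstract:
    # sum over pairs i < j of (1 - ||x_i - x_j|| / d_ij)^2.
    # points: list of coordinate tuples; d: symmetric matrix of positive d_ij.
    n = len(points)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            dist = math.dist(points[i], points[j])
            total += (1.0 - dist / d[i][j]) ** 2
    return total

def aspect_ratio(d):
    # Delta = max d_ij / min d_ij over off-diagonal pairs -- the parameter
    # in which previously known algorithms' running time is exponential.
    n = len(d)
    offdiag = [d[i][j] for i in range(n) for j in range(i + 1, n)]
    return max(offdiag) / min(offdiag)
```

If the d_ij are exactly realizable in R^d (e.g., three points on a line at mutual distances 1, 2, 3), the optimal stress is 0; any perturbation of an exact embedding makes the objective strictly positive.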
Joint work with: Ainesh Bakshi (MIT), Vincent Cohen-Addad (Google), Rajesh Jayaram (Google), Silvio Lattanzi (Google)

Bio: Sam Hopkins: algorithms, theory of machine learning, semidefinite programming, sum of squares method, bicycles. he/him

"I am a mathematician and computer scientist. I am employed as an Assistant Professor at MIT, in the Theory of Computing group in the Department of Electrical Engineering and Computer Science, where I hold the Jamieson Career Development Chair.
Previously, I was a Miller fellow in the theory group at UC Berkeley, hosted by Prasad Raghavendra and Luca Trevisan. Before that, I got my PhD at Cornell, advised by David Steurer. My papers on Google Scholar and some talks on YouTube. I like bikes."