New Algorithms for Multidimensional Scaling (via Zoom)
Abstract: Multidimensional scaling (MDS) is a nonlinear dimension reduction technique, originating in data visualization. Given a set of positive similarity scores d_{ij}, one for each pair of elements i, j \in [n], the goal is to find a set of points x_1, ..., x_n \in R^d minimizing \sum_{i<j} (1 - ||x_i - x_j|| / d_{ij})^2. Heuristic methods to optimize this objective function are widely used in practice (with d = 2 or 3) to create visualizations of graphs and other similarity-based datasets, but the only efficient algorithms with provable guarantees run in time exponential in the aspect ratio \Delta = \max d_{ij} / \min d_{ij}. We provide the first efficient algorithms with provable guarantees for large aspect ratios. Along the way, we design and analyze a new rounding scheme for a Sherali-Adams-style linear programming relaxation of the MDS objective.
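To make the objective concrete, here is a minimal sketch of the kind of heuristic local optimization the abstract refers to: plain gradient descent on the stress \sum_{i<j} (1 - ||x_i - x_j|| / d_{ij})^2. This illustrates only the widely used heuristic, not the talk's new algorithm with provable guarantees; all function names and parameter choices below are illustrative.

```python
import numpy as np

def mds_stress(X, D):
    """Objective from the abstract: sum over pairs i<j of (1 - ||x_i - x_j|| / d_ij)^2."""
    n = len(X)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            total += (1.0 - np.linalg.norm(X[i] - X[j]) / D[i, j]) ** 2
    return total

def mds_gradient_descent(D, d=2, steps=1000, lr=0.02, seed=0):
    """Heuristic local search (no guarantees): gradient descent from a random start.

    D is an n x n matrix of positive target similarities d_ij (off-diagonal).
    """
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    X = rng.standard_normal((n, d))  # random initial embedding in R^d
    for _ in range(steps):
        G = np.zeros_like(X)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                diff = X[i] - X[j]
                dist = np.linalg.norm(diff) + 1e-12  # avoid division by zero
                # Gradient of (1 - dist/d_ij)^2 with respect to x_i:
                #   -2 * (1 - dist/d_ij) * diff / (dist * d_ij)
                G[i] += -2.0 * (1.0 - dist / D[i, j]) * diff / (dist * D[i, j])
        X -= lr * G
    return X
```

On a small realizable instance (e.g. the pairwise distances of the four corners of a unit square, with d = 2) this typically drives the stress close to zero, but as the abstract notes, such heuristics can get stuck in local minima and carry no worst-case guarantees, especially as the aspect ratio \Delta grows.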
Joint work with: Ainesh Bakshi (MIT), Vincent Cohen-Addad (Google), Rajesh Jayaram (Google), Silvio Lattanzi (Google)
Bio: Sam Hopkins works on algorithms, the theory of machine learning, semidefinite programming, and the sum-of-squares method, and enjoys bicycles. (he/him)
"I am a mathematician and computer scientist. I am employed as an Assistant Professor at MIT, in the Theory of Computing group in the Department of Electrical Engineering and Computer Science, where I hold the Jamieson Career Development Chair.
Previously, I was a Miller Fellow in the theory group at UC Berkeley, hosted by Prasad Raghavendra and Luca Trevisan. Before that, I got my PhD at Cornell, advised by David Steurer. My papers are on Google Scholar and some of my talks are on YouTube. I like bikes."