Title: Efficient Local and Global Causal Discovery: Methods Leveraging Causal Substructures for Improved Finite Sample Performance
Abstract: In this talk, we introduce two complementary methods for local and global causal discovery that leverage causal substructures for improved finite-sample performance. The first method, under an additive noise model (ANM) setting, exploits ancestral relationships to produce a more informative topological ordering than traditional linear orderings, generalizing to nonlinear causal relationships. The second method is a constraint-based approach that focuses on efficiently identifying valid adjustment sets (VAS) for confounding control, without relying on parametric or pretreatment assumptions. Both methods offer theoretical guarantees, run in polynomial time, and are empirically validated on synthetic data. Together, they highlight how harnessing local and global structures can reduce computational overhead, enhance accuracy in identifying causal edges, and improve downstream inference in observational studies.
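For context, here is a minimal sketch of the two standard formulations the abstract builds on: the additive noise model and backdoor adjustment with a valid adjustment set. The notation below is generic textbook material and is assumed for illustration only; it is not taken from the speaker's specific methods.

% Additive noise model (ANM): each variable is a (possibly nonlinear)
% function of its parents plus noise that is independent of the parents.
X_j = f_j\bigl(X_{\mathrm{pa}(j)}\bigr) + \varepsilon_j,
\qquad \varepsilon_j \perp\!\!\!\perp X_{\mathrm{pa}(j)}

% Backdoor adjustment: a valid adjustment set Z for treatment T and
% outcome Y identifies the interventional distribution from
% observational data.
P\bigl(Y \mid \mathrm{do}(T = t)\bigr)
  = \sum_{z} P\bigl(Y \mid T = t, Z = z\bigr)\, P(Z = z)

Roughly speaking, a correct topological ordering restricts each variable's candidate parents to its predecessors in the ordering, and once a valid adjustment set is identified, downstream effect estimation reduces to the adjustment formula above.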
Bio: Kyra Gan is an assistant professor in the School of Operations Research and Information Engineering (ORIE) at Cornell Tech, Cornell University. She is a field member of ORIE, the Center for Applied Mathematics (CAM), and Statistics, and is also affiliated with the WCM Institute of AI for Digital Health. Prior to joining Cornell Tech, she was a postdoctoral fellow in the Statistical Reinforcement Learning Lab at Harvard University. She received her Ph.D. in Operations Research from Carnegie Mellon University in 2022.
Her research focuses on the design of adaptive and online algorithms for personalized treatment, such as micro-randomized trials and N-of-1 trials, in constrained settings. She is also interested in robust and efficient statistical inference for data collected in both adaptive and nonadaptive settings, robust and scalable causal discovery methods, and improving fairness in healthcare treatment outcomes.