The programming languages research group at Cornell includes eight faculty and over two dozen Ph.D. students. We are proud of both our breadth and depth in this core discipline. Cornell has been known from the beginning for its research in programming languages. We have made foundational contributions to type theory, automated theorem proving, and language semantics. A more recent theme has been language-based solutions to important problems such as computer security, networking, and distributed programming. Cornell researchers have also contributed to language implementation, program analysis and optimization, domain-specific languages, and software engineering.
See the PL group's site for news and a full list of people involved in PL research.
Robert Constable researches programming languages and formal methods in the context of type theory. The Nuprl proof assistant, developed by Constable and his group, is a dependently-typed language that can be used to describe distributed computing, as a formal specification language for computing tasks, and as a theory for formalizing topics in constructive and intuitionistic mathematics (of which classical mathematics can usually be seen as a special case). Constable is also interested in synthesizing programs and concurrent processes from proofs, developing systems that can be shown to be secure by construction, and exploring the deep connections between programming and logic.
Nate Foster works on language design, semantics, and implementation. In the past, he worked on languages and type systems for data processing, including bidirectional languages and data provenance. More recently, he has been developing the Frenetic language, which provides high-level constructs for specifying the behavior of networks. Frenetic makes it possible for programmers to specify the behavior of an entire network using a single program that a compiler translates to low-level code for the underlying devices. This provides opportunities for enforcing security, reliability, and performance guarantees using language-based techniques.
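The idea of compiling one network-wide program down to per-device rules can be illustrated with a tiny policy language in the style of NetKAT, the algebra underlying Frenetic. The names and encoding below are illustrative sketches, not the actual Frenetic API: packets are dictionaries, predicates filter, modifications rewrite a field, and policies compose.

```python
# A minimal sketch of NetKAT-style policies (illustrative only; not Frenetic's API).
# A policy maps one input packet to a list of output packets.

def match(field, value):
    """Predicate: pass the packet through unchanged if field == value, else drop it."""
    return lambda pkt: [pkt] if pkt.get(field) == value else []

def modify(field, value):
    """Modification: rewrite one header field."""
    return lambda pkt: [{**pkt, field: value}]

def seq(p, q):
    """Sequential composition: feed each of p's output packets into q."""
    return lambda pkt: [out for mid in p(pkt) for out in q(mid)]

def union(p, q):
    """Parallel composition: send a copy of the packet through both policies."""
    return lambda pkt: p(pkt) + q(pkt)

# One program for the whole network: HTTP traffic out port 2, SSH out port 3.
policy = union(
    seq(match("tcp_dst", 80), modify("port", 2)),
    seq(match("tcp_dst", 22), modify("port", 3)),
)

http = {"tcp_dst": 80, "port": 1}
print(policy(http))  # → [{'tcp_dst': 80, 'port': 2}]
```

A compiler for such a language can translate the composed policy into flow-table entries for each switch, which is what makes whole-network reasoning about security and reliability tractable.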
Dexter Kozen has interests that span a variety of topics at the boundary of computer science and mathematics including the design and analysis of algorithms, computational complexity, decision problems in logic and algebra, and logics and semantics of programming languages. Kozen obtained a number of foundational results for Kleene algebras with tests, and developed applications to efficient code certification and compiler verification. Recently he has been investigating capsules, which provide a clean algebraic representation of state in higher-order functional and imperative languages with mutable bindings, and coalgebraic techniques in verification.
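In a Kleene algebra with tests, programs are algebraic terms and boolean conditions are tests, so facts about control flow become equations. A standard encoding of conditionals and loops, with a one-line equational proof, looks like this (standard KAT notation, independent of any particular tool):

```latex
\begin{align*}
\textbf{if } b \textbf{ then } p \textbf{ else } q
  &\;\triangleq\; b\,p + \bar{b}\,q \\
\textbf{while } b \textbf{ do } p
  &\;\triangleq\; (b\,p)^{*}\,\bar{b} \\[4pt]
\text{Example:}\quad
b\,p + \bar{b}\,p &= (b + \bar{b})\,p = 1 \cdot p = p
\end{align*}
```

The example shows that "if $b$ then $p$ else $p$" is provably equal to $p$ by pure equational reasoning, the style of argument that underlies KAT-based code certification and compiler verification.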
Greg Morrisett focuses on the application of programming language technology for building secure, reliable, and high-performance software systems. A common theme is the focus on systems-level languages and tools that can help detect or prevent common vulnerabilities in software. Past examples include typed assembly language, proof-carrying code, software fault isolation, and control-flow isolation. More recently, his research has focused on building provably correct and secure software, including cryptographic schemes, machine learning systems, and compilers.
Andrew Myers designs and extends languages to solve cross-cutting problems such as computer security. The Jif language integrates information flow control into Java, and Fabric extends Jif for building secure, decentralized distributed systems. The challenges posed by Jif and Fabric have driven work on language-based methods for controlling timing channels at the software and hardware levels, such as in the SecVerilog hardware description language that was used to design the HyperFlow processor. This work has also motivated work on language support for extension and evolution of large software systems, such as in our Genus and Familia languages.
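The core rule Jif enforces can be illustrated with a toy checker. Jif itself is a Java extension that checks labels statically at compile time; the dynamic Python sketch below (all names are hypothetical) only mimics the central constraint that information must never flow from a higher security label to a lower one.

```python
# Illustrative sketch of the information-flow rule behind Jif.
# Jif checks this statically in Java; this toy version checks it at runtime.

LOW, HIGH = 0, 1  # a two-point security lattice: LOW is below HIGH

class Labeled:
    """A value tagged with a security label."""
    def __init__(self, value, label):
        self.value, self.label = value, label

def assign(dest_label, src):
    """Permit the flow only if the source's label is at or below the destination's."""
    if src.label > dest_label:
        raise PermissionError("illegal flow: HIGH data into a LOW location")
    return Labeled(src.value, dest_label)

secret = Labeled(42, HIGH)
public = Labeled(7, LOW)

assign(HIGH, public)   # LOW → HIGH: allowed (upgrading is safe)
try:
    assign(LOW, secret)  # HIGH → LOW: rejected
except PermissionError as e:
    print(e)
```

Doing this check in the type system, as Jif does, means illegal flows are ruled out before the program ever runs, with no runtime labels to maintain.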
Adrian Sampson designs hardware–software abstractions. His work on approximate computing pairs new computer architectures with new programming language constructs to let programmers safely trade off small amounts of accuracy for large returns in efficiency. Challenges in approximate programming range from information-flow control for safety to probabilistic program analysis and compiler design. Sampson is curious about new ways to safely give programmers control over system details that are ordinarily hidden from view.
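The accuracy-for-efficiency trade-off can be seen in a small self-contained example. The sampling-based estimator below is an illustrative stand-in of my own devising, not an API from Sampson's systems, which instead express such choices through language constructs like type annotations that mark data as safe to approximate.

```python
# Sketch of the approximate-computing trade-off: do a fraction of the work,
# accept a small, bounded loss of accuracy. Illustrative only.
import random

def exact_mean(xs):
    """The precise answer: touches every element."""
    return sum(xs) / len(xs)

def approx_mean(xs, fraction=0.1, seed=0):
    """Estimate the mean from a random sample: ~10x less work, approximately right."""
    rng = random.Random(seed)
    k = max(1, int(len(xs) * fraction))
    sample = rng.sample(xs, k)
    return sum(sample) / k

data = list(range(1_000_000))
print(exact_mean(data))   # → 499999.5
print(approx_mean(data))  # close to the exact mean, from 10% of the data
```

A language-level treatment would let the programmer declare which results may be approximated and have the compiler verify, via information-flow-style analysis, that approximation never corrupts precise data such as pointers or control decisions.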
Fred Schneider applies programming logics, semantics, and language design to his research on cybersecurity. Recently, he has been working on a "logic of belief" for characterizing authorization policies; this approach is implemented in the Nexus operating system, developed at Cornell. Other examples of his recent work include his characterization of classes of security policies as hyperproperties and his proof that program obfuscation (e.g., address space re-ordering) can be as effective as type checking for defending against cyber-attacks.
Ross Tate works on problems related to language design and formalization including type systems, optimizations, and domain-specific extensions. His work draws on fields such as category theory and constructive type theory to develop powerful and flexible solutions. His research is being put into practice through industry collaborations with Red Hat and JetBrains on the design of the Ceylon and Kotlin languages.