University College London
We present Prognosis, a framework offering automated black-box learning and analysis of models of network protocol implementations. Prognosis can learn models at varying levels of abstraction, from simple deterministic automata to models containing data operations such as register updates, and these models unlock a variety of analysis techniques: model checking temporal properties, computing differences between models of two implementations of the same protocol, or improving testing via model-based test generation. Prognosis is modular and easily adaptable to different protocols (e.g., TCP and QUIC) and their implementations. We use Prognosis to learn models of (parts of) three QUIC implementations (Quiche from Cloudflare, Google QUIC, and Facebook mvfst) and use these models to analyze the differences between the various implementations. Our analysis provides insights into different design choices and uncovers potential bugs. Concretely, we have found critical bugs in multiple QUIC implementations, which have been acknowledged by the developers. This is joint work with Tiago Ferreira (UCL), Harrison Brewton, and Loris D’Antoni (University of Wisconsin–Madison).
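To illustrate one of the analyses mentioned above, the sketch below compares two toy protocol models and finds the shortest input sequence on which they behave differently, by a breadth-first search over the product of the two machines. The models (MODEL_A, MODEL_B), their states, inputs, and outputs are all hypothetical stand-ins invented for this example; this is not Prognosis's actual algorithm or data, only a minimal illustration of model differencing.

```python
from collections import deque

# Toy Mealy machines standing in for two learned protocol models
# (hypothetical example states and messages, not real QUIC behavior).
# Each machine: dict mapping (state, input) -> (next_state, output).
MODEL_A = {
    ("idle", "connect"): ("open", "ack"),
    ("idle", "data"):    ("idle", "drop"),
    ("open", "data"):    ("open", "ack"),
    ("open", "close"):   ("idle", "fin"),
}
MODEL_B = {
    ("idle", "connect"): ("open", "ack"),
    ("idle", "data"):    ("idle", "drop"),
    ("open", "data"):    ("open", "ack"),
    ("open", "close"):   ("idle", "ack"),  # differs: replies "ack" instead of "fin"
}

def find_difference(m1, m2, start="idle", alphabet=("connect", "data", "close")):
    """Return the shortest input sequence on which the two models
    produce different outputs, or None if none is reachable."""
    queue = deque([((start, start), [])])
    seen = {(start, start)}
    while queue:
        (s1, s2), trace = queue.popleft()
        for a in alphabet:
            # Undefined inputs are ignored: stay in place, output "drop".
            n1, o1 = m1.get((s1, a), (s1, "drop"))
            n2, o2 = m2.get((s2, a), (s2, "drop"))
            if o1 != o2:
                return trace + [a]
            if (n1, n2) not in seen:
                seen.add((n1, n2))
                queue.append(((n1, n2), trace + [a]))
    return None

print(find_difference(MODEL_A, MODEL_B))  # -> ['connect', 'close']
```

A sequence like this, replayed against the two implementations, is exactly the kind of concrete witness that turns a model-level difference into a reproducible test case.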
Alexandra Silva is a theoretical computer scientist whose main research focuses on the semantics of programming languages and the modular development of algorithms for computational models. Much of her work uses the unifying perspective offered by coalgebra, a mathematical framework established over the last few decades. Alexandra is currently a Professor of Algebra, Semantics, and Computation at University College London and will be joining Cornell University in the fall of 2021. Previously, she was an assistant professor in Nijmegen, a postdoc at Cornell University with Prof. Dexter Kozen, and a PhD student at the Dutch national research center for Mathematics and Computer Science (CWI), under the supervision of Prof. Jan Rutten and Dr. Marcello Bonsangue. She is the recipient of an ERC Consolidator Grant in 2020, the Royal Society Wolfson Award in 2019, the Needham Award in 2018, the Presburger Award in 2017, the Leverhulme Prize in 2016, and an ERC Starting Grant in 2015.
Federal University of Minas Gerais (UFMG), Brazil
Hundreds of programming languages have been designed in the last sixty years, and many of them are still widely used. There is a consensus that constructing nontrivial pieces of software is a very complex endeavor: it involves problem analysis, documentation, program design, software implementation, and testing. In addition, in software development processes there is no substitute for experience, a preference for simplicity, programming expertise, intelligence, and, most of all, the expressive power of the programming language that supports the implementation process. It is also widely accepted that the most effective way of taming programming complexity is a divide-and-conquer approach, which requires adequate programming notation to represent the software components that naturally arise from this process; in general, this requires defining mappings from real-world entities to program objects. Moreover, since the early days, each programming language in use has reflected part of a continuous effort to shorten the semantic gap between the real world and the programming environment, so as to simplify the code that must be written. In particular, most languages have been designed to address specific advances in implementation methodologies and the associated development practices.
This talk highlights the most relevant advances in the design of programming language constructs and the important software development demands they attempt to model. Proceeding chronologically from the creation of Fortran to modern times, it positions, on a timeline, the linguistic responses that have been devised to cope with each methodological demand. The goal of this discussion is to show how some currently available systems programming environments were constructed, and to set up a basis for envisioning and anticipating the next steps in language design to meet recent methodological advances.
Roberto S. Bigonha is Professor Emeritus at the Federal University of Minas Gerais (UFMG), Brazil. He worked at the Computer Science Department from 1974 until 2013. He received his Ph.D. in Computing Science in 1981 from the Department of Computer Science of the University of California at Los Angeles, USA. He is the recipient of the 2009 SBC Newton Faller Award for his contribution to the consolidation of the Computer Science area in Brazil. His research interests include automatic compiler generation, formal programming language semantics, object-oriented programming, and software engineering metrics. He has supervised many MSc and PhD students at UFMG.