SCOT: A Seminar on Semantic and Formal Approaches to Complexity
The SCOT Seminar is devoted to the problem of reasoning about the complexity of programs
in formal and compositional ways. Many approaches have been explored to this end, drawing
on logic, category theory, denotational semantics, type systems, interpretations, etc.
This seminar aims to provide a forum for discussion of all issues related to these questions,
from foundational aspects of the semantics of complexity to automated time or space complexity
analysis.
The seminar is held monthly, online.
A mailing list announcing seminars is available.
Please subscribe to it if you are interested in staying informed about future seminars.
List of Seminars

Tuesday, March 15th 2022, 3:00pm to 4:00pm (CET)
Niki Vazou (IMDEA Madrid). Liquidate your Assets.
Liquid Haskell is an extension of Haskell's type system that allows annotating
types with refinement predicates. It is great for ensuring the correctness of your code,
but it can also be used to improve the performance of your code.
If you track your resources, then Liquid Haskell can be used to statically bound
the resources needed at runtime, thus statically deciding how performant your
code is. You are liquidating your assets. To track resources we define a
Tick monad that ticks each time a resource (ranging from recursive calls to thunks)
is used. Then we use refinement types to statically approximate the number of
ticks that can occur at runtime. This reasoning aids runtime code optimization,
since it can be used to compare the resource usage of two different programs.
In this talk I will present this technique through small examples (sorting algorithms
and mapping) and discuss how one can use refinement types to machine-check
(basic) complexity bounds.
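As a rough illustration of the cost-counting idea (not the speaker's Liquid Haskell code, and without the static refinement-type checking), the following Python sketch pairs each computation with a tick count; the `Tick` class, `step`, `insert`, and `isort` are hypothetical names chosen for this example.

```python
# A minimal cost-counting sketch, loosely inspired by the Tick monad idea:
# every computation carries the number of "ticks" (resource uses) it consumed.

class Tick:
    """A value paired with the number of resource uses spent computing it."""
    def __init__(self, value, cost=0):
        self.value = value
        self.cost = cost

    def bind(self, f):
        """Sequence another ticked computation, accumulating cost."""
        result = f(self.value)
        return Tick(result.value, self.cost + result.cost)

def step(value):
    """Consume one tick (e.g. one comparison or one recursive call)."""
    return Tick(value, cost=1)

def insert(x, xs):
    """Insert x into the sorted list xs, ticking once per comparison."""
    if not xs:
        return Tick([x])
    if x <= xs[0]:
        return step([x] + xs)
    return insert(x, xs[1:]).bind(lambda rest: step([xs[0]] + rest))

def isort(xs):
    """Insertion sort in the Tick 'monad'."""
    if not xs:
        return Tick([])
    return isort(xs[1:]).bind(lambda rest: insert(xs[0], rest))

xs = [5, 3, 4, 1, 2]
t = isort(xs)
n = len(xs)
assert t.value == sorted(xs)
assert t.cost <= n * (n - 1) // 2   # the quadratic bound one would prove statically
```

In Liquid Haskell the inequality checked dynamically in the last line would instead be expressed as a refinement on the type of `isort` and discharged at compile time.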

Tuesday, February 22nd 2022, 3:00pm to 4:00pm (CET)
Gianluca Curzi (University of Birmingham). Cyclic Implicit Complexity.
Circular (or cyclic) proofs have received increasing attention in recent years,
and have been proposed as an alternative setting for studying (co)inductive
reasoning. In particular, several type systems based on circular reasoning
have by now been proposed. However, little is known about the complexity-theoretic
aspects of circular proofs, which exhibit sophisticated loop structures atypical of more
common 'recursion schemes'. This talk attempts to bridge the gap between circular
proofs and implicit computational complexity. Namely, we introduce a circular proof
system based on Bellantoni and Cook's famous safe-normal function algebra, and we
identify suitable proof-theoretical constraints to characterise the polynomial-time
and elementary computable functions.
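To give a flavour of the safe-normal discipline that the talk builds on (this sketch illustrates Bellantoni and Cook's function algebra only, not the circular proof system, and uses unary numerals for readability where the original works with binary notation):

```python
# Bellantoni-Cook style definitions with the normal/safe split made explicit:
# normal arguments drive recursion, safe arguments may only be consumed, never
# recursed on. The recursive call's result lands in a safe position, which is
# what keeps the scheme from blowing up beyond polynomial time.

def succ(a):                      # a basic function acting on a safe argument
    return a + 1

def add(x, a):                    # add(x; a): x is normal, a is safe
    # Predicative recursion: recurse on the normal argument x only;
    # the previous value add(x-1; a) is used in a safe position (via succ).
    if x == 0:
        return a
    return succ(add(x - 1, a))

def mul(x, y):                    # mul(x, y;): both arguments normal
    # The step function is add, which receives the recursive result
    # only in its safe slot -- the composition remains predicative.
    if x == 0:
        return 0
    return add(y, mul(x - 1, y))

assert add(3, 4) == 7
assert mul(3, 4) == 12
```

The point of the discipline is visible in `mul`: one cannot recurse on the result of a previous recursion, which is exactly what rules out exponentiation.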

Tuesday, January 11th 2022, 3:00pm to 4:00pm (CET)
Alexis Ghyselen (University of Bologna). Sized Types for Parallel Complexity in the Pi-Calculus.
Type systems as a technique to analyse or control programs have been extensively studied
for functional programming languages. In particular, some sized type systems make it
possible to extract from a typing derivation a complexity bound on the program. In this talk,
we explore how to extend such results to parallel time complexity in the setting of
the pi-calculus, considered as a communication-based model of parallel computation.
Two notions of time complexity are studied: the total computation time without
parallelism (the work) and the computation time under maximal parallelism (the span).
We define operational semantics to capture those two notions, and present type systems
from which one can extract a complexity bound on a process. The type systems rely on
sized types and on information about the parallel behaviour of channels, such as
input/output types, temporal information, or usages.
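The work/span distinction can be illustrated on a toy process language, far simpler than the pi-calculus of the talk (the constructors `Tick`, `Seq`, `Par` are hypothetical names for this sketch): work sums the cost of everything, while span takes the maximum over parallel branches, matching "maximal parallelism".

```python
from dataclasses import dataclass

@dataclass
class Tick:          # one unit of computation
    pass

@dataclass
class Seq:           # sequential composition
    left: object
    right: object

@dataclass
class Par:           # parallel composition
    left: object
    right: object

def work(p):
    """Total computation time without parallelism: sum over all ticks."""
    if isinstance(p, Tick):
        return 1
    return work(p.left) + work(p.right)

def span(p):
    """Computation time under maximal parallelism."""
    if isinstance(p, Tick):
        return 1
    if isinstance(p, Seq):
        return span(p.left) + span(p.right)
    return max(span(p.left), span(p.right))   # parallel branches overlap

# Two ticks in parallel, then one more tick:
p = Seq(Par(Tick(), Tick()), Tick())
assert work(p) == 3   # three ticks in total
assert span(p) == 2   # the parallel pair counts only once
```

The type systems of the talk derive such bounds statically from a typing derivation, rather than by evaluating the process.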

Tuesday, December 7th 2021, 3:00pm to 4:00pm (CET)
Malgorzata Biernacka (University of Wrocław). A Derivational Approach to the Construction of Efficient Abstract Machines.
In this talk I will present a systematic method for the construction of abstract
machines from high-level semantics implemented with a normalizing function, and
how it can be used to obtain efficient implementations of strong reduction strategies
in the lambda calculus. I will discuss abstract machines for strong call by value
and strong call by need, and how each of them simulates the corresponding reduction
strategy in a number of steps polynomial in the number of beta-reductions and in the
size of the initial term. This is joint work with Tomasz Drab and Witold Charatonik.

Tuesday, November 8th 2021, 3:00pm to 4:00pm (CET)
Martin Avanzini (INRIA Sophia Antipolis). On Continuation-Passing Transformations and Expected Cost Analysis.
In this talk I will present recent joint work with Ugo Dal Lago and Gilles Barthe, which
is concerned with a novel methodology for the cost analysis of randomized higher-order
programs, i.e., programs which can sample values from chosen distributions during
execution, and at the same time are capable of treating functions as first-class citizens.
The evaluation of such a program results in a distribution of values, and has an expected
cost, namely the average cost the program incurs along its execution. The way we tackle
such an expected cost analysis is reminiscent of the seminal work of Rosendahl (1989),
and lies in turning the program at hand into a second one, which is structurally quite
similar but additionally computes the cost of execution. This turns an intensional property,
namely the cost of execution, into an extensional one. A crucial aspect of this program
transformation is that probabilistic effects are eliminated along the way, thereby enabling
classical reasoning tools for non-probabilistic programs in our context. As one such tool we
propose a slight variation of a standard higher-order logic, dubbed EHOL, which we also use to
study some classical examples from probability theory.
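The core of a Rosendahl-style cost transformation can be sketched on a deterministic program (the talk's method additionally handles sampling and eliminates the probabilistic effects, which this tiny example does not attempt): the transformed function returns the original value paired with the cost, turning the cost into an ordinary, extensional output.

```python
# Original program: naive Fibonacci.
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Transformed program: structurally similar, but additionally computes the
# cost of execution (here: the number of calls the original would make).
def fib_cost(n):
    if n < 2:
        return n, 1
    v1, c1 = fib_cost(n - 1)
    v2, c2 = fib_cost(n - 2)
    return v1 + v2, 1 + c1 + c2

value, cost = fib_cost(10)
assert value == fib(10)   # the transformation preserves the computed value
```

Once the cost is an output like any other, ordinary program logics for (non-probabilistic) programs, such as the EHOL variant mentioned above, can be used to reason about it.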

Tuesday, October 5th 2021, 3:00pm to 4:00pm (CET)
Isabel Oitavem (Nova University Lisbon). A Recursion-Theoretic Characterization of the Probabilistic Class PP.
Probabilistic complexity classes, despite capturing the notion of feasibility, have escaped
any treatment by the tools of so-called implicit complexity. Their inherently semantic nature
is of course a barrier to the characterization of classes like BPP or ZPP, but not all
probabilistic classes are semantic. In this talk, we introduce a recursion-theoretic
characterization of the probabilistic class PP, using recursion schemata with pointers.
This is joint work with Ugo Dal Lago and Reinhard Kahle.

Tuesday, June 22nd 2021, 3:00pm to 4:00pm (CET)
Lars Kristiansen (University of Oslo). Implicit Characterisations of Complexity Classes by Inherently Reversible Programming Languages.
I will introduce a very natural inherently reversible programming
language called RBS (reversible bottomless stack programs). I will explain what I mean by an inherently reversible language.
RBS captures the complexity class ETIME, that is, a problem is in ETIME iff it can be decided by an RBS program. A variant of
RBS, called RBS', captures the complexity class P. I will also,
to the extent time permits, discuss higher order RBS programs.
The talk will not be very technical and should be suitable for
a broad audience. A related conference paper is available:
Kristiansen, Lars: Reversible programming languages. LNCS 12227 (2020), 111-127. doi: 10.1007/978-3-030-52482-1_6

Tuesday, May 18th 2021, 3:00pm to 4:00pm (CET)
Georg Moser (University of Innsbruck). Automated Analysis of Splaying et al.
Being able to argue about the performance of self-adjusting data structures such as
splay trees was a main objective when Sleator and Tarjan introduced the notion
of *amortised* complexity. Analysing these data structures requires sophisticated
potential functions, which typically contain logarithmic expressions. Possibly for
these reasons, and despite the recent progress in automated resource analysis,
they have so far eluded automation. In this talk, I will report on the first
fully automated amortised complexity analysis of self-adjusting data structures
and the underlying theory. Following earlier work, the analysis is based on
potential function templates with unknown coefficients. This is joint work
with Lorenz Leutgeb, David Obwaller and Florian Zuleger.
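The potential-method bookkeeping behind amortised analysis can be seen on the classic binary-counter example (splay trees need the far subtler logarithmic potentials mentioned above; this sketch only illustrates the accounting scheme). The potential is the number of 1-bits; the amortised cost of an increment is the actual bit flips plus the change in potential, which is always at most 2.

```python
def increment(bits):
    """Increment a little-endian bit list; return (new bits, flips performed)."""
    bits = bits[:]
    flips = 0
    i = 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0           # carry: flip a 1 to 0
        flips += 1
        i += 1
    if i == len(bits):
        bits.append(1)
    else:
        bits[i] = 1
    flips += 1                # the final 0 -> 1 flip
    return bits, flips

def potential(bits):
    return sum(bits)          # number of 1-bits stored in the counter

bits = [0]
for _ in range(100):
    old_phi = potential(bits)
    bits, actual = increment(bits)
    amortised = actual + potential(bits) - old_phi
    assert amortised <= 2     # every increment is amortised constant time
```

The automation reported in the talk searches for the unknown coefficients of such a potential function from a template, rather than having it supplied by hand as here.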

Tuesday, April 13th 2021, 3:00pm to 4:00pm (CET)
Cynthia Kop (Radboud University Nijmegen). Tuple Interpretations for Higher-Order Complexity.
A lot of work has been done in the study of the complexity of first-order term
rewriting systems. However, for higher-order term rewriting, techniques for complexity
analysis are sparse. In this presentation, we will discuss a method based on
interpretations. By mapping all base-type terms to tuples of integers, and terms of
higher type to functions over tuples, we obtain a powerful technique that can not only
be used to assess runtime complexity, but may also offer some first steps towards a
native complexity notion for higher-order systems.
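A stripped-down, hypothetical first-order instance of the idea (the talk's contribution is its extension to higher-order terms): list terms are interpreted as (cost, length) tuples, `append` as a function over such tuples, and one checks that each rewrite rule of `append` strictly decreases the cost component.

```python
def nil():
    return (0, 0)                       # [nil] = (cost 0, length 0)

def cons(tail):
    c, l = tail
    return (c, l + 1)                   # cons adds one to the length

def append(xs, ys):
    cx, lx = xs
    cy, ly = ys
    # Candidate interpretation: appending costs one step per element of xs,
    # plus one for the base case, and concatenates the lengths.
    return (cx + cy + lx + 1, lx + ly)

def cost(t):
    return t[0]

# Rule 1: append(nil, ys) -> ys, checked on a sample ys.
ys = cons(cons(nil()))
assert cost(append(nil(), ys)) > cost(ys)

# Rule 2: append(cons(x, xs), ys) -> cons(x, append(xs, ys)), on samples.
xs = cons(nil())
assert cost(append(cons(xs), ys)) > cost(cons(append(xs, ys)))
```

Because every rewrite step strictly decreases the interpreted cost, the cost component of the initial term bounds the number of rewrite steps, which is how such interpretations yield runtime complexity bounds.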

Tuesday, March 23rd 2021, 3:00pm to 4:00pm (CET)
Olivier Bournez (École Polytechnique). Recursion Schemes, Discrete Differential Equations and Characterization of Polynomial Time Computations.
We will talk about the expressive and computational power of discrete Ordinary
Differential Equations (ODEs), a.k.a. (Ordinary) Difference Equations. We present a new
framework using these equations as a central tool for computation and algorithm design.
We present the general theory of discrete ODEs for computation theory, illustrate it
with various examples of algorithms, and provide several implicit characterizations
of complexity and computability classes. The proposed framework presents an original
point of view on complexity and computation classes. It unifies several constructions
that have been proposed for characterizing these classes, including classical approaches
in implicit complexity using restricted recursion schemes, as well as recent
characterizations of computability and complexity by classes of continuous ordinary
differential equations. It also helps in understanding the relationships between
analog computations and classical discrete models of computation theory. From a more
technical point of view, this work points out the fundamental role of linear (discrete)
ODEs and of classical ODE tools such as changes of variables to capture computability
and complexity measures, or as a tool for programming many algorithms. This is joint
work with Arnaud Durand.
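A deliberately tiny instance of the idea of a discrete ODE used as a program (illustrative only, not the talk's framework): with the discrete derivative df(x) = f(x+1) - f(x), the linear discrete ODE df(x) = f(x) with f(0) = 1 defines f(x) = 2^x, and iterating the equation *is* the algorithm computing it.

```python
def solve_discrete_ode(df, f0, x):
    """Iterate f(t+1) = f(t) + df(t, f(t)) from the initial value f(0) = f0."""
    f = f0
    for t in range(x):
        f = f + df(t, f)
    return f

# df(x) = f(x), f(0) = 1  ==>  f(x) = 2**x
assert solve_discrete_ode(lambda t, f: f, 1, 10) == 2 ** 10

# df(x) = 2x + 1, f(0) = 0  ==>  f(x) = x**2
assert solve_discrete_ode(lambda t, f: 2 * t + 1, 0, 7) == 49
```

Restricting which discrete ODEs are allowed (for instance, to suitably linear ones) is what yields the implicit characterizations of complexity classes mentioned in the abstract.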