- 16 November 2017: Nguyễn Lê Thành Dũng (future LIPN)
Proof nets for MLL+Mix and algorithms on edge-coloured graphs
The Danos-Regnier correctness criterion leads to the following
graph-algorithmic problem: given a paired graph, find a cycle (or a
path) that goes through each pair at most once. This problem has been
treated in graph theory in the more general form of graphs equipped
with a colouring of their edges ("edge-colored graphs"); the solution
involves a reduction to perfect matchings and thus connects with
C. Retoré's work on "handsome proof nets".
Starting from this, one easily obtains, on the one hand, a linear-time
correctness criterion for MLL+Mix, and on the other hand, that the
correctness problem for MLL+Mix proof nets is probably harder than the
one for MLL without the Mix rule (which is NL-complete). This latter
result explains why the literature on correctness criteria for MLL+Mix
is thinner than that for MLL.
We will also see other consequences of these links between linear logic
and graph theory, in particular in connection with the dependency graph
of a proof net introduced by Bagnol, Doumane and Saurin.
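The graph problem from this abstract can be stated concretely. Below is a brute-force sketch in Python (the edge representation and names are illustrative, and the search is exponential; the actual result relies on a polynomial reduction to perfect matchings): given a directed graph whose edges are grouped into pairs, enumerate the paths from s to t that use at most one edge of each pair.

```python
def alternating_paths(edges, s, t):
    """Brute-force enumeration of paths from s to t in a 'paired graph':
    each edge is a triple (pair_id, u, v), and a valid path may use at
    most one edge from each pair. Exponential time; illustration only."""
    def go(v, used):
        if v == t and used:
            yield list(used)
            return
        for e in edges:
            pid, u, w = e
            if u == v and all(pid != p for (p, _, _) in used):
                used.append(e)
                yield from go(w, used)
                used.pop()
    return list(go(s, []))

# Two pairs: pair 0 offers 1->2 or 1->3, pair 1 offers 2->4 or 3->4.
edges = [(0, 1, 2), (0, 1, 3), (1, 2, 4), (1, 3, 4)]
paths = alternating_paths(edges, 1, 4)
```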
- 15 November 2017: Tobias Heindel (DIKU)
Formal language theory beyond trees and forests
The talk presents the theorems of Myhill-Nerode and Chomsky-Schützenberger, replacing trees and words by rewriting diagrams of semi-Thue systems, which are the paradigm example of planar acyclic circuit diagrams (PLACIDs)---the “graphical syntax” of monoidal categories. The talk will focus on the proposal of a definition of recognizable language in monoidal categories, namely sets of arrows that coincide with the inverse image of their direct image under a monoidal functor to a finite monoidal category. For the case of PLACIDs, this class of languages is shown to coincide with the languages of automata in the sense of Bossut, under modest assumptions on gates of circuit diagrams; moreover, the usual notion of recognizable tree language is recovered. The talk presents the Myhill-Nerode theorem as a tool for solving Bossut's open problem of automata complementation. The remainder of the talk describes work in progress and future work, in particular the Chomsky-Schützenberger theorem for PLACIDs.
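For orientation, the classical word case that the talk generalizes: the Myhill-Nerode equivalence relates two words when no extension distinguishes their membership in the language. A small Python sketch (bounds on word and extension lengths are imposed only to make the demo terminate; names are illustrative):

```python
from itertools import product

def nerode_classes(in_lang, alphabet, max_word, max_ext):
    """Group all words up to length max_word by their Myhill-Nerode 'row':
    the membership pattern of w + e over all extensions e up to length
    max_ext. For a regular language and large enough bounds, the number
    of classes equals the number of states of the minimal DFA."""
    words = [''.join(p) for n in range(max_word + 1)
             for p in product(alphabet, repeat=n)]
    exts = [''.join(p) for n in range(max_ext + 1)
            for p in product(alphabet, repeat=n)]
    classes = {}
    for w in words:
        row = tuple(in_lang(w + e) for e in exts)
        classes.setdefault(row, []).append(w)
    return list(classes.values())

# Example: words over {a,b} with an even number of 'a' -- two classes.
even_a = lambda w: w.count('a') % 2 == 0
cs = nerode_classes(even_a, 'ab', 3, 2)
```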
- 20 October 2017: Pierre Chambart (OCamlPro)
Inlining after closure conversion in OCaml
In high-level languages, inlining is usually performed on an intermediate language close to a lambda calculus, where, in the simple cases, it amounts to beta-reduction. For various reasons, in OCaml we decided to introduce another intermediate language in which closures are explicit (flambda), on which the high-level optimisations are performed. We will discuss the advantages and complications that come with this choice.
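To make the "inlining is beta-reduction in the simple cases" point concrete, here is a toy sketch in Python (not OCaml's flambda, just an illustrative term language): inlining a known, directly applied function amounts to substituting the argument into the body.

```python
# Toy lambda terms: ('var', x) | ('lam', x, body) | ('app', f, a) | ('lit', n)
def subst(term, x, value):
    """Capture-naive substitution of value for variable x (fine here
    because the example uses distinct variable names)."""
    tag = term[0]
    if tag == 'var':
        return value if term[1] == x else term
    if tag == 'lam':
        return term if term[1] == x else ('lam', term[1], subst(term[2], x, value))
    if tag == 'app':
        return ('app', subst(term[1], x, value), subst(term[2], x, value))
    return term  # literals are unchanged

def inline(term):
    """One inlining step: reduce ((lam x. body) arg) to body[arg/x]."""
    if term[0] == 'app' and term[1][0] == 'lam':
        _, (_, x, body), arg = term
        return subst(body, x, arg)
    return term

# ((lam x. f x) 3)  inlines to  (f 3)
call = ('app', ('lam', 'x', ('app', ('var', 'f'), ('var', 'x'))), ('lit', 3))
```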
- 13 October 2017: Marie Kerjean (IRIF)
Smooth models of Linear Logic and partial differential equations
Differential Linear Logic was constructed following a study of discrete vectorial models of Linear Logic. We want to extend the semantics of Linear Logic to its natural domain of continuous objects and analysis. Starting from the basic fact that Seely's isomorphism is the direct interpretation of the kernel theorem for distributions, we explain two developments.
On the one hand, we axiomatize a Smooth Differential Linear Logic with a graded syntax in which to each solvable linear partial differential equation one associates an exponential. We construct a model of this syntax in nuclear Fréchet and DF spaces.
On the other hand, we argue that the interpretation of the $\parr$ as Schwartz's epsilon product should be the cornerstone of the construction of a smooth classical model of DiLL. Starting from a first model of $k$-reflexive spaces, and building on pioneering work by Kriegl, Michor and Meise, we construct a variety of (at least two) new models of DiLL. This part is joint work with Y. Dabrowski.
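The "basic fact" alluded to in this abstract can be written out. Under the (hedged) reading where the exponential is interpreted by a space of distributions, Seely's isomorphism lines up with the Schwartz kernel theorem roughly as follows:

```latex
% Seely's isomorphism in linear logic:
%   !(A \with B) \;\cong\; !A \otimes !B
% Its analytic counterpart, the kernel theorem, stated here for
% compactly supported distributions (nuclear spaces, completed
% tensor product):
\mathcal{E}'(\mathbb{R}^n \times \mathbb{R}^m)
  \;\cong\;
\mathcal{E}'(\mathbb{R}^n) \,\hat{\otimes}\, \mathcal{E}'(\mathbb{R}^m)
```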
- 22 September 2017: Rodolphe Lepigre (LAMA / LSV)
Circular Proofs for Subtyping and Termination
In a recent (submitted) work with Christophe Raffalli, we designed a rich type
system for an extension of System F with subtyping. It includes primitive sums
and products, existential types, and (co-)inductive types. Using a combination
of (pointed) subtyping and circular proofs, we were able to express the system
with typing and subtyping rules that are syntax-directed (up to circularity).
During my talk, I will try to give you a flavour of the techniques we used. In
particular, I will show how choice operators can be used to get rid of typing
contexts, and to allow the commutation of quantifiers with other connectives.
I will then introduce the circular proof framework that is used for handling
inductive and co-inductive types in subtyping rules, and general recursion in
typing rules.
- 30 August 2017: Tarmo Uustalu (TTÜ Küberneetika Instituut)
COHERENCE FOR SKEW NEAR-SEMIRING CATEGORIES
We consider skew near-semiring categories, a relaxation of near-semiring
categories where the unitors, associators, annihilator and distributor
are not required to be natural isomorphisms, they are just natural
transformations in a particular direction. We prove a coherence theorem
for such categories. The theorem states that, in a free skew
near-semiring category over a set of objects, any two maps between an
object and an object in normal form are equal. Our main motivating
examples for skew near-semiring categories are from programming with
effects. While (relative) monads and lax monoidal functors are the same
as skew monoids in particular skew monoidal categories, (relative)
MonadPlus and Alternative instances are skew near-semiring objects. This
is joint work with Mauro Jaskelioff, Exequiel Rivas and Niccolò Veltri.
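To see the near-semiring shape in a familiar instance (a hedged illustration in Python rather than Haskell): for the list monad, mplus is concatenation, mzero is the empty list, and bind distributes over mplus on the left and annihilates mzero — while the symmetric right-hand laws need not hold.

```python
def bind(xs, f):
    """List-monad bind: concatMap."""
    return [y for x in xs for y in f(x)]

mzero = []          # additive unit of MonadPlus on lists
def mplus(xs, ys):  # additive operation: concatenation
    return xs + ys

f = lambda n: [n, n + 1]
xs, ys = [1, 2], [10]

# Left distributivity: (xs `mplus` ys) >>= f == (xs >>= f) `mplus` (ys >>= f)
left_dist = bind(mplus(xs, ys), f) == mplus(bind(xs, f), bind(ys, f))
# Left annihilation: mzero >>= f == mzero
annihilation = bind(mzero, f) == mzero
```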
- 7 July 2017: Niccolò Veltri (TTÜ Küberneetika Instituut)
PARTIALITY AND CONTAINER MONADS
We investigate monads of partiality in Martin-Löf type theory,
following Moggi's general monad-based method for modelling effectful
computations. These monads are often called lifting monads and appear in
category theory with different but related definitions. In this talk, we
unveil the relation between containers and lifting monads. We show that
the lifting monads usually employed in type theory can be specified in
terms of containers. Moreover, we give a precise characterization of
containers whose interpretations carry a lifting monad structure. We
show that these conditions are tightly connected with Rosolini's notion
of dominance. We provide several examples, putting particular emphasis
on Capretta's delay monad and its quotiented variant, the
non-termination monad.
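Capretta's delay monad has a direct, if hedged, rendering in Python (the type-theoretic version is coinductive; here Later carries an explicit thunk, and all names are illustrative):

```python
class Delay:
    """Capretta-style delay monad: a computation is either finished (Now)
    or postponed by one observable step (Later)."""
    pass

class Now(Delay):
    def __init__(self, value): self.value = value
    def bind(self, f): return f(self.value)

class Later(Delay):
    def __init__(self, thunk): self.thunk = thunk  # () -> Delay
    def bind(self, f): return Later(lambda: self.thunk().bind(f))

def run_for(d, fuel):
    """Force up to `fuel` Later steps; return the value or None.
    This models partiality: divergence is never observed, only 'not yet'."""
    while fuel > 0 and isinstance(d, Later):
        d, fuel = d.thunk(), fuel - 1
    return d.value if isinstance(d, Now) else None

two_steps = Later(lambda: Later(lambda: Now(42)))
```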
- 30 June 2017: Alice Pavot
INDUCTIVE AND FUNCTIONAL TYPES IN LUDICS
Ludics is a logical framework in which types/formulas are modelled by
sets of terms with the same computational behaviour. We investigate the
representation of inductive data types and functional types in ludics.
We study their structure following a game semantics approach. Inductive
types are interpreted as least fixed points, and we prove an internal
completeness result giving an explicit construction for such fixed
points. The interactive properties of the ludics interpretation of
inductive and functional types are then studied. In particular, we
identify which higher-order function types fail to satisfy type safety,
and we give a computational explanation.
- 23 June 2017: Damiano Mazza
A SHORT INTRODUCTION TO CATEGORICAL LOGIC
One sometimes hears that the lambda calculus is "the internal language
of cartesian closed categories". The "internal language" of a category
is a very general (though, to be honest, not entirely formal) notion
from categorical logic. In this talk, I will introduce the basic
concepts needed to understand this notion and explain how it relates
to denotational semantics.
- 16 June 2017: Hadrien Batmalle
BAR RECURSION, ANALYSIS AND CLASSICAL REALIZABILITY
Classical realizability is a theory born from the work of Jean-Louis
Krivine in the 1990s, at the border between logic and theoretical
computer science. On the computer science side, it offers a framework
suited to the computational interpretation of classical proofs. On the
logic side, it appears as a promising generalization of Cohen's
forcing. To a model of the lambda calculus one can thus associate a
model of ZF set theory.
Here we are interested in models of computation satisfying a continuity
property: these can be models based on the domains of denotational
semantics, or term models of an infinitary version of the lambda
calculus. In these models, the 'quote' instruction (usually used in
classical realizability to interpret the axiom of dependent choice) is
not available. We therefore use the technique of bar recursion to
interpret the axiom of dependent choice. Now (viewing classical
realizability as a generalization of forcing), it turns out that bar
recursion is moreover the analogue of the descending chain condition in
forcing algebras (which corresponds to a preservation property for the
reals). We then obtain that every formula of analysis true in the
ambient model is realized in these new models, which leads us to the
following question: what is the behaviour of the programs realizing
formulas of analysis?
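Spector's bar recursion, the technique used here to interpret dependent choice, can be sketched operationally (a hedged Python rendering with illustrative names; termination relies on Y being continuous, i.e. only ever inspecting a finite prefix of its argument):

```python
def bar_rec(Y, G, H, s=()):
    """Spector-style bar recursion. s is a finite sequence and s_hat its
    extension by zeros. If Y's value on s_hat is already secured
    (Y(s_hat) < len(s)), stop with the base case G; otherwise recurse
    through H, which may probe one-step extensions of s."""
    s_hat = lambda i: s[i] if i < len(s) else 0
    if Y(s_hat) < len(s):
        return G(s)
    return H(s, lambda x: bar_rec(Y, G, H, s + (x,)))

# A (trivially) continuous Y, a base case returning the sequence built
# so far, and an H that extends the sequence with its current length.
Y = lambda alpha: 2
G = lambda s: list(s)
H = lambda s, k: k(len(s))
result = bar_rec(Y, G, H)
```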
- 9 June 2017: Valentin Blot
An interpretation of system F through bar recursion
There are two possible computational interpretations of second-order
arithmetic: Girard's system F or Spector's bar recursion and its variants.
While the logic is the same, the programs obtained from these two
interpretations have a fundamentally different computational behavior
and their relationship is not well understood. We make a step towards a
comparison by defining the first translation of system F into a simply-typed
total language with a variant of bar recursion. This translation relies on a
realizability interpretation of second-order arithmetic. Due to Gödel's
incompleteness theorem there is no proof of termination of system F within
second-order arithmetic. However, for each individual term of system F there
is a proof in second-order arithmetic that it terminates, with its realizability
interpretation providing a bound on the number of reduction steps to reach a
normal form. Using this bound, we compute the normal form through primitive
recursion. Moreover, since the normalization proof of system F proceeds by
induction on typing derivations, the translation is compositional. The
flexibility of our method opens the possibility of getting a more direct
translation that will provide an alternative approach to the study of
polymorphism, namely through bar recursion.
- 19 May 2017: Jérémy Dubut
Directed homology theories for geometric models of true concurrency
Studying a system through its geometry is the main purpose of directed algebraic topology. This topic emerged in computer science, more particularly in true concurrency, where Pratt introduced his higher dimensional automata (HDA) in 1991 (actually, the idea of a geometry of concurrency can be traced back to Dijkstra in 1965). Those automata are geometric by nature: every set of n processes executing independent actions can be modeled by an n-cube, and such an automaton then gives rise to a topological space, obtained by gluing such cubes together, with a specific direction of time coming from the execution flow. It then seems natural to use tools from algebraic topology to study those spaces: paths model executions, homotopies of paths, that is continuous deformations of paths, model equivalence of executions modulo scheduling of independent actions, and so on, but all those notions must preserve the direction somehow. This brings many complications and the theory must be redone, essentially from scratch.
In this talk, after developing this idea of a geometry of true concurrency, I will focus on homology. Homology is a nice computable tool from algebraic topology, and it is a challenge in directed algebraic topology to find a satisfactory analogue that behaves well with direction. I will present our candidate 'directed homology', called natural (or bimodule) homology. This object consists of a functor with values in modules, which records the classical homology of trace spaces (a nice abstraction of what we may call the 'space of executions') and how those homologies evolve with time. This evolution can be studied through an abstract notion of bisimulation of functors with values in modules, which has many equivalent characterizations (using relations, using lifting properties, using the Grothendieck construction, ...) and whose existence is decidable in simple cases. Finally, among the nice properties of our directed homology, I will show that it is computable on simple spaces, which are already enough to model simple truly concurrent systems.
Joint work with Eric Goubault and Jean Goubault-Larrecq.
- 12 May 2017: Flavien Breuvart
Monads and comonads (continued)
In this second session of the working group on monads and comonads, I will present graded monads and comonads. I will insist, in particular, on my view of these objects as a potential way to make abstract interpretation and typing interact in functional languages.
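As a minimal illustration of grading (a hedged Python sketch, not tied to the talk's own examples): a writer-style monad graded by a monoid, where return has the unit grade and bind combines grades with the monoid operation, so the grade tracks a quantitative effect.

```python
class Graded:
    """Writer monad graded by the additive monoid (int, +, 0):
    the grade counts, say, how many ticks a computation performs."""
    def __init__(self, value, grade=0):
        self.value, self.grade = value, grade

    def bind(self, f):
        # Grades compose by the monoid operation (here: addition).
        out = f(self.value)
        return Graded(out.value, self.grade + out.grade)

def tick(x):
    """An effectful step carrying grade 1."""
    return Graded(x, 1)

# Two ticks and one pure step: the final grade records both ticks.
r = Graded(3).bind(tick).bind(lambda n: Graded(n * 2)).bind(tick)
```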
- 5 May 2017: Thomas Seiller
Why complexity theorists should care about philosophy
Theoretical computer science was born almost a hundred years ago, when logicians asked themselves the question: "What is a computable function?". This question, purely theoretical, was answered before the first computer was designed, in the form of the Church-Turing thesis: a computable function is one that can be defined in one of the following equivalent models: recursive functions, Turing machines, or the lambda calculus. The appearance of actual computing devices, however, made it clear from the start that another question made more sense for practical purposes, namely: "What is an *efficiently* computable function?". This question was tackled by three different works in the span of a single year, marking the birth of computational complexity. Nowadays, computational complexity is an established field: many methods and results have been obtained, and the number of complexity classes grows every year. However, a number of basic open problems remain unanswered, in particular concerning the classification of complexity classes. Worse, a number of results, called barriers, show that no known method will succeed in producing a new separation result, i.e. in showing that two classes (e.g. P and NP, or L and P) are distinct. From a purely theoretical point of view, this lack of methods might be explained by a historic tradition of viewing programs as functions. Once this misconception is identified, it points to a lack of adequate foundations for the theory of computation. Fortunately, some recent technical developments may provide a solution to this problem.
- 31 March 2017: Jean-Yves Moyen
- 24 March 2017: Christophe Fouqueré
Transcendental syntax: second reading.
Continuation of the first reading: in his three articles on "transcendental syntax", J.-Y. Girard approaches logic from an angle that is at once philosophico-logical and technical, in order to go beyond what has been done so far: not only propositional linear logic but also the treatment of equality, and hence of first order. Modestly, I will begin with the technical part, which takes up and extends the proof-net model, by presenting the representation of MALL and of the exponentials.
- 17 March 2017: Micaela Mayero
Proofs, yes, but formal ones!
- 10 March 2017: Arnaud Spiwack
Retrofitting linear types
Type systems based on linear logic and its relatives have proved capable of expressing a wealth of interesting abstractions, among them the ability to mix garbage-collected and explicitly managed data in the same language. This is of prime interest for distributed computations that need to reduce the latency induced by GC pauses: a smaller GC heap is a happier GC heap. I had always believed that adding linear types to a language was a rather intrusive procedure, and that a language with linear types would basically be an entirely new language. The Rust language is a case in point: it has a linear-like type system, but it is a very different language from your run-of-the-mill functional language. This turns out not to be the case: this talk presents a way to retrofit linear types onto an existing functional programming language. We are targeting Haskell, but there is little specific to Haskell in this presentation. I will focus mostly on the type system and how it can be generalised: among other things, the type system extends to dependent types.
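The linear discipline can be approximated dynamically to give the flavour (a hedged Python sketch; real linear types enforce this statically, and none of these names come from the talk or from GHC):

```python
class LinearHandle:
    """A resource that must be consumed at most once. A static linear
    type system would reject the double use below at compile time;
    here it is only a runtime check."""
    def __init__(self, resource):
        self._resource, self._consumed = resource, False

    def consume(self):
        if self._consumed:
            raise RuntimeError("linear value used twice")
        self._consumed = True
        return self._resource

h = LinearHandle("file descriptor")
payload = h.consume()          # first use: fine
try:
    h.consume()                # second use: rejected
    double_use_rejected = False
except RuntimeError:
    double_use_rejected = True
```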
- 3 March 2017: Luc Pellissier
Introduction to geometric complexity theory, after K. Mulmuley
Geometric complexity theory is a research programme led by Ketan Mulmuley, which aims to solve complexity questions after translating them into inclusions of algebraic varieties representing symmetry groups. After presenting the theory at a very high level, we will present a separation result obtained in this way (between P and an ad hoc class).