## Upcoming session: June 15th, 2021

Calendar of all sessions

**TBD**

## Just another seminar?

The Junior Seminar (« séminaire jeunes chercheurs ») allows
**Inria PhD students / interns / post-docs to present their
work**. The talks are meant to be easily understandable, so
that **anyone can attend**.

The Junior Seminar is a good occasion to discover what the other teams are up to: if you have ever wondered what is happening in other Inria teams, this seminar is made for you.

## Where? When?

The seminar is usually scheduled every third Tuesday of the month, with a break during the summer. The seminar takes place in the Salle Jacques-Louis Lions in building C, usually at 2pm. Watch out for the posters!

## Format

The seminar is held in English to make sure everyone can attend. There are two 25-minute talks, each followed by a 5-minute question session. Coffee is provided, as well as snacks.

## I want to speak! I have a student who should speak!

Send us an email at semdoc@inria.fr, and we'll make sure you can give a talk :).

## Who are we?

**Currently in charge of the seminar:**
Denis Merigoux +
Éloïse Berthier

**Formerly in charge of the seminar:**
Alix Chagué +
Armaël Guéneau +
Marie Puren +
Léa Boittin +
Patrik Daniel +
Matteo Aletti +
Georgios Bouloukakis +
Mathieu Feuillet +
Emanuele Leoncini +
Jonathan Protzenko +
Elisa Schenone +
Pauline Traynard +
Jacques-Henri Jourdan

We're always looking for help, so if you want to help us find new victims and organize rehearsals, do contact us!

Here are the archives of the Junior Seminar.

## May 18th, 2021

**Antoine Yang** (Willow)

Recent methods for visual question answering rely on large-scale annotated datasets. Manual annotation of questions and answers for videos, however, is tedious, expensive and prevents scalability. In this work, we propose to avoid manual annotation and generate a large-scale training dataset for video question answering making use of automatic cross-modal supervision. We leverage a question generation transformer trained on text data and use it to generate question-answer pairs from transcribed video narrations. Given narrated videos, we then automatically generate the HowToVQA69M dataset with 69M video-question-answer triplets. We introduce the zero-shot VideoQA task and show that a VideoQA model trained on HowToVQA69M achieves excellent results, in particular for rare answers. Furthermore, we demonstrate that our method significantly outperforms the state of the art on four video question answering benchmarks.

## April 20th, 2021

**Daniel Coggia** (Cosmiq)

(no slides ☹)

Cryptographers develop and analyse algorithms providing confidentiality and integrity in the information era. The trust we have in those algorithms relies on the constant work of cryptographers trying to break them. Using symmetric key encryption as an example, I will explain what security means and how cryptographers think. I will also give hints on how their mindset applies to public key cryptosystems.

**Gaetan Vignoud** (Mamba)

(no slides ☹)

In neuroscience, learning and memory are usually associated with long-term changes of neuronal connectivity. Synaptic plasticity refers to the set of mechanisms driving the dynamics of neuronal connections, called synapses and represented by a scalar value, the synaptic weight. Such principles of flexible connectivity have been the basis of the development of deep neural networks, and the brain remains a great source of inspiration for data scientists. Spike-Timing Dependent Plasticity (STDP) is a biologically-based model representing the time evolution of the synaptic weight as a functional of the past spiking activity of adjacent neurons. We present a new, general, mathematical framework to study synaptic plasticity associated with different STDP rules. The system composed of two neurons connected by a single synapse is investigated, and a stochastic process describing its dynamical behavior is presented and analyzed. We show that a large number of STDP rules from neuroscience and physics can be represented by this formalism. Several aspects of these models are discussed and compared to canonical models of computational neuroscience and the classical Hebbian learning used in deep learning.
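As an illustration, the classical pair-based STDP rule from the literature updates the synaptic weight as a function of the time difference between pre- and post-synaptic spikes. The sketch below is a standard textbook formulation with illustrative parameter values, not necessarily the exact rules studied in the talk:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: weight change for one spike pair with
    dt = t_post - t_pre (in ms). Parameter values are illustrative."""
    if dt > 0:        # pre fires before post: potentiation
        return a_plus * math.exp(-dt / tau)
    if dt < 0:        # post fires before pre: depression
        return -a_minus * math.exp(dt / tau)
    return 0.0

def total_dw(dts):
    """Accumulate weight changes over a list of spike-time differences."""
    return sum(stdp_dw(dt) for dt in dts)
```

With `a_minus > a_plus`, symmetric spike pairs produce a small net depression, one of the stabilizing effects often discussed for such rules.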

## March 16th, 2021

**Rachid Riad** (Coml)

Deep learning models have become potential candidates for auditory neuroscience research, thanks to their recent successes on a variety of auditory tasks. Yet, these models often lack interpretability to fully understand the exact computations that have been performed. Here, we proposed a parametrized neural network layer that computes specific spectro-temporal modulations based on Gabor kernels (Learnable STRFs) and that is fully interpretable. We evaluated the predictive capabilities of this layer on Speech Activity Detection, Speaker Verification, Urban Sound Classification and Zebra Finch Call Type Classification. We found that models based on this learnable parametrized layer are on par with the different toplines for all tasks, and obtain the best performance for Speech Activity Detection. As this layer is fully interpretable, we used quantitative measures to describe the distribution of the learned spectro-temporal modulations. The filters adapted to each task and focused mostly on low temporal and spectral modulations. The analyses show that the filters learned on human speech have spectro-temporal parameters similar to those measured directly in the human auditory cortex. Finally, equipped with the Sinkhorn distance to compare the learned STRF distributions, we observed that the tasks organized themselves in a meaningful way: the human-vocalization tasks were close to each other, while the bird task was far from both the human vocalizations and the urban sounds.

## February 16th, 2021

**Ulysse Marteau** (Sierra)

Supervised learning aims to create a predictor of a certain phenomenon (the weather given meteorological data, the identity of a person given an image…) based on training data, such that it performs well on new (unseen) examples. In this talk, I will present the field of supervised learning in greater detail. I will try to show three main aspects of the field: 1) the modelling aspect, i.e. what predictors we consider for a given problem; 2) the statistical aspect, i.e. how the predictor we compute with an algorithm behaves given a certain number of examples; and finally 3) the optimisation aspect, i.e. how a computer effectively computes a predictor. If time permits, I will shed light on my thesis through these three components.
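All three aspects already appear in the simplest instance, one-dimensional least-squares regression: the model is a line, the statistics is how the fit behaves as samples accumulate, and the optimisation step here happens to have a closed form. A minimal sketch (illustrative, not from the talk):

```python
def fit_line(xs, ys):
    """Ordinary least squares for the model y ≈ a*x + b.
    The optimisation problem min_{a,b} sum (y_i - a x_i - b)^2
    is solved in closed form via the normal equations."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b
```

For noisier data or richer models the closed form disappears and iterative optimisation (gradient methods) takes over, which is where the third aspect becomes a field of its own.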

## January 19th, 2021

**Benjamin Lipp** (Prosecco)

Most of the important cryptographic protocols that are used for secure communication over the Internet are the result of large collaborations that led to the publication of an Internet standard. This talk will be about a new standard for a cryptographic building block called Hybrid Public Key Encryption (HPKE). I have been involved in its development for the past year. HPKE is used for a privacy feature in TLS, a protocol you might know for providing security on the web within the https protocol. HPKE will also be used by Messaging Layer Security (MLS), a new Internet standard for secure group messaging. Hybrid Public Key Encryption as such is a very old idea, and combines the strengths of asymmetric and symmetric encryption. In this talk, I will give an intuition of security properties we proved for this new formulation of an HPKE standard, and will lay out how we used the CryptoVerif proof assistant to write cryptographic proofs for them. I will also present an intrinsic security issue of this construction. At the end of the talk, I will give an outlook on my ongoing work about formally linking a mechanized cryptographic proof to a formally verified implementation.
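The hybrid idea itself can be sketched in a few lines: a key encapsulation mechanism (KEM) establishes a shared symmetric key, which then encrypts the payload (the DEM). The following is a toy illustration only, not the HPKE standard: the Diffie-Hellman group is far too small to be secure, the "stream cipher" is an ad-hoc hash construction, and there is no authentication (real HPKE uses an AEAD and a proper KDF).

```python
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; toy-sized, NOT secure for real use
G = 3

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)

def keystream(key, length):
    """Ad-hoc hash-counter keystream, for illustration only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def seal(pk_recipient, msg):
    """KEM: ephemeral DH encapsulates a key; DEM: XOR with a keystream."""
    e = secrets.randbelow(P - 2) + 1
    enc = pow(G, e, P)                       # encapsulation, sent in clear
    shared = pow(pk_recipient, e, P)
    key = hashlib.sha256(shared.to_bytes(16, "big")).digest()
    ct = bytes(m ^ k for m, k in zip(msg, keystream(key, len(msg))))
    return enc, ct

def unseal(sk, enc, ct):
    shared = pow(enc, sk, P)
    key = hashlib.sha256(shared.to_bytes(16, "big")).digest()
    return bytes(c ^ k for c, k in zip(ct, keystream(key, len(ct))))
```

The asymmetric part is used once per message to transport a key; everything else is fast symmetric work, which is exactly the strength-combination the abstract refers to.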

## December 15th, 2020

**Olga Gorynina** (Matherials)

In many fields of material science and civil engineering one has to solve boundary value problems in composite materials. However, computing the discretized problems directly is a very challenging issue. One possible solution is to replace the heterogeneous material problem with a problem for a homogeneous material that behaves similarly. In mathematics, the classical theory that allows one to extract «average» properties of very complicated media is called homogenization. The numerical problems that come from this theory may be very expensive and difficult to solve. This is why it is necessary to develop alternative efficient numerical approaches for the evaluation of the homogenized properties of heterogeneous materials. In this talk, we present a mathematical formalization of the approach originally introduced in [Cottereau, R. (2013)], along with various improvements of the algorithms, in order to obtain a procedure that is as efficient as possible.
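In one dimension the homogenized coefficient has a closed form that makes the "average properties" idea concrete: for -d/dx(a(x/ε) du/dx) = f with a periodic coefficient a, the effective coefficient is the *harmonic* mean of a over one period, not its arithmetic mean. A small sketch of this textbook fact (illustrative, not the method of the talk):

```python
def harmonic_mean(a_values):
    """Effective 1D coefficient: harmonic mean of a over one period."""
    n = len(a_values)
    return n / sum(1.0 / a for a in a_values)

def arithmetic_mean(a_values):
    """The naive average, which overestimates the effective coefficient."""
    return sum(a_values) / len(a_values)
```

For a medium alternating between conductivities 1 and 4, the effective value is 1.6, noticeably below the naive average of 2.5; the soft phase dominates because fluxes pass through both phases in series.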

## Feb 18th, 2020

**Denis Merigoux** (PROSECCO)

Tax codes all around the world have to specify how much individuals and companies are taxed depending on various characteristics: revenues, benefits, family situation, etc. Because of that, the text of the law contains an algorithm dealing with this computation, which is usually implemented in some form or another by tax agencies to compute automatically the amount of taxes owed. But how can we be sure that the implementation of the algorithm is correct with respect to the legislative text? How can we make this implementation accessible to everybody in a way that allows interoperation with all the software that relies on tax computations? Some answers to these questions can be found in the domain of formal methods, an established research community that has been securing real-world critical devices for decades, ranging from planes to cryptography. More specifically, the Verifisc project relies on the open-source implementation of the French tax code released by the Direction Générale des Finances Publiques to already provide usable software artifacts from a human-checked implementation of the law. We will also explore how programming language design can make the task of translating the law into an algorithm easier to carry out and check for correctness.
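To make "the text of the law contains an algorithm" concrete, here is a toy progressive tax schedule written as code; the brackets and rates are purely illustrative, not the French tax schedule. Once the law is code, properties such as monotonicity (earning more never lowers the tax owed) can be checked mechanically:

```python
def bracket_tax(income, brackets):
    """Progressive tax: brackets is a sorted list of (threshold, rate);
    each rate applies to the slice of income between its threshold and
    the next one. Illustrative only, not any real tax code."""
    tax = 0.0
    for i, (lo, rate) in enumerate(brackets):
        hi = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if income > lo:
            tax += (min(income, hi) - lo) * rate
    return tax
```

A formal-methods toolchain goes further than spot checks like these: it proves such properties for *all* inputs, directly against an implementation derived from the legislative text.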

**William Margheriti** (MATHRISK)

The Martingale Optimal Transport (MOT) problem was introduced recently and finds its interest in mathematical finance, where its resolution yields model-free price bounds (i.e. bounds that do not depend on a particular model such as Black-Scholes) for options for which the marginal distributions of their underlying assets are known. When those marginal distributions are finitely supported, the MOT problem amounts to linear programming, which we know how to solve. We then approximate those marginal distributions by finitely supported probability measures. On the one hand, those approximations have to preserve the existence of a martingale coupling, which is equivalent to the approximations being in the convex order according to the celebrated Strassen theorem. On the other hand, the approximated solutions must converge to the solution of the original problem.
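For finitely supported marginals, the linear program in question can be written down explicitly; the following is the standard discrete formulation (sketched here, with notation chosen for this note): with marginals μ = Σᵢ μᵢ δ_{xᵢ} and ν = Σⱼ νⱼ δ_{yⱼ},

```latex
\min_{\pi_{ij} \ge 0} \;\sum_{i,j} c(x_i, y_j)\,\pi_{ij}
\quad\text{subject to}\quad
\sum_j \pi_{ij} = \mu_i,\qquad
\sum_i \pi_{ij} = \nu_j,\qquad
\sum_j \pi_{ij}\,(y_j - x_i) = 0 \;\;\text{for all } i,
```

where the last family of constraints is the martingale condition E[Y | X = xᵢ] = xᵢ; without it, this is the classical discrete optimal transport linear program.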

## May 5th, 2019

**Marc Chevalier** (ANTIQUE)

(no slides ☹)

Software failures can be very grievous, causing deaths, losing money, etc. In such critical fields, it is important to prove that the code is correct. More precisely, we want to prove that it does not crash, it does what we expect, it does not leak sensitive information... Formal methods allow us to write algorithms that can prove semantic properties. We will focus on abstract interpretation, which is a method that allows almost-automatic proofs and can be used on existing source code. We will see how abstract interpretation works, why it is awesome, but also why it isn't almighty.
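A minimal example of abstract interpretation is the interval domain: every variable is approximated by a range, and arithmetic is evaluated on ranges, so that e.g. the absence of division by zero can be proved without ever running the program. A hedged sketch (the real analyzers add widening, many more domains, and soundness proofs):

```python
class Interval:
    """Abstract value: all reals in [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return Interval(min(corners), max(corners))

    def cannot_be_zero(self):
        return self.lo > 0 or self.hi < 0

# Prove that y is nonzero for every x in [1, 5], so x / y is safe:
x = Interval(1, 5)
y = x * x + Interval(1, 1)   # y over-approximated by [2, 26]
```

This also shows why the method "isn't almighty": for an interval that straddles zero, `x * x` yields a range containing negative values even though the true square never is, a typical loss of precision of the abstraction.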

**Louis Martin** (ALMANACH)

(no slides ☹)

Text simplification has received increasing interest from the scientific community in recent years. This task aims at producing a simpler version of a source text that is both easier to read and to understand, thus improving the accessibility of text for people suffering from a range of disabilities such as aphasia or dyslexia, as well as for second language learners and people with low literacy. State-of-the-art text simplification methods mostly use deep learning models and approach text simplification as a machine translation task. However, some challenges remain open, such as the lack of large supervised datasets and the question of how to evaluate a text simplification system. My work focuses on how to evaluate text simplification systems automatically and how to circumvent the lack of good quality data through self-supervision and controllable text simplification.

## March 19th, 2019

**Sudarshan Shinde** (OURAGAN)

(no slides ☹)

Integer factorization is a difficult problem and is widely used in cryptography. Naturally, from the point of view of cryptanalysis, it is important to consider the methods which allow us to factor integers. The elliptic curve method (ECM) is one such method which is widely used today. It was proposed in 1985 by Lenstra and improved later by Montgomery. The algorithm succeeds if the elliptic curves it uses have desirable smoothness properties. It is known that some curves are better than others when it comes to their performance in ECM. This raises the question of classifying these curves. In this joint work with Razvan Barbulescu, we present two approaches to find these ECM-friendly elliptic curves. Combined with other recent works, we obtain a complete list of 1525 such families of curves.

**Roberto Medina** (KOPERNIC)

(no slides ☹)

Hard real-time systems are found in many diverse areas including autonomous vehicles, avionics, space systems, and robotics. These systems are characterized by their stringent time requirements (programs must respond within a specific time constraint) and their complexity (programs have different criticality levels, designating the level of assurance needed against failure). Nowadays, hard real-time systems are confronted with technological progress resulting in rapid increases in both software complexity and processing demands. To address these requirements, the adoption of multicore architectures is a necessity. My work focuses on developing scheduling methods for hard real-time systems where data-dependent programs of different criticalities share a single multicore architecture. In this talk, we will present a generic scheduling algorithm that is efficient both in terms of resource usage (using as few cores as possible) and in terms of schedulability (scheduling as many programs as possible).

## February 19th, 2019

**Matías Bender** (POLSYS)

(no slides ☹)

Solving polynomial systems is one of the oldest and most important problems in computational mathematics and has many applications in several domains of science and engineering. It is an intrinsically hard problem with complexity at least single exponential in the number of variables. We can solve these systems by computing Groebner bases for the ideals generated by the defining polynomials. My work focuses on exploiting the structure of these polynomials to compute Groebner bases faster than the worst-case estimates. In this talk, we will review the theory of Groebner bases and some classical algorithms to compute them. Later, we will discuss a strategy to speed up the computations by taking advantage of the sparsity of the input polynomials, that is, the fact that only a few of their monomials have non-zero coefficients.
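A tiny worked example, a standard textbook computation rather than one from the talk: for the system {x² + y² − 1 = 0, x − y = 0} with the lexicographic order x > y, Buchberger's algorithm yields the Groebner basis

```latex
G = \{\, x - y,\; 2y^2 - 1 \,\}.
```

The basis is triangular in the variables, so the solutions can be read off directly: 2y² = 1 gives y = ±1/√2, and then x = y. This elimination behaviour is what makes Groebner bases a general-purpose solving tool.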

**Darius Mercadier** (WHISPER)

(no slides ☹)

Bitslicing is a standard technique to improve the performance of certain cryptographic algorithms by exploiting data parallelism while making them, de facto, resilient to cache-timing attacks. To do so, bitsliced implementations turn lookup tables (such as S-boxes) into boolean functions, leading to a significant code blow-up and making it hard to write, debug and optimize manually. Usuba is a synchronous dataflow programming language we designed (based on an original idea from X. Leroy) to both specify and implement bitsliced algorithms. Usuba programs compile into bitsliced and optimized C code, exploiting platform-specific SIMD extensions such as Intel's SSE and AVX, ARM's Neon or IBM's AltiVec.
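The principle can be shown on the 3-bit χ (chi) mapping used in Keccak: instead of a table lookup, each output bit is a boolean function of the input bits, and packing 64 independent inputs into three 64-bit words evaluates the function 64 times at once with no data-dependent memory access. A hedged Python sketch (Usuba itself targets C with SIMD; this only illustrates the transformation):

```python
import random

MASK = (1 << 64) - 1

def chi_scalar(x):
    """3-bit chi: y_i = x_i XOR (NOT x_{i+1} AND x_{i+2}), indices mod 3."""
    b = [(x >> i) & 1 for i in range(3)]
    y = [b[i] ^ ((b[(i + 1) % 3] ^ 1) & b[(i + 2) % 3]) for i in range(3)]
    return y[0] | (y[1] << 1) | (y[2] << 2)

def chi_bitsliced(X):
    """X[j] holds bit j of 64 independent inputs, one input per bit slot."""
    return [X[i] ^ (~X[(i + 1) % 3] & MASK & X[(i + 2) % 3]) for i in range(3)]

def pack(values):
    """Transpose 64 small integers into three 64-bit 'bit-plane' words."""
    X = [0, 0, 0]
    for pos, v in enumerate(values):
        for j in range(3):
            X[j] |= ((v >> j) & 1) << pos
    return X

def unpack(X, count):
    """Inverse transposition back to a list of small integers."""
    return [sum(((X[j] >> pos) & 1) << j for j in range(3))
            for pos in range(count)]
```

The transposition (`pack`/`unpack`) is the cost of bitslicing; the payoff is that the core computation is a handful of word-wide boolean operations, exactly what Usuba emits as C with SIMD intrinsics.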

## December 18th, 2018

**Théo Zimmermann** (PIR2)

(no slides ☹)

Any significant software project requires a bug tracker. In open source development, this plays an even bigger role. The impact of the bug reporting environment on the bug tracking activity is difficult to assess. In the work I will present, we have taken advantage of the switch from Bugzilla to GitHub of the bug tracker of Coq, a medium-sized open source project, to evaluate the impact that such a change can have. I will first present the switch itself, in particular the migration of 4900 preexisting bug reports using the GitHub issue import API; then the analysis of the data from before and after the migration. We have shown that the switch induces an increase in bug reporting, particularly from principal developers themselves, and more generally an increased engagement with the bug tracking platform, with more comments by developers and also more external commentators.

**Kenji Maillard** (PROSECCO)

(no slides ☹)

Verifying a program consists of proving that the program meets its specification. Various frameworks have been studied to provide such specifications; for instance, in Hoare logic programs are specified with pairs of pre/post-conditions. When dealing with programs mixing a variety of side effects, such as mutability, divergence, raising exceptions, or non-determinism, providing a framework for specifying those programs becomes challenging. Computational monads are a convenient algebraic gadget to uniformly represent a wide class of such side effects in programming languages. Inspired by Dijkstra's work, we want to use predicate transformers to verify monadic programs. We will explain how these predicate transformers can themselves be uniformly represented as a monad, leading to a powerful verification tool called Dijkstra monads, used heavily in the F* programming language.

## November 11th, 2018

**Mathilde Regnault** (ALMANACH)

(no slides ☹)

Many texts in Old French (9th-13th centuries) and Middle French (14th-15th centuries) are available, but not all are annotated with syntactic information. We want to automatically extend an existing annotated corpus, which implies dealing with both diachronic and synchronic variation (i.e. through time and at a particular period). I will present solutions for part-of-speech tagging and syntactic analysis, especially metagrammars, a symbolic parsing system based on a modular description of language.

**Giulia Terenzi** (Paris Est - Università di Roma Tor Vergata)

(no slides ☹)

Why is mathematics so useful in finance? Essentially, to make good decisions in the face of uncertainty. In this talk we will speak about the financial problem of finding the fair price of an option. In order to do this, we need a model for the financial market. We will mention the classical approach by Black and Scholes and, if time permits, some generalizations such as stochastic volatility models.
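The Black-Scholes price of a European call option has a well-known closed form; here is a minimal standard-library implementation (an illustration of the classical model, not material from the talk):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call: spot S, strike K,
    risk-free rate r, volatility sigma, maturity T (years)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)
```

For an at-the-money call with S = K = 100, r = 0, σ = 0.2 and T = 1, the formula gives roughly 7.97; stochastic volatility models generalize exactly this setup by letting σ itself be random.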

## October 16th, 2018

**Mohamed Khemakhem** (ALMANACH)

GROBID-Dictionaries is the first machine learning infrastructure for structuring digitised dictionaries into TEI-compliant resources. The system’s architecture relies on a cascading approach for information extraction from textual information in PDF documents. The pluggable models implemented in GROBID-Dictionaries have shown enough flexibility to be applicable to a wide range of entry-based documents containing lexical or encyclopaedic information, from dictionaries to address directories. Usability has also been enhanced to ease the setup and the training of the system. The tool is still under development, and feedback from new users with new samples is very welcome.

**Xavier Bonnetain** (SECRET)

(no slides ☹)

Cryptography aims at proposing systems that enable secure communications. The only way to ensure that a system is safe is a continuous and meticulous study of the attacks against it. The possible attack models are diverse, with different capabilities and objectives for an attacker. Recently, a specific model has received a lot of attention: quantum attacks, where an attacker has access to a quantum computer. This kind of computer is extremely efficient at solving some specific problems and breaks multiple cryptographic constructions, for which we need a replacement. The search for suitable replacements is a very active area, as the NIST has issued a call for new quantum-resistant schemes to be standardized. Many candidates have been proposed, and are currently studied by the community. In this talk, we will present CSIDH, a recent key-exchange system that aims to be quantum-resistant, and the best known quantum key-recovery attack against it.

## Jun 19th, 2018

**André Schrottenloher** (SECRET)

Upcoming quantum computers are now widely acknowledged as a major threat to asymmetric (public-key) cryptography. The potential breakdown of RSA and ECC, respectively based on the classical intractability of integer factoring and discrete logarithms, using Shor's algorithm, has motivated the search for new quantum-secure algorithms. On the other side, symmetric cryptography was for a long time believed to be out of risk: while ciphers can be subject to Grover's algorithm, which accelerates exhaustive search quadratically, countering Grover seems as easy as doubling their key size, e.g. moving from AES-128 to AES-256. Moreover, the question of the future practicality of some quantum algorithms has been raised, due to massive quantum memory usage and / or little speedup. In this context, we study collision search. Collision resistance is one of the requirements of a good hash function, to be used e.g. in digital signature schemes (but not only): it should be difficult to find two elements of equal hash. Classically, the optimal collision security is related to the length of the hashes by the so-called birthday paradox. Quantumly, collision search is one of these situations where the line between 'practical' and 'impractical' quantum algorithms becomes fuzzy, and until now, no algorithm with a 'reasonable' quantum memory had been able to beat the classical time for collision-finding. In this talk, we show that even a 'small' quantum computer can accelerate collision search. There are many implications of this speed-up in symmetric cryptanalysis. For instance, we can show that not only the key size, but also the state size of primitives should be increased in order to ensure an equivalent post-quantum security.
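The classical birthday bound can be demonstrated directly: on a b-bit hash, storing digests of successive messages yields a collision after roughly 2^(b/2) trials rather than 2^b. A toy sketch using a truncated SHA-256 (illustrative only; the quantum algorithms of the talk aim below this classical bound):

```python
import hashlib

def truncated_hash(data, bits):
    """Top `bits` bits of SHA-256, to emulate a short hash."""
    h = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return h >> (256 - bits)

def find_collision(bits):
    """Birthday-paradox collision search: remember every digest seen;
    by the pigeonhole principle a repeat occurs within 2^bits + 1 trials,
    and in expectation after only about 2^(bits/2)."""
    seen = {}
    i = 0
    while True:
        m = i.to_bytes(8, "big")
        d = truncated_hash(m, bits)
        if d in seen:
            return seen[d], m
        seen[d] = m
        i += 1
```

For `bits = 16` a collision typically shows up after a few hundred messages, which is why hash lengths are chosen so that 2^(b/2), not 2^b, is out of reach.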

**Pierre-Loïk Rothe** (MATHERIALS)

Multiscale materials are ubiquitous in many applications (think e.g. of composite materials used in the aerospace industry). Simulating the response of these materials often leads to studying elliptic equations in divergence form of the type -div(A grad(u))=f, where A represents the medium properties (for example electric conductivity, heat conductivity or elasticity tensor) and f is a given loading. For materials with complex micro-structure, the matrix A is highly oscillatory. Standard methods of resolution (such as finite element methods) are too costly to use in this case. Indeed, to reach an acceptable accuracy, one would have to use a very fine mesh, at least to be able to represent the oscillations of the medium properties. To circumvent this issue, multi-scale techniques (such as the Multiscale Finite Element Method, MsFEM) are often used. The computations are performed in a two-stage procedure. First, during an offline stage, appropriate basis functions are computed as solutions to local fine-scale problems (using a mesh finer than the oscillations). Second, during an online stage, this information is passed to the macro scale to solve the problem of interest on a coarse grid. In this talk, I will recall the MsFEM approach as introduced by Hou et al. in 1998. Next, I will present a recent variant, namely the MsFEM method enriched with Legendre polynomials. Motivation, error estimates and numerical results will be discussed. This talk is based on joint works with U. Hetmaniuk (University of Washington), C. Le Bris (ENPC and MATHERIALS) and F. Legoll (ENPC and MATHERIALS).

## May 15th, 2018

## April 17th, 2018

**Loic Grobol** (ALMAnaCH)

Coreference resolution consists in identifying natural language expressions that refer to the same conceptual entity. In recent years, new methods relying on deep learning techniques to improve the performance of classical heuristics have revived interest in a task that was until then seen as having reached a performance plateau. In this talk, I will describe these methods and try to explain how they can outperform the former state-of-the-art models, but also why they will not suffice to solve this problem in the general case.

**Benjamin Beurdouche** (PROSECCO)

(no slides ☹)

Cryptography and security protocols are critical parts of the Internet ecosystem. Any flaw in cryptographic applications can have huge consequences for users. Unfortunately, many bugs have been discovered in implementations. This presentation will detail the main security properties one should prove about cryptographic code and briefly describe HACL*: a High-Assurance Crypto Library, formally verified in F* and brought to you by the Prosecco team. Finally, since HACL* has been running inside Mozilla Firefox (the only browser you should use!) since November 2017, we’ll describe the integration process and the collaboration with Mozilla.

## March 20th, 2018

**Gregoire Ferré** (MATHERIALS)

Markov chains are used in many physical and statistical applications. They consist of a sequence of successive random configurations of a system, driven by a force and a random perturbation. A natural example is a physical system (like a set of atoms or a molecule) whose force tends to minimize its energy, while thermal effects result in a random agitation. Studying the long-time behavior of such systems has a lot of applications, and I will present some numerical examples as well as the physical motivation for the results currently available in the literature. An extension of Markov chains are the so-called branching processes, or population dynamics. In this framework, several replicas of the same random system are simulated in parallel, and a selection procedure is prescribed: as the replicas evolve, they have some probability of stopping, or of splitting into two replicas. A quantum physics interpretation is to represent electrons by Brownian motions, and to select those that remain in the regions of lower energy. Understanding the long-time stability of such processes is also crucial, but more intricate, and few results are available. The goal of the talk is to show with numerical applications the possible behaviors of such systems, and to motivate ongoing work on their stability in terms of physically motivated conditions. This project is carried out in collaboration with G. Stoltz (Matherials) and Mathias Rousset (SIMSMART).

**Noé de Rancourt** (ENS Ulm/Paris Diderot)

Mathematics is a story of structures. Almost every branch of mathematics deals with its own kind of structures (for example groups, vector spaces, graphs, etc.), and a fundamental question is: how do we classify these structures? Classifying structures roughly means associating with each structure a natural characteristic (often a number, or a sequence of numbers) such that two structures have the same characteristic if and only if they are, in some way, "the same". Having a classification of a kind of structure is a powerful tool to prove general theorems about these structures. I will begin this talk by giving some classical examples of classifications, and then we will turn to the following inconvenient question: what happens when it seems impossible to classify some kind of structures? The usual answer of the mathematician to such a question is: "Does it seem that we cannot do something? Then, let's *prove* that we cannot do it!" Therefore, I will present a general framework enabling us to prove results of non-classifiability, the theory of the complexity of Borel equivalence relations. This theory can be seen as a... classification of classifications!

## February 20th, 2018

**Julien Roussel** (MATHERIALS)

(no slides ☹)

In statistical physics, the macroscopic properties of a material are given by the average of thermodynamic functions under the probability distribution characterizing the system. These averages can be written as integrals in a high-dimensional space, which cannot be computed using numerical quadratures. Computational molecular dynamics is a tool for estimating these integrals using the ergodic properties of dynamics such as the Langevin equation. This stochastic process provides an estimator which satisfies a Central Limit Theorem under appropriate conditions. The concept of generator allows us to use the theory of partial differential equations to characterize the properties of this process, such as the asymptotic variance of the estimator. This variance can be reduced by making use of control variates, which speeds up the simulations.
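Control variates in their simplest form: to estimate E[exp(U)] for U uniform on [0,1], subtract c·(U − 1/2), whose mean is known to be exactly zero; with c fitted from the samples, the estimator keeps the same mean and its variance drops sharply. A self-contained sketch in the plain Monte Carlo setting, not the Langevin setting of the talk:

```python
import math
import random

def estimate_exp_uniform(n=100_000, seed=0):
    """Estimate E[exp(U)], U ~ Uniform(0,1) (true value e - 1 ~ 1.718),
    with and without the control variate g(U) = U - 1/2 of known mean 0."""
    rng = random.Random(seed)
    us = [rng.random() for _ in range(n)]
    f = [math.exp(u) for u in us]
    g = [u - 0.5 for u in us]
    mf = sum(f) / n
    mg = sum(g) / n
    # near-optimal coefficient c = Cov(f, g) / Var(g), fitted on the samples
    cov = sum((fi - mf) * (gi - mg) for fi, gi in zip(f, g)) / n
    var_g = sum((gi - mg) ** 2 for gi in g) / n
    c = cov / var_g
    fc = [fi - c * gi for fi, gi in zip(f, g)]
    m_cv = sum(fc) / n
    var_plain = sum((x - mf) ** 2 for x in f) / n
    var_cv = sum((x - m_cv) ** 2 for x in fc) / n
    return mf, var_plain, m_cv, var_cv
```

Since exp(U) and U are almost perfectly correlated here, the variance of the corrected estimator is dozens of times smaller for the same number of samples, which is the "speeding up" the abstract refers to.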

**Armaël Guéneau** (Gallium)

Writing computer programs is a common activity for the working programmer and computer scientist. It is also common for programs to contain bugs: not only correctness bugs (where they return the wrong result), but also performance bugs, where they take a lot more time than expected to compute their result. In this talk, I will explain how we can establish machine-checked proofs that a given program is not only correct, but also runs in the expected time -- just like one would prove on paper that e.g. a binary search algorithm runs in time O(log n), or that a sorting algorithm runs in O(n log n).
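The kind of statement one proves can be made concrete: binary search over a sorted array of n elements performs at most ⌊log₂ n⌋ + 1 iterations. Here is an informal, empirical check that counts iterations (a sketch only; the machine-checked version uses a proof assistant with time credits, not assertions):

```python
def binary_search(xs, target):
    """Return (index of target in sorted xs or -1, iteration count)."""
    lo, hi, steps = 0, len(xs) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid, steps
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps
```

Asserting the bound on a few inputs gives confidence; proving it for *all* inputs, together with functional correctness, is exactly what the machine-checked approach delivers.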

## December 12th, 2017

**Hussam Al Daas** (ALPINES)

(no slides ☹)

Academic and real-life simulations require solving a large number of linear systems of equations of high order. Different types of linear solvers exist: direct, iterative and hybrid. In this presentation I will talk about a class of widely-known iterative solvers, Krylov subspace methods. I will briefly present their basic strategies, the convergence analysis for a special case, and how to improve the convergence rate. Numerical experiments on real-life simulations of reservoirs will be presented.
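The flagship Krylov method for symmetric positive definite systems is the conjugate gradient algorithm, which builds its iterates inside the Krylov subspace span{b, Ab, A²b, …}. A dependency-free sketch (dense, for illustration; real solvers work matrix-free with preconditioning):

```python
def cg(A, b, tol=1e-10, max_iter=100):
    """Conjugate gradient for A x = b, A symmetric positive definite.
    A is a list of rows; returns the approximate solution x."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                    # residual b - A x with x = 0
    p = r[:]                    # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol * tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x
```

In exact arithmetic CG converges in at most n iterations, but its practical appeal is that far fewer iterations usually suffice, with a rate governed by the conditioning of A, which is what preconditioning improves.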

**Jan Kuřátko** (Academy of Sciences of the Czech Republic)

(no slides ☹)

Complex technical systems that are modelled by ordinary differential equations may feature states that are considered unsafe. A simple example of such a system is a thermostat in a room: when the thermostat is switched on the temperature rises, and when it is switched off the temperature decreases. An unsafe state of the system could be, for example, a temperature that is too high. Any solution of the ordinary differential equations that originates from an initial state and reaches an unsafe state represents a flaw which is to be avoided in the design of the system. In the talk, we address how to compute these solutions based on numerical optimization.

## October 17th, 2017

**Léo Perrin** (SECRET)

(no slides ☹)

Electronic data exchange is often encrypted so as to prevent an eavesdropper from understanding the information she intercepts. This protection is provided by cryptographic algorithms such as block ciphers which use a secret key to turn the message into an encrypted counterpart. A common building block for block ciphers is an S-Box, a small function specified via its look-up table which provides, for example, a high non-linearity. Not all algorithm designers specify the design process which leads to the S-Box they use. In this case, it is necessary to reverse-engineer this design process using only the look-up table itself. In this talk, I will introduce methods allowing this and show two main applications. The first is the discovery of a hidden structure in the latest standard block cipher from Russia. The second is a new class of mathematical functions obtained by applying these methods to the result of a theorem.
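One of the quantities examined when analysing an S-box is its nonlinearity: the distance of its component functions to the set of affine functions, read off from the Walsh spectrum. A brute-force sketch for small S-boxes (illustrative; it uses the 3-bit χ mapping from Keccak as a nonlinear example, not the Russian S-box of the talk):

```python
def parity(v):
    """Parity of the bits of v (inner product over GF(2))."""
    return bin(v).count("1") & 1

def nonlinearity(sbox):
    """Nonlinearity of an n-bit S-box from its Walsh spectrum:
    NL = 2^(n-1) - max_{a, b != 0} |W(a, b)| / 2."""
    size = len(sbox)                     # 2^n entries
    max_w = 0
    for b in range(1, size):             # nonzero output mask
        for a in range(size):            # input mask
            w = sum(1 - 2 * (parity(b & sbox[x]) ^ parity(a & x))
                    for x in range(size))
            max_w = max(max_w, abs(w))
    return (size - max_w) // 2

# A linear S-box has nonlinearity 0; Keccak's 3-bit chi map does not.
CHI3 = [0, 3, 6, 1, 5, 4, 2, 7]
```

A high nonlinearity is the kind of design criterion a hidden-structure analysis checks the look-up table against; when the measured properties are too good (or too structured) to be accidental, reverse-engineering becomes possible.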

**
Florian Omnès
** (UPMC, LJLL)

(no slides ☹)

In this presentation, I will review past and present trends in numerical shape optimization, with an emphasis on fluid problems. In the first part of the presentation, I will give a brief summary of the many existing methods in shape/topology optimization for fluids, restricting myself to three classes of methods, each corresponding to specific problems with a given set of physical hypotheses. In the second section, I will present some of the early works in aerodynamic profile optimization. These works usually combine a potential flow model with parametric optimization, using only a small number of parameters, typically fewer than 10. I will then discuss the work of Borrvall and Petersson (2003) and the extension by Evgrafov (2005) on porous media optimization for Stokes flows. This approach makes it possible to establish existence results for the shape optimization problem, as well as numerical methods to approximate a solution. Lastly, I will present shape optimization in the 'shape derivative' framework. This framework is useful both for devising numerical methods and for establishing necessary conditions on the optimal design for a variety of equations. I will give a few numerical results obtained using this method.

## June 13th, 2017

**
Thomas Boiveau
** (SERENA, ENPC (CERMICS))

(no slides ☹)

In computational mechanics, the imposition of boundary conditions is a major challenge; it is therefore crucial to use the right numerical tools to model a given boundary value problem in the best possible way. In this talk, we will introduce Nitsche's method, a very useful tool for imposing boundary conditions in the finite element framework. Nitsche's method imposes Dirichlet boundary conditions weakly; this weak imposition allows us to extend the method to more complex problems such as cut finite elements, domain decomposition or fluid-structure interaction.
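As a pointer for the audience, the (standard) symmetric Nitsche formulation of the Poisson problem −Δu = f with Dirichlet data u = g on the boundary Γ reads, with γ > 0 a penalty parameter and h the mesh size:

```latex
\int_\Omega \nabla u_h \cdot \nabla v_h
 \;-\; \int_\Gamma (\partial_n u_h)\, v_h
 \;-\; \int_\Gamma (\partial_n v_h)\, u_h
 \;+\; \frac{\gamma}{h} \int_\Gamma u_h\, v_h
 \;=\;
 \int_\Omega f\, v_h
 \;-\; \int_\Gamma (\partial_n v_h)\, g
 \;+\; \frac{\gamma}{h} \int_\Gamma g\, v_h
 \qquad \forall v_h \in V_h .
```

The data g enters only through boundary integrals rather than through the function space, which is precisely what makes the extensions mentioned above (cut elements, domain decomposition, fluid-structure interaction) tractable.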

**
Salah Eddine Saidi
** (AOSTE)

(no slides ☹)

The recent advancement in merging different technologies and engineering domains has led to the emergence of the field of Cyber-Physical Systems (CPS). Because of the heterogeneity of the involved components (multi-physics, sensors, actuators, embedded computers), CPS may feature very complex architectures and designs, ever raising the need for time-, cost- and effort-effective approaches for building robust and reliable systems. Co-simulation has proven successful in responding to this need, and is therefore increasingly considered an indisputable step in design verification and validation. For complex CPS, experts from different engineering disciplines may be involved in the design process by developing models for different parts of the system. In a later stage, the developed models need to be integrated into one simulation environment and coupled together in order to perform simulation at the system level. The Functional Mock-up Interface (FMI) was developed with the goal of facilitating co-simulation and the exchange of numerical models. One major challenge of co-simulation is computing performance. In fact, integrating heterogeneous models into one environment usually results in a complex and heavy co-simulation, which increases the demand for processing power. In this talk, I will give an overview of co-simulation of CPS, highlight some challenges for the parallelization of co-simulations, and present some results about the parallelization of closed-source co-simulations on multi-core processors.

## May 16th, 2017

**
Lélio Brun
** (PARKAS)

The compilation of block diagram languages like Lustre or Scade often uses an intermediary language between the source data-flow code and the target imperative code. We recently implemented and verified a translation function in Coq from a simple intermediary language to the CompCert-accepted Clight language. The intermediary language uses an idealized hierarchical memory representation, whereas Clight uses an intricate memory model rather close to that of the machine. Our proof of correctness must consequently tackle alignment, aliasing and padding. We use the separation logic predicates developed in CompCert to solve these issues.

**
Georgios Bouloukakis
** (MiMove)

(no slides ☹)

Systems deployed in mobile environments are typically characterized by intermittent connectivity and asynchronous sending/reception of data. To create effective mobile systems for such environments, it is essential to guarantee acceptable levels of timeliness between sending and receiving mobile users. To provide QoS guarantees in different application scenarios and contexts, it is necessary to model the system performance by incorporating the intermittent connectivity. Queueing Network Models (QNMs) offer a simple modeling environment, which can be used to represent various application scenarios, and provide accurate analytical solutions for performance metrics, such as system response time. In my talk, I will present an analytical solution for the end-to-end response time between users sending and receiving data, obtained by modeling their intermittent connectivity with QNMs. We utilize publish/subscribe middleware as the underlying communication infrastructure for mobile users. To represent the users' connections/disconnections, we model and solve analytically an ON/OFF queueing system by applying a mean value approach. Finally, we validate our model using simulations with real-world workload traces.

## April 18th, 2017

**
Jan Papez
** (ALPINES)

Many applications lead to solving (possibly very large) algebraic systems, and this is often the most time-consuming part of the overall computation. In the talk we focus on issues related to the solution of algebraic problems stemming from the numerical treatment of partial differential equations. In particular, we discuss the spatial distribution of the algebraic error and its comparison with errors of other origins, global and local error estimators, and stopping criteria for iterative algebraic solvers. The issues will be, on purpose, demonstrated using very simple model problems; the talk aims at providing the basic ideas and presenting the challenges, which are of general importance and related not only to numerical PDEs. We will do our best to avoid technical details and formulas, and to present the matter in a form comprehensible to a broad audience.

**
Cristian Maxim
** (AOSTE)

(no slides ☹)

The increased number of systems consisting of multiple interacting components imposes the evolution of timing analyses towards methods able to estimate the behavior of an entire system by aggregating the timing bounds of its components. In the real-time domain these bounds are added to the worst-case execution times (WCET), and the way they are obtained has always been subject to confusion and debate. In our work, we use probabilistic methods to determine the pWCET of a component starting from the execution time traces produced by the system. The correctness of our results is critical for the scheduling stage. In my presentation I will explain the notions of real-time systems, timing analysis and pWCET estimation, and their use in daily life, concentrating on the avionics industry and its applications.

## March 14th, 2017

**
Jean-Karim Zinzindohoué
** (Prosecco)

(no slides ☹)

Implementing flawless, efficient and side-channel free cryptography is a notoriously hard problem. To tackle this issue we rely on F*, a functional ML-like proof-oriented programming language. This language lets us mix concrete code and logical specifications about code properties, which it verifies statically at compile time. Using this technology, we present HACL*, a high assurance cryptographic library implemented and verified in F*. With formal methods we prove that the code is safe, functionally correct and free of certain classes of side-channels. The code is compiled to C through a proven safe extraction mechanism, yielding verified code whose performance is on par with state-of-the-art C libraries.

**
Matteo Aletti
** (REO)

(no slides ☹)

The retina is a unique tissue in the human body since it is possible to observe its microcirculation non-invasively. In this tissue, the blood flow rate is regulated to satisfy specific metabolic needs or to accommodate abrupt changes in the systemic pressure. The mechanism is based on fluid-structure interaction: arteriole vessel walls contain smooth muscle cells that contract or dilate to adapt the vessel diameter, thereby changing its hydraulic resistance. A 3D model for this phenomenon will be presented, with an application to a network of retinal vessels obtained from medical images.

## February 14th, 2017

**
Lara Trussardi
** (REO)

(no slides ☹)

In this talk I will present two different models for describing some socio-economic scenarios. The first model describes the dynamics of agents in a large market, depending on the estimated asset value of a product and the rationality of the agents, using a kinetic inhomogeneous Boltzmann-type equation. The second model describes the influence of knowledge and wealth in a society where the agents interact with each other through binary interactions. For both models, numerical simulations have been implemented: for the first model, the simulations highlight the importance of the reliability of public information in the formation of bubbles and crashes. The use of Bollinger bands in the simulations shows how herding may lead to strong trends with low volatility of the asset prices, but eventually also to abrupt corrections. For the second model, the kinetic code shows the possibility of cluster formation, using certain specific thresholds, and the formation of some interesting steady states.

**
Guillaume Terradot
** (ANTIQUE)

(no slides ☹)

Cells derive resources from their environments and use them to fuel the biosynthetic processes that determine cell growth. Depending on how responsive the biosynthetic processes are to the availability of intracellular resources, cells can build up different levels of resource storage. Here we use a recent mathematical model of the coarse-grained mechanisms that drive cellular growth to investigate the effects of cellular resource storage on growth. We show that, on the one hand, there is a cost associated with high levels of storage resulting from the loss of stored resources due to dilution. We further show that, on the other hand, high levels of storage can benefit cells in variable environments by increasing biomass production during transitions from one medium to another. Our results thus suggest that cells may face trade-offs in their maintenance of resource storage based on the frequency of environmental change.

## January 17th, 2017

**
Héctor Martínez Alonso
** (ALPAGE)

Multitask learning (MTL) is a general technique for training classifiers, where different datasets or 'tasks' are used together for training using a common loss function, in a way similar to many approaches to joint learning. MTL in neural networks has recently shown promising results for different tasks and architectures in natural language processing. Ideally, an MTL setup allows a network to learn from heterogeneous data sources and improve the quality of predictions by benefitting from the relations between the respective label inventories of the different tasks. In this work, we benchmark the response of a recent, successful architecture for linguistic sequence prediction (part of speech, named entities, etc.) with regard to the choice of auxiliary task. Chiefly, we provide an empirical study of what data sources lend themselves to MTL, and provide an information-theoretical account of their properties.

**
Rafael Angarita
** (MiMove)

(no slides ☹)

Computer-mediated communication can be defined as any form of human communication achieved through computer technology. From its beginnings, it has been shaping the way humans interact with each other, and it has influenced many areas of society. There exists a plethora of communication services enabling computer-mediated social communication (e.g., Skype, Facebook Messenger, Telegram, WhatsApp, Twitter, Slack, etc.). Based on personal preferences, users may prefer one communication service over another. As a result, users sharing the same interests may not be able to interact, since they are using incompatible technologies. To tackle this interoperability barrier, we propose the Social Communication Bus, a middleware solution targeted at enabling interaction between heterogeneous communication services. More precisely, the contribution of this paper is threefold: (i) we propose a survey of the various forms of computer-mediated social communication, and we make an analogy with the computing communication paradigms; (ii) we revisit the service bus paradigm in the context of computer-mediated social interactions. We build on top of the eXtensible Service Bus (XSB), which supports interoperability across computing interaction paradigms, to provide a solution for computer-mediated social communication interoperability; and (iii) we present Social-MQ, an implementation of the Social Communication Bus that has been integrated into the AppCivist platform for participatory democracy.

## December 13th, 2016

**
Christèle Etchegaray
** (Orsay)

(no slides ☹)

Cell migration is a fundamental process involved in physiological phenomena such as embryogenesis, immune response, or tumor metastasis. Cells are active systems, and their internal activity involves a wide range of elements interacting on different time and space scales; understanding the resulting behaviour is therefore a challenge. Our aim is to construct a minimal trajectory model describing the multi-scale interactions involved in the process. In this talk, I will present a deterministic model of cell migration on a plane surface, where the cell domain is a non-deformable disc. I will explain that structures involved in migration were shown to be also responsible for the reinforcement of cell polarisation at the molecular scale, leading to a self-reinforcement of the dynamics. Then, numerical simulations will show that several behaviours can be reached by the model. This work is in collaboration with Nicolas Meunier (MAP5, Univ. Paris Descartes) and Raphaël Voituriez (LJP, LPTMC, CNRS, UPMC).

**
Olivier Tissot
** (ALPINES)

Sparse linear systems of the form Ax = b, where A is an n x n matrix, arise in many numerical problems. When n becomes very large, it becomes necessary to use a supercomputer; however, solving such systems efficiently on these machines is very challenging. Many methods exist for solving linear systems; the focus will be on Krylov methods, and more precisely on the Conjugate Gradient, the method of choice when the matrix is symmetric positive definite. We will see how to redesign it in order to improve performance on parallel machines.
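For readers unfamiliar with the method, here is a minimal textbook sketch of the classical (sequential) Conjugate Gradient in Python/NumPy; the example matrix is illustrative. The parallel redesigns discussed in the talk reorganize exactly these operations, since the matrix-vector product and the two inner products per iteration are the communication bottlenecks:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve Ax = b for a symmetric positive definite A (textbook CG)."""
    n = len(b)
    if max_iter is None:
        max_iter = n  # exact arithmetic converges in at most n steps
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return x

# Small SPD example: tridiagonal discrete Laplacian
A = np.diag([2.0] * 5) + np.diag([-1.0] * 4, 1) + np.diag([-1.0] * 4, -1)
b = np.ones(5)
x = conjugate_gradient(A, b)
```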

## November 15th, 2016

**
Markus Köppel
** (SERENA)

The talk is concerned with the dynamics of two-phase flow in a two-dimensional heterogeneous porous medium with randomly located interfaces. In the deterministic case discontinuities of the flux function occur due to the change of the material parameters. Based on the capillarity-free fractional flow formulation a hybrid Stochastic Galerkin method (HSG) for heterogeneous two-phase flow is presented. The classical polynomial chaos expansion is extended by a multi-element discretization. This yields a (weakly) coupled, deterministic and hyperbolic-elliptic system which allows for efficient parallelization.

**
Pierre Monmarché
** (Matherials (CERMICS))

For high-dimensional problems (protein folding, big data statistics, optimal path finding...), the use of deterministic algorithms may be prohibited by their numerical cost. In some cases, it is more efficient to try to solve the problem 'at random', more precisely to let the computer explore the configuration space according to a well-chosen random dynamic. In the basic versions of these so-called Monte-Carlo algorithms, the random process is memoryless: it never learns anything from what it has already seen, which is far from efficient. On the other hand, keeping a high-dimensional memory has a (possibly prohibitive) numerical cost. What can we do about this?
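To make the 'memoryless' point concrete, here is a minimal random-walk Metropolis sampler, the basic Monte-Carlo algorithm alluded to above; the target density and parameters are illustrative. Each proposal depends only on the current state, so the chain learns nothing from its history:

```python
import numpy as np

def random_walk_metropolis(log_density, x0, n_steps, step_size, rng):
    """Memoryless random-walk Metropolis sampler (basic sketch)."""
    x = x0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        proposal = x + step_size * rng.standard_normal()
        # Accept with probability min(1, pi(proposal) / pi(x))
        if np.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples[i] = x  # on rejection, the current state is repeated
    return samples

rng = np.random.default_rng(0)
# Target: standard normal density, known only up to a constant
samples = random_walk_metropolis(lambda x: -0.5 * x * x, 0.0, 50_000, 1.0, rng)
```

Note that `log_density` need only be known up to an additive constant, which is what makes such samplers usable when the normalizing constant is intractable.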

## October 18th, 2016

**
Cécile Taing
** (Laboratoire Jacques-Louis Lions, UPMC)

(no slides ☹)

The interactions between individuals with a phenotypic variability usually lead to competition and selection of the fittest individuals. The goals of this work are to describe the asymptotic behaviour of the solution to a renewal-type equation with a phenotypical structure and then to derive properties that illustrate the selection phenomena. As a first step, we begin with a simpler model by only considering the competition and disregarding the effects of mutations. The analysis is led by the study of an eigenvalue problem, with eigenelements depending on the structuring variables of the model. Then we tackle the problem with mutations, which leads to the study of a constrained Hamilton-Jacobi equation.

**
Alexandre This
** (REO, Philips Research)

(no slides ☹)

The mitral valve can be subject to several pathologies resulting in blood regurgitation from the ventricle to the atrium. The proximal isovelocity surface area (PISA) method is widely used to assess the severity of the regurgitation. However, it relies on several assumptions on both fluid motion and morphology. We designed numerical simulations to assess the influence of breaking those assumptions on the PISA model. A simplified heart geometry was created and three different defects were introduced in the mitral valve. Four heart cycles were simulated using a fluid/structure interaction method. The simulated results were then compared with the estimation provided by the PISA method. In the simulations presented, the mitral defect shape strongly affects the shape of the isovelocity surface, and this results in important estimation errors of clinically relevant quantities. Moreover, it was found that the aortic outflow influenced the regurgitation. These observations would justify the use of patient-specific models for clinical evaluations. Further work includes the use of a more realistic patient geometry and of patient-specific boundary conditions.

## June 21st, 2016

**
Camilla Fiorini
** (LM UVSQ)

This seminar will be about sensitivity analysis for nonlinear hyperbolic equations. First we will introduce the general framework and talk about sensitivity analysis in wider terms; then, we will focus on the case of nonlinear hyperbolic equations, whose discontinuous solutions cannot be treated with the standard techniques.

**
Garvita Bajaj
** (MiMove)

(no slides ☹)

The last decade has seen a drastic increase in the number of smartphones used by people. These smartphones have sensing and computation capabilities that can be used to sense the physical world (activity, location, temperature, etc.) and the network world (other devices in network range). These sensing capabilities offer opportunities to develop new applications in several domains such as transportation, health monitoring, and environmental monitoring. Relying on the sensing capabilities of a large number of (un)known devices to measure phenomena of common interest (e.g., pollution levels, traffic levels, etc.) is referred to as mobile crowdsensing (MCS). The developers of these mobile crowdsensing applications range from novice developers to academics. To facilitate the development of these applications, several frameworks, each aimed at tackling specific challenges, have been proposed in the literature. However, to the best of our knowledge, the problem of intelligent sensing task allocation to the participants of MCS applications, based on their resource levels, has not been tackled so far. In this talk, I will introduce the concept of mobile crowdsensing and the need for frameworks for the development of MCS applications. I will begin with the motivation behind intelligent task allocation, followed by our approach with preliminary results.

## May 17th, 2016

**
Katerina Tzompanaki
** (Telecom Paristech)

(no slides ☹)

With the increasing amount of available data and data transformations, typically specified by queries over the data, the need to understand them also increases. "Why are these medicine books in my sales report?" or "Why are there not any database books?" To answer such Why and Why-Not questions that typically arise both in the scientific and commercial world, data provenance plays a crucial role. While Why questions have been widely addressed in classical databases, this is surprisingly not the case for Why-Not questions, which only recently attracted the interest of the research community.

**
Yi Yin
** (MAMBA)

(no slides ☹)

Tumor cellularity is an important tissue microstructural feature, which is useful for cancer diagnosis and cell-number-related treatment such as targeted cancer therapies. Diffusion-weighted MRI (DWI), which indicates the rates of water diffusion in tissue, is used in medical imaging as it is thought to provide an indirect measure of tumor cellularity. Histopathological examination of tissues reveals the tissue microstructure, but is usually available only as a small sample or after resection. Quantitatively estimating the total tumor cell load remains challenging. In this study, we propose a pipeline for tumor cell load estimation based on image processing techniques. The main pipeline steps are: 1) adaptive cell nuclei segmentation; 2) model-based 3D cell density estimation; 3) identification of the correlation between the DW-MRI diffusion coefficient (D value) and the underlying tumor cellularity; 4) cell number calculation based on tumor D values and the identified correlation between diffusion and cellularity. The proposed cell segmentation algorithm performs detection with very high accuracy (0.95) on selected tumor tissue samples. For a specific patient with NSCLC, the estimated total cell number of the tumor (6.5 cm in diameter) is 3.64×10^10. The proposed method translates tumor DWI signals into cell density and quantifies the whole-tumor cell load. The developed method can be applied to other cancer types.

## April 19th, 2016

**
Mario Pereira
** (Toccata (Inria Saclay))

Why3 is a platform for deductive program verification. It provides a rich language for specification and programming, called WhyML, and relies on external theorem provers, both automated and interactive, to discharge verification conditions. Why3 comes with a standard library of logical theories (integer and real arithmetic, Boolean operations, sets and maps, etc.) and basic programming data structures (arrays, queues, hash tables, etc.). A user can write WhyML programs directly and get correct-by-construction OCaml programs through an automated extraction mechanism. WhyML is also used as an intermediate language for the verification of C, Java, or Ada programs.

**
Athmane Bakhta
** (CERMICS)

Many electronic devices, such as computers, cellphones, photovoltaic cells, medical devices, etc., are based on crystals. It is very important to understand the microscopic behavior of such materials. Indeed, the electronic structure of a crystalline material allows one to determine, for example, whether it will be insulating or conducting, what its color is, or whether it will have some magnetic behavior. But how can we model such materials, and how is the electronic structure calculated? The answer lies in two keywords: Schrödinger equation and Bloch-Floquet transform.
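To give the flavour of the second keyword (standard material): Bloch's theorem states that eigenfunctions of a Schrödinger operator with a lattice-periodic potential are plane waves modulated by lattice-periodic functions. For each quasi-momentum q in the Brillouin zone one solves a cell eigenproblem, and the resulting bands E_n(q), together with the gaps between them, determine whether the material is insulating or conducting:

```latex
\left( -\tfrac{1}{2}\Delta + V_{\mathrm{per}} \right) \psi_{n,q} = E_n(q)\, \psi_{n,q},
\qquad
\psi_{n,q}(x) = e^{i q \cdot x}\, u_{n,q}(x),
\qquad
u_{n,q}(x + R) = u_{n,q}(x) \quad \forall R \in \mathcal{R},
```

where \(\mathcal{R}\) denotes the periodic lattice of the crystal.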

## March 15th, 2016

**
Damien Garreau
** (SIERRA)

(no slides ☹)

Damien will present the kernel change-point detection algorithm introduced by S. Arlot, A. Celisse and Z. Harchaoui (http://arxiv.org/abs/1202.3878). The change-points in a piecewise-constant signal with values in an arbitrary set are selected via a penalized model selection procedure. Damien will show how the correct number and positions of the change-points are recovered under suitable hypotheses on the data and the kernel.
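To illustrate the penalized model selection idea in its simplest, non-kernel form, here is a small optimal-partitioning sketch in Python: each segment is fitted by its mean, and each change-point pays a fixed penalty. The kernel algorithm of the talk generalizes the segment cost to arbitrary value spaces via a kernel; all names and the penalty choice below are illustrative:

```python
import numpy as np

def detect_changepoints(signal, penalty):
    """Penalized least-squares change-point detection by optimal partitioning.

    Each segment is fit by its mean; each change-point pays `penalty`,
    as in penalized model selection (simplified, non-kernel sketch).
    """
    y = np.asarray(signal, dtype=float)
    n = len(y)
    # Prefix sums give O(1) cost of fitting segment [s, t) by its mean
    c1 = np.concatenate(([0.0], np.cumsum(y)))
    c2 = np.concatenate(([0.0], np.cumsum(y * y)))

    def seg_cost(s, t):
        return c2[t] - c2[s] - (c1[t] - c1[s]) ** 2 / (t - s)

    # Dynamic program: best[t] = min cost of segmenting y[:t]
    best = np.full(n + 1, np.inf)
    best[0] = -penalty  # so that a single segment pays no penalty
    prev = np.zeros(n + 1, dtype=int)
    for t in range(1, n + 1):
        for s in range(t):
            cost = best[s] + seg_cost(s, t) + penalty
            if cost < best[t]:
                best[t] = cost
                prev[t] = s
    # Backtrack to recover the change-point positions
    cps, t = [], n
    while t > 0:
        t = prev[t]
        if t > 0:
            cps.append(int(t))
    return sorted(cps)

cps = detect_changepoints([0.0] * 10 + [5.0] * 10, penalty=1.0)
```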

**
Renaud Dessalles
** (RAP)

The functional molecules of the cell, the proteins, are produced from the genetic information through an intermediate molecule, the mRNA. This chain of production is highly stochastic (as a result of random encounters of macromolecules, thermal excitation, etc.) and produces a protein distribution with noticeable variability. In bacteria, for some genes, feedback mechanisms exist in the protein production. One of the simplest consists of a protein that can bind to the promoter of its own gene, preventing the gene's transcription into mRNA and therefore the production of the same protein. A stochastic model that represents this particular feedback, called autogenous feedback, is presented. After presenting the model in its biological context, we investigate what features this autogenous feedback can have compared to models without feedback, in terms of distribution and temporal behaviour.

## February 16th, 2016

**
Mikel Landajuela
** (REO)

Mechanical systems involving the interaction of a deformable structure with an internal or surrounding incompressible fluid flow appear in a wide variety of scientific and engineering fields: from the aeroelasticity of sailing boats and parachutes to the biomechanics of blood and air flows. In this talk, I will discuss several novel techniques for the numerical simulation of this type of problems. The basic principle is that, for computational efficiency, the numerical methods must introduce a significant degree of splitting between the fluid and solid time-steppings. Special attention will be paid to problems involving large structural deflections, with solids that might come into contact or that might break up. A series of numerical tests, including comparisons with experimental data, illustrates the performance of the different methods proposed.

**
Jacques-Henri Jourdan
** (GALLIUM)

Maybe you do not know it, but most of you use some sort of compiler (or maybe an interpreter?) in your everyday research activities. But, hold on! What's a compiler? What is it made of? How does it work? What does it mean for a compiler to be correct? What are the open questions in this area? In this talk, we will do an overview of all the different components of a compiler, from parsing to assembly code generation, without forgetting to speak about type-checking and code optimization. We will get insight into what it means to prove a compiler correct, such as CompCert, developed in my team, Gallium.

## November 17th, 2015

**
Aiko Dinale
** (RITS)

(no slides ☹)

FURBOT (Freight Urban RoBOTic) is a European project developing an improved system for urban freight delivery. Its centerpiece is a new fully-electrical semi-autonomous vehicle for freight transportation. Although travel between the beginning and end points relies on the human driver, the procedures related to loading and unloading are almost entirely automated. These include the precise docking of the vehicle at the freight boxes prepared for pick-up, as well as at the unloading location. During the presentation, particular attention will be paid to the vehicle control system and the procedures related to the autonomous loading and unloading.

**
Margriet Palm
** (MAMBA)

(no slides ☹)

When a primary tumor spreads to other organs, the chances of survival decrease radically because current treatments are rarely able to cure metastatic cancer. If it were possible to block the spreading of the primary tumor, metastatic cancer would no longer be a problem. Unfortunately, our current understanding of tumor metastasis is insufficient to block this process. To improve our understanding of tumor metastasis, we built an agent-based model of tumor invasion, which is one of the first steps of metastasis formation. In this model, one or more migrating cells interact mechanically and chemically with their environment, thereby deforming it such that the cells can migrate.

## October 20th, 2015

**
Chloé Audebert
** (REO)

(no slides ☹)

The liver is an organ with the ability to regenerate. In order to treat some pathologies, a partial ablation of the liver (partial hepatectomy) is required. Even though the liver regenerates after the surgery, the new liver sometimes performs less well, which endangers the patient's health. The liver's blood perfusion is complex and, after a partial hepatectomy, the remaining portion of the liver experiences important changes in pressure and flow. These changes seem to be the cause of non-functional liver regeneration. Mathematical modeling could help to better understand and predict liver hemodynamics after partial hepatectomy. The model needs to be simple enough to be parametrized but complex enough to represent the important effects of the hepatectomy.

**
Aurora Armiento
** (BANG)

(no slides ☹)

Protein aggregation is one of the main causes of a class of neurodegenerative diseases called amyloid diseases (among which are Alzheimer's, Huntington's, Parkinson's, etc.). Starting from the Lifshitz-Slyozov equation, we define a model for protein polymerisation. We couple the model to the observation of some moments of the cluster concentration function. The goal is to estimate kinetic coefficients and/or the cluster concentration. The estimation is performed by data assimilation methods.

## June 16th, 2015

**
Pauline Traynard
** (LIFEWARE)

Chronobiology, the field that examines biological rhythms, has many promising applications for human health. For example, understanding the links between the cell cycle and the circadian clock in mammalian cells could ultimately help designing cancer chronotherapies. In her talk, Pauline will present systems biology approaches to this problem. In particular she will show how Temporal Logic can be used to calibrate and study the oscillatory dynamics of a bidirectional coupling model of the cell cycle and the circadian clock.

**
Clément Poncelet
** (MUTANT)

The slides — also available in an alternate format

Interactive Music Systems (IMS) are involved in live performances with human musicians and react in realtime to audio signals and unpredictable incoming events. Some of them, such as the software Antescofo, are Score-Based IMS: they emit outputs according to a pre-specified timed scenario called a mixed music score. Such systems are used during public concerts, and the complexity of the environment (i.e. the human musicians) prevents composers from testing these systems manually. How can we formally test a score-based Interactive Music System? Clément will present a solution to this problem.

## May 19th, 2015

**
Guillaume Seguin
** (WILLOW)

Digital visual contents are ubiquitous. What can we do with such amounts of visual data? In this talk, we've looked at examples of computer vision applications and approaches on how to model and tackle some of the underlying tasks.

**
Saliha Lallali
** (SMIS)

The slides — also available in an alternate format

Saliha works on a new embedded search engine designed for smart objects. Such devices are generally equipped with extremely low RAM and large Flash storage capacity. To tackle these conflicting hardware constraints, conventional search engines privilege either insertion or query scalability but cannot meet both requirements at the same time. Moreover, very few solutions support document deletions and updates in this context. She introduced three design principles, namely WriteOnce Partitioning, Linear Pipelining and Background Linear Merging, and showed how they can be combined to produce an embedded search engine reconciling a high insert/delete/update rate with query scalability.

## April 21st, 2015

**
Guillaume Claret
** (PI.R2)

We presented the Coq system through a demo session. Coq is a piece of software used to check mathematical proofs or to write safe programs. More and more people see formal verification as a required step to make sure that (complex) proofs are correct. The powerful Coq axioms (different from set theory) can also open alternative and elegant ways to formalize problems.

**
Rémi Lajugie
** (SIERRA)

Learning tasks involving structured outputs such as graphs or sequences are ubiquitous, and dealing with them requires specific algorithms. In this talk, we focused on cases where the learning task involves a temporal structure (such as time warping or change-point detection), and presented some applications to music information retrieval and computer vision.

## March 17th, 2015

**
Joachim Cohen
** (QUANTIC)

The slides — also available in an alternate format

A quantum computer could solve problems inaccessible to classical ones. However, such a machine would suffer from the fragility of the quantum systems due to their interaction with the environment. A solution to this major issue is provided by quantum error correcting codes, but their efficient implementation remains an important experimental challenge. Joachim told us how a carefully engineered interaction of the system with its environment circumvents the need for feedback control, thus realizing autonomous quantum error correction.

**
Luca Nenna
** (MOKAPLAN)

Luca gave a general overview of the optimal transport problem and its extension to the multi-marginal framework (numerical simulations helped us to understand this nice theory!). Then he introduced an application of optimal transportation to quantum mechanics in order to understand the electronic configuration of atoms and molecules.

## February 17th, 2015

**
Pierre-Marie Pédrot
** (Pi.r2)

Pierre-Marie Pédrot spoke about the Curry-Howard correspondence: this concept, used by computer science logicians, establishes a deep link between the notion of program and the notion of proof. He explained why, when you are proving a theorem, you are likely to also be writing a program, and, conversely, when you are writing a program, you are not far from proving a theorem. This idea is at the very core of the Coq proof assistant, which lets you both write programs and prove theorems at the same time.

**
Arnaud Lionnet
** (MATHRISK)

Arnaud presented the basic problem of pricing and hedging derivative contracts in finance. Derivatives are contracts that pay an amount of money depending on what another financial asset does; this problem is really at the heart of what quantitative analysts do in banks. For this, he introduced backward stochastic differential equations, which provide a natural language in which to cast the pricing and hedging problem.

## January 20th, 2015

**
Jean Thorey
** (CLIME)

(no slides ☹)

Jean explained how to forecast global solar radiation from numerical weather predictions.

**
Zuqi Tang
** (POMDAPI)

Zuqi told us about different kinds of errors in mesh-based numerical methods, and explained how to tweak algorithms in order to balance them.

## December 16th, 2014

**
Géraldine Cellière
** (MAMBA)

The slides — also available in an alternate format

The body is a tightly regulated machinery and a lot of compounds are maintained within narrow concentration ranges for a good functioning. One of the main regulating organs is the liver. But with medicines and alcohol intake, you make life tough for your liver! What happens when the liver is damaged for example in the case of a paracetamol overdose or cirrhosis? Because measuring processes in vivo is challenging, we use multi-scale mathematical modeling strategies to investigate the liver’s adaptation to acute injury focusing on ammonia detoxification. Based on existing data and literature knowledge, these models examine the plausibility of different hypotheses to guide towards new informative experiments.

**
Lucile Megret
** (MYCENAE)

Many natural phenomena can be described by slow-fast systems, where a fast transition between a state of equilibrium and relaxation oscillations is modeled by a canard explosion. Lucile talked about a category of Liénard slow-fast systems and presented a first approach to investigating bifurcations in which a very small change of a parameter causes a dramatic change in the phase portrait. After presenting some theoretical and numerical tools, she illustrated such a very rapid loss of stability on a specific example using classical numerical continuation.

## November 18th, 2014

**
Victorien Menier
** (GAMMA3)

Victorien told us about mesh adaptation, fluid dynamics and aeronautics. He explained what adapted meshes are, and why they are very useful for accurately and efficiently simulating non-viscous flows using the Euler equations. Then, he explained the difficulties he faces in extending these techniques to the Navier-Stokes equations, which take viscosity into account.

**
Nicholas Tarabelloni
** (REO)

Nicholas talked about how to make a computer able to analyze an electrocardiogram. His aims are both to computationally distinguish pathological from physiological states and to assess the validity of synthetic signals.

## October 21st, 2014

**
Emilie Kaufmann
** (DYOGENE)

Emilie spoke about sequential resource allocation. This mathematical concept is widely used, from recommender systems and online advertisement to clinical trials. In this framework, an agent faces an unknown environment consisting of several options with random outcomes. Emilie presented strategies such an agent can use to decide which option to try, in an adaptive way (i.e. based on the results of previous trials). The strategy should achieve a trade-off between exploration (trying options with a lot of uncertainty) and exploitation (focusing on the options that seem best). In her talk Emilie presented a statistical model called the 'stochastic multi-armed bandit' (in reference to the optimal choice of a slot machine, also called a "one-armed bandit", in a casino). She introduced a notion of optimality and presented strategies based on several statistical tools.
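The exploration-exploitation trade-off Emilie described can be made concrete with a small simulation. Below is a minimal sketch (my own illustration, not the strategies from the talk) of the classical UCB1 index policy on a made-up three-armed Bernoulli bandit; the arm probabilities, horizon and seed are arbitrary choices.

```python
import math
import random

def ucb1(probs, horizon, seed=0):
    """Play a Bernoulli bandit with the UCB1 index policy."""
    rng = random.Random(seed)
    counts = [0] * len(probs)   # pulls per arm
    sums = [0.0] * len(probs)   # cumulative reward per arm
    for t in range(horizon):
        if t < len(probs):
            arm = t  # pull each arm once to initialize
        else:
            # empirical mean + exploration bonus, largest index wins
            arm = max(range(len(probs)),
                      key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t + 1) / counts[a]))
        reward = 1.0 if rng.random() < probs[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts

pulls = ucb1([0.2, 0.5, 0.8], horizon=2000)
```

The bonus term shrinks as an arm is sampled, so suboptimal arms are tried ever more rarely, and the best arm (index 2 here) ends up with the vast majority of the pulls.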

**
Guillaume Trehard
** (RITS)

Guillaume told us about "ego-localization" for autonomous vehicles. As some of you have seen at the "salon de l'auto" in Paris, autonomous driving is a fast-growing field, yet it remains a real challenge. Guillaume presented solutions to a particularly crucial problem: the precise localization of the vehicle, both in its environment and on a global map. Making the most of information fusion and data filtering methods, his solutions provide sub-meter precision while remaining reasonably inexpensive in terms of money, data and development time.

## June 17th, 2014

**
Elisa Schenone
** (REO)

Elisa simulates the electrical behavior of the heart with complex equations. Since classical numerical techniques such as the Finite Element Method (FEM) can take a long time to solve, she presented two reduced-order methods (ROM) used to decrease the dimension of the basis, i.e. of the space where the equations are solved: the Proper Orthogonal Decomposition (POD) and a new method based on Approximated Lax Pairs (ALP).

**
Chantal Keller
** (PROSECCO)

Chantal presented F*, a programming language by Inria and Microsoft with precise types designed for program verification. She explained the main ideas behind the design of such a language, and the current questions that arise, e.g. to deal with higher-order programs, side effects, non-termination...

## May 20th, 2014

**
Pierre Merdrignac
** (RITS)

Pierre presented his research in the RITS team, a multidisciplinary project carrying out R&D activities for Intelligent Transportation Systems (ITS). This domain aims at improving road transportation in terms of safety, efficiency and comfort, and also at minimizing nuisances. RITS develops multidisciplinary ideas from robotics, communications, and traffic modeling. In his talk, he introduced the applicability and performance of laser-based pedestrian detection and vehicle-to-pedestrian communication for the safety of vulnerable road users.

**
Cuong To
** (SMIS)

Cuong works on the critical issue of personal data, which usually ends up on central servers. Cuong showed us how decentralized data storage architectures can help individuals keep full control of their data. He presented secure distributed querying protocols and showed how to secure their execution in the presence of honest-but-curious attackers, without revealing any sensitive information to central servers.

## April 15th, 2014

**
Pietro Gori
** (ARAMIS)

Pietro talked about statistical methods for the analysis of neuronal connectivity in patients with Gilles de la Tourette syndrome (GTS). He analyses structural and diffusion images of the brain with a novel atlas construction procedure to capture the invariants of both GTS patients and control subjects, and identify atypical anatomical configurations of neural circuits in patients.

**
Wafaâ Haffaf
** (MAMBA)

Protein aggregation leading to the formation of amyloid fibrils is involved in several neurodegenerative diseases such as prion diseases. Wafaâ presented two models to clarify how these fibrils are able to incorporate additional units. The initial model, adapted from the Becker-Döring system, considers that all fibrils react similarly, but it is not able to reproduce the observed in vitro behaviour. She then presented a second model, which involves an additional compartment of fibrils unable to incorporate more prion units and correctly simulates the first experimental steps of prion aggregation.

## March 18th, 2014

**
Caterina Urban
** (ANTIQUE)

Caterina introduced us to abstract interpretation, a framework able to automatically prove properties of programs. In particular, she presented abstract domains for proving the termination of programs. These domains automatically synthesize piecewise-defined ranking functions and infer sufficient conditions for program termination. With her team, Caterina has implemented a prototype static analyzer for proving the termination of programs which is competitive with the state of the art and performs well on a wide variety of benchmarks.

**
Soledad Fernandez Garcia
** (MYCENAE)

Soledad presented her PhD work on Melnikov theory for planar hybrid systems. The study of piecewise-smooth dynamical systems currently enjoys great popularity thanks to its capability to model mechanical devices, electronic circuits, control systems and biological problems, among others. In hybrid systems, the zones of differentiability are separated by a straight line; when an orbit reaches the separation line, a reset map is applied before the orbit enters the other zone. Soledad showed us how to use Melnikov functions to analyze the existence of limit cycles. The results have been applied to the study of invariant cones in 3D piecewise linear systems.

## February 13th, 2014

**
Fajwel Fogel
** (SIERRA)

Fajwel works on convex relaxations of non-convex problems. Convex optimisation deals with finding the minimum of convex functions, for instance to fit the parameters of a model. Although many real-world problems are not convex, it is sometimes possible to write a convex relaxation of the original problem and thus compute an approximate solution to it. Fajwel gave examples of this technique in X-ray molecular imaging and DNA sequencing.

**
Roxana Dumitrescu
** (MATHRISK)

(no slides ☹)

Roxana presented reflected backward stochastic differential equations (RBSDEs), driven by Brownian motion and the Poisson random measure. RBSDEs are a very useful mathematical tool in financial mathematics. Roxana gave some applications to risk measures and the pricing of American options.

## January 21st, 2014

**
Stephanie Lindsey
** (REO)

(no slides ☹)

Stephanie presented her experimental work that involves altering flow patterns early on in development and studying their effects on subsequent cardiac formation. She then explained the benefits of coupling these experiments with computational fluid dynamics. Her work aims to determine the cause of some of the more severe congenital heart defects that present themselves clinically.

**
Henning Sulzbach
** (RAP)

The probabilistic analysis of algorithms is a research area on the edge between mathematics and computer science. One of its founders, Philippe Flajolet, spent his academic career at Rocquencourt. Henning started with a short introduction to the field. Then, he presented one of the main results of his thesis (Frankfurt, 2012) concerning the complexity of certain search operations in multidimensional data structures. He provided insight to the underlying mathematical ideas without giving technical details.

## December 17th, 2013

**
Adrien Guatto
** (PARKAS)

The slides — also available in an alternate format

Adrien works on synchronous programming and told us about stream equations and nested loops. Equations over finite data streams are a classic formalism used during the design phase of embedded systems. Compilers for specialized streaming languages are able to translate such equations to low-level source code written in C, and have met a resounding success in industrial practice. He presented his ongoing work aimed at generating better code from these streaming languages.

**
Valentin Suder
** (SECRET)

Valentin works on symmetric cryptography and Boolean functions. Because of their link with digital logic, Boolean functions are fundamental in computer science. In this talk Valentin focused on cryptographic applications. We saw how Boolean functions can help to model cryptographic objects, to formulate security criteria and to analyze block ciphers. We also saw why designing Boolean functions that have the best resistance against certain attacks is a difficult task.

## November 19th, 2013

**
Simon Marmorat
** (POEMS)

Simon showed us how wave propagation can be disturbed by small heterogeneities in the medium. He explained why simulating the propagation of waves inside such media is challenging for classical numerical tools, and how the complexity and computational costs can be reduced by introducing an 'easier' approximate model.

**
Andres Gonzalez
** (CONTRAINTES)

Andres talked about experimental techniques in biology that allow tracking of gene expression in single cells over time. So far, few attempts have been made to fully exploit this data for the modeling of biological networks and cell population dynamics. Andres uses it to develop models that are able to predict gene expression at the single-cell and population level.

## October 22nd, 2013

**
Jessica Oakes
** (REO)

(no slides ☹)

Jessica talked about aerosolized therapeutics and treatments of lung diseases, e.g. lung cancer or asthma. She presented her experimental and numerical studies on rat emphysema, a chronic obstructive pulmonary disease caused by inhaling toxic particles and gases over a long period of time.

**
Rémi Varloot
** (DYOGENE)

Rémi began with a presentation of Markov chains and their applications. He then gave details concerning the study of their evolution, and commented on the difficulty of drawing random samples of the system at equilibrium. He presented two algorithms that achieve this, Markov chain Monte Carlo and coupling from the past, and concluded by introducing a variant of the latter.
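To make the coupling-from-the-past idea concrete, here is a toy sketch (my own illustration, not Rémi's material): every state of a small finite chain is run forward from further and further in the past using shared randomness, and once all trajectories coalesce, the common value is an exact sample from the stationary distribution. The two-state chain and its transition probabilities are arbitrary.

```python
import random

def cftp(update, states, seed=0):
    """Coupling from the past for a finite-state Markov chain.

    `update(state, u)` maps a state and a uniform [0,1) draw to the
    next state; shared draws couple all trajectories together.
    """
    rng = random.Random(seed)
    draws = []   # fixed randomness for times -T, ..., -1 (oldest first)
    T = 1
    while True:
        while len(draws) < T:
            draws.insert(0, rng.random())  # extend further into the past
        # run every starting state forward from time -T with the same draws
        finals = set()
        for s in states:
            x = s
            for u in draws[-T:]:
                x = update(x, u)
            finals.add(x)
        if len(finals) == 1:   # coalescence: exact stationary sample
            return finals.pop()
        T *= 2

# A two-state chain: stay in 0 w.p. 0.9, stay in 1 w.p. 0.6,
# whose stationary distribution is (0.8, 0.2).
def step(s, u):
    if s == 0:
        return 0 if u < 0.9 else 1
    return 1 if u < 0.6 else 0

sample = cftp(step, [0, 1])
```

The key trick is that the random draw attached to each past time step is fixed once and reused as the starting point moves further back; that is what makes the output an unbiased stationary sample rather than an approximate one.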

## September 17th, 2013

**
Antoine Delignat-Lavaud
** (PROSECCO)

Antoine explained how he works to prevent various kinds of web attacks and trojan programs. He presented tools and models he developed to verify web applications, and explained how they hold up against a realistic web attacker.

**
Ali Assaf
** (DEDUCTEAM)

Ali presented proof systems, tools which help to write better software. Many proof systems exist; Ali showed how translating them into a single unified proof framework can improve their interoperability.

## June 18th, 2013

**
Jonathan Protzenko
** (GALLIUM)

Jonathan gave us a controversial talk about why we need type-checking in programming languages. Drawing inspiration from an actual story, he told us the story of a person who wants to start programming but encounters a compiler error. Moving on, he introduced the audience to the life of a researcher in the field of programming languages, the theorems that they have to prove to earn a living, and the kind of results that they want to achieve.

**
Emanuele Leoncini
** (RAP)

The slides — also available in an alternate format

Emanuele told us about his exciting research work. Emanuele, when he's not busy running or swimming, works on modeling protein production inside cells. This requires the use of ∭ integrals, Poisson point processes, and advanced mathematics and biology. In his talk we learned all about bacteria and how they're being studied!

## May 21st, 2013

**
Mathieu Leconte
** (DYOGENE)

Mathieu presented two applications of generalized matching in random graphs. The first application deals with the optimization of storage in a distributed manner, in order to optimize content delivery networks, such as YouTube and more generally video-on-demand systems. The second application analyses the performance of multiple-choice hashing techniques.

**
Faten Nabli
** (CONTRAINTES)

Faten explored the structural properties of Petri nets as a means to provide information about the evolution of a biochemical system and its dynamics, especially when kinetic data are missing, making simulations impossible. In particular, the structural properties of siphons and traps were enumerated using a Boolean model, while place and transition invariants were analyzed using constraint programming.

## April 23rd, 2013

**
Senanayak Karri
** (SIERRA)

Submodular functions are a useful class of discrete set functions in areas such as economics, computer vision, machine learning, computer networks and operations research. Optimization of submodular functions lies on the boundary between the efficiently computable (P) and not efficiently computable (NP) classes of problems. Sesh gave an overview of different algorithms proposed to perform this optimization efficiently.
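As a concrete illustration of why submodular structure helps (a toy example of mine, not from the talk): for monotone submodular maximization under a cardinality constraint, the simple greedy algorithm achieves a (1 - 1/e) approximation. Maximum coverage is the classic instance, where the objective is the size of the union of the chosen sets:

```python
def greedy_max_coverage(sets, k):
    """Greedy (1 - 1/e)-approximation for maximum coverage.

    Coverage f(S) = |union of chosen sets| is monotone submodular,
    so repeatedly adding the set with the largest marginal gain
    comes within a constant factor of the optimum.
    """
    chosen, covered = [], set()
    for _ in range(k):
        best = max(range(len(sets)),
                   key=lambda i: len(sets[i] - covered))
        if not sets[best] - covered:
            break  # no remaining marginal gain
        chosen.append(best)
        covered |= sets[best]
    return chosen, covered

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 7}]
chosen, covered = greedy_max_coverage(sets, k=2)
# picks {4,5,6,7} first (gain 4), then {1,2,3} (gain 3)
```

Diminishing returns is exactly what makes the greedy choice safe: an element's marginal gain can only shrink as the chosen set grows.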

**
Joshué Perez
** (IMARA)

(no slides ☹)

In recent years, mass-produced vehicle implementations have appeared in the field of Intelligent Transportation Systems (ITS). Advanced Driver Assistance Systems (ADAS), intelligent infrastructures and autonomous driving maneuvers have significantly contributed to the deployment of intelligent systems on the road and in urban areas. Thanks to the research done by many groups and projects, it is possible to build safer and more comfortable vehicles. Some examples are the Antilock Brake System (ABS), Cruise Control (CC), automatic parking and Electronic Stability Control (ESC), among others. It is no longer utopian to think that, in the near future, autonomous vehicles will coexist and interact with conventional vehicles. In this work, different control systems for autonomous vehicles have been developed, for both individual and cooperative maneuvers in different urban and highway scenarios. Building on previous work on lateral and longitudinal controllers, different experiments have validated a modular control architecture that is independent of the vehicle and of the scenario used. To carry out these experiments, different platforms have been used: electric and gasoline-propelled vehicles in urban environments, covering straight and curved segments, blocked roads, overtaking, communication between infrastructure and vehicles, and high-speed experiments. Furthermore, this control system, based on fuzzy logic techniques and human knowledge, is easy to tune and has been extended to different maneuvers.

## March 26th, 2013

**
François Durand
** (GANG)

François gave an introduction to voting systems. Voting systems are useful not only for elections, but also in computer systems such as routing, where entities have to choose the best possible route. After trying to understand what a "good" voting system is, and whether such a system exists, we saw how the results of an election can be manipulated, and what we can do about it.
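To see why "good" is subtle (a toy example of mine, not from the talk): two perfectly reasonable rules can elect different winners from the very same ballots. Below, plurality and the Borda count disagree:

```python
from collections import Counter

def plurality(ballots):
    """Winner = candidate with the most first-place votes."""
    return Counter(b[0] for b in ballots).most_common(1)[0][0]

def borda(ballots):
    """Winner = candidate with the highest Borda score
    (a ballot ranking m candidates gives m-1, m-2, ..., 0 points)."""
    scores = Counter()
    for b in ballots:
        for points, cand in enumerate(reversed(b)):
            scores[cand] += points
    return scores.most_common(1)[0][0]

# 4 voters prefer A>B>C, 3 prefer B>C>A, 2 prefer C>B>A.
ballots = [("A", "B", "C")] * 4 + [("B", "C", "A")] * 3 + [("C", "B", "A")] * 2
```

Plurality only looks at first choices (A leads 4-3-2), while Borda rewards broad support (B is ranked first or second by every voter); deciding which answer is "right" is exactly the kind of question voting theory studies.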

**
David Benoît
** (MICMAC)

David told us about the mathematical analysis of models for a non-Newtonian fluid. Numerous problems, especially in civil and environmental engineering (rivers, avalanches, mudslides) require simulations of non-Newtonian fluids. Compared to Newtonian fluids, which can be modeled with the Navier-Stokes equations, the non-Newtonian nature brings new nonlinearities that complicate the modeling and numerical simulation of such fluids.

## February 26th, 2013

**
Cédric Pasteur
** (PARKAS)

Cédric gave us an introduction to time and concurrency in programming languages, more specifically, synchronous functional programming languages, such as ReactiveML. The talk featured several amazing demos, including one with a famous plumber from a well-known video game.

**
Nick Jagiella
** (BANG)

(no slides ☹)

Nick, after successfully defending his PhD, summed it up for us: we learned how cancer works, in particular, what kind of mechanisms come into play when modeling cancerous cell growth. Nick connected together both a macroscopic point of view and a microscopic perspective, and showed how successful modeling can help give better predictions for patients.

## January 15th, 2013

**
Gregory Arbia
** (REO)

Congenital heart defects are the leading cause of birth defects, affecting 1% of live births, and many of them require several surgeries. Modeling blood flow becomes a tool for understanding a disease, or for simulating different surgical options. Magnetic resonance imaging provides a three-dimensional geometry and flow measurements; blood pressure measurements, on the other hand, are obtained using a catheter. The goal is then to simulate the flow of blood in the discretized domain and to model the effect of the downstream structure. To this end, a coupling between the Navier-Stokes partial differential equations and ordinary differential equations is performed at each outlet of the three-dimensional domain. The issues in this study are both numerical and clinical: on the one hand it is necessary to deal with coupling instabilities, and on the other hand clinicians want a reliable representation of the blood flow in the area of interest. Results are presented for five patients who underwent a so-called Norwood procedure, connecting the systemic network to the arterial network close to the heart.

**
Jaime Gaspar
** (PI.R2)

Proof interpretations are tools in mathematical logic with many applications: for example, consistency results, unprovability results, and the extraction of computational content from proofs, to name a few. In this talk, Jaime introduced proof interpretations and presented examples of their applications, keeping it short, simple and sweet.

## December 14th, 2012

**
Francesco Santini
** (CONTRAINTES)

Semiring-based soft constraints provide a means to associate a score with the satisfaction of classical crisp constraints. In this way, it becomes possible to represent users' preferences, to solve over-constrained problems, and to optimize weighted problems in general. In this talk, Francesco introduced the general formal framework and an application to Abstract Argumentation Frameworks.

**
Benjamin Aymard
** (SISYPHE)

An ovarian follicle is a spherical shelter for the oocyte in the ovaries. The selection process ends, in the best case, with ovulation, but most of the time with atresia (degeneration). We study the selection process of ovarian follicles. At the micro scale, the model is a 2D hyperbolic system of conservation laws of size 20 by 20 with discontinuous fluxes at the interfaces. At the macro scale, we have a closed-loop control. The coupling between micro and macro is done through the moments of the solution. In order to obtain numerical simulations of such a model, we use parallelization and high-order finite volume schemes. The strategy in the computations is to use an analogy with the biology.

## November 20th, 2012

**
Thibaut Balabonski
** (GALLIUM)

Thibaut explained to us how to save time in a computation by cheating the system in order to build simple optimal evaluation strategies in any programming language.

**
Saverio Smaldone
** (REO)

Saverio told us how to split the complexity of the cardiovascular system by coupling the blood flow in the different cardiovascular compartments.

## October 16th, 2012

**
Steven Gay
** (CONTRAINTES)

Steven told us how he uses constraint programming to efficiently solve problems on graphs that model biological reactions.

**
Anne-Céline Boulanger
** (BANG)

Anne-Céline explained to us how to optimize microalgae growth, a complex problem with many parameters; algae could serve as a potential source of biofuel in the future.

## September 18th, 2012

**
Gabriel Scherer
** (GALLIUM)

Gabriel gave a very engaging talk about how real-world languages are poorly designed, and how the programming language community can help prevent design mistakes. The talk bashed various languages such as Python, JavaScript, PHP...

**
Cécile Mailler
** (LMV)

Cécile introduced us to the topic she worked on during her master's internship, namely, random boolean binary trees. She gave a very accessible introduction to the main result about these trees, and gave proof sketches using two different techniques.

## June 26th, 2012

**
Filippo Visco Comandini
** (SISYPHE)

Filippo gave us an introduction to electrical networks, and to how to detect faults in them. Like many other research topics, this is motivated by industrial concerns, especially in airplanes, where diagnosing faulty components is crucial.

**
Aliénor Burel
** (POEMS)

The slides — also available in an alternate format

Aliénor told us about wave propagation in soft materials. Modeling such phenomena involves complex mathematical equations, and a lot of numerical analysis.

## May 15th, 2012

**
Mathieu Feuillet
** (RAP)

Mathieu works on network modeling. He uses probabilities and queuing networks; there are various applications, and the one he chose to present is the study of congestion in the Internet network. His worrying talk was entitled « Is the Internet doomed to collapse? ».

**
Estelle Mbinky
** (GAMMA)

Estelle works on mesh generation algorithms. Problems include mesh refinement, and applications include various modeling situations, such as planes. She told us about error control for anisotropic physics problems.

## April 10th, 2012

**
Jérémie Lumbroso
** (ALGORITHMS)

Jérémie gave us an introduction to probabilistic algorithms. In particular, he showed us how probabilistic counting algorithms allow one to watch a flow of data and count items in real time, using far fewer bits than an exact count would require while still giving an accurate estimate. Applications include sequence counting in DNA, filtering out duplicates in search results...
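The idea Jérémie described can be sketched with the classical Flajolet-Martin probabilistic counter (a simplified illustration, not his code; the hash function and sketch count here are arbitrary choices). Each sketch only remembers which bit-ranks have been seen, so its memory stays constant no matter how long the stream is:

```python
import hashlib

def rho(v, width=32):
    """0-based position of the least significant 1-bit of v."""
    if v == 0:
        return width
    r = 0
    while v & 1 == 0:
        v >>= 1
        r += 1
    return r

def fm_estimate(stream, num_sketches=64):
    """Flajolet-Martin style estimate of the number of distinct items.

    Each sketch keeps a small bitmap of observed bit-ranks; duplicates
    hash identically, so they leave the bitmaps unchanged. Averaging
    several sketches tightens the estimate.
    """
    bitmaps = [0] * num_sketches
    for x in stream:
        for s in range(num_sketches):
            h = hashlib.blake2b(f"{s}:{x}".encode(), digest_size=4).digest()
            bitmaps[s] |= 1 << rho(int.from_bytes(h, "big"))
    total_r = 0
    for b in bitmaps:
        r = 0
        while b & (1 << r):   # least significant zero bit = FM rank R
            r += 1
        total_r += r
    # 0.77351 is Flajolet and Martin's bias-correction constant
    return 2 ** (total_r / num_sketches) / 0.77351

stream = [i % 500 for i in range(5000)]   # 5000 items, 500 distinct
estimate = fm_estimate(stream)
```

Each bitmap fits in a machine word, so 64 sketches use a few hundred bytes total, yet the estimate typically lands within 10-15% of the true distinct count.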

**
Alexandre Imperiale
** (MACS)

Alexandre works on heart modeling. He told us about his favorite pig, how he modeled its heart, and then made a barbecue out of it. There are various problems involved in heart modeling: one of them is data assimilation, that is, finding parameters so that the simulation matches the observations as closely as possible.

## March 20th, 2012

**
Luna Dimitrio
** (BANG)

Luna works on p53, a super-powerful protein. When the cell is shocked, p53 is able to suspend cellular division while the cell heals; if the cell is damaged beyond repair, p53 will kill the cell. Luna explained the different feedback mechanisms and how, by introducing a spatial aspect into her differential equations, she was able to accurately model the oscillations in the concentration of p53.

**
Pawan Goyal
** (SANSKRIT)

Pawan works on text analysis: he told us how, using various techniques, he was able to improve the quality of information retrieval in various documents, and improve text summarization as well. He described the various challenges in extracting relevant extracts from documents.

## February 14th, 2012

**
Amel Bennaceur
** (ARLES)

Amel's work is about making different, apparently incompatible systems talk together. This is particularly useful in emergency response situations, where you have to make different entities work together. By reasoning formally on what different systems expect, she is able to dynamically generate glue that ties these different systems together.

**
Jannis Uhlendorf
** (CONTRAINTES)

Jannis' work is about proteins, and how to engineer them so that they behave differently. By using a control-command system, he is able to control the development of these proteins.

## January 17th, 2012

**
Karim Drifi
** (CLIME)

Karim works in climate modeling; he tries to predict the velocity of fluids, such as the sea, using imagery data, e.g. temperature measurements obtained from a satellite. These models require tremendous computing power, and his job is to invent new techniques so that the modeling becomes tractable.

**
Yanli Guo
** (SMIS)

Yanli works with embedded databases in smart cards. The very specific constraints of such chips require sophisticated algorithms to provide cryptographic guarantees while still coping with the limited capabilities of the chips, and this is what Yanli's work is about.

## December 20th, 2011

**
Émilie Coupechoux
** (TREC)

Émilie told us how she uses graphs to model epidemics propagation; how the presence of clusters, or communities, in graphs, allows one to better understand how epidemics propagate; and how this relates to real-life situations.

**
Cristóbal Bertoglio
** (REO)

Cristóbal introduced us to fluid-structure interactions, and to how we can use them to measure artery stiffness: this has many applications, especially in diagnosing and measuring congenital heart diseases.

## November 15th, 2011

**
Christina Boura
** (SECRET)

Christina introduced the audience to cryptographic hash functions, their properties, and what makes a hash function a good one.

**
William Weens
** (BANG)

The slides — also available in an alternate format

William Weens talked about liver cancer, and how to use an agent-based model to reason about cell interactions, which may lead to a better understanding of the various cancer types.

## October 18th, 2011

**
Thierry Martinez
** (CONTRAINTES)

Thierry Martinez discussed constraint logic programming, and illustrated his talk with the Sudoku game as an example.

**
Daniele Trimarchi
** (MACS)

(no slides ☹)

Daniele Trimarchi gave a talk about how to model difficult physical phenomena, such as wind in yacht sails, using advanced modeling techniques.

## September 20th, 2011

**
Marc Mezzarobba
** (ALGORITHMS)

Marc gave a talk on computer-assisted symbolic computations on mathematical functions.

**
Alexandre Pilkiewicz
** (GALLIUM)

Alexandre gave a talk about formally verifying the compilation process using the CompCert verified compiler, and why certified compilation is important in life.