# BARBIER-CHEBBAH Alex

École Normale Supérieure - PSL

alex.barbier-chebbah [at] pasteur.fr

## Short bio

PhD at Sorbonne University

## Research project

Multi-Armed Bandit model.

## Short abstract

My main research interests are random-walk theory and sequential learning, with a focus on their connections to decision-making tasks in complex environments. We combine statistical physics, Bayesian inference, information theory, and numerical simulation both to probe learning procedures in insect behavior and to design lightweight algorithms able to mimic such procedures. In particular, building on infotaxis methods, we develop a new class of multi-armed bandit (MAB) algorithms to achieve optimal performance at all timescales of the sequential learning procedure.
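To make the multi-armed bandit setting concrete, here is a minimal sketch of a standard Bayesian bandit solved by Thompson sampling on Bernoulli arms. This is a generic textbook baseline, not the infotaxis-based algorithms described above; the reward probabilities and horizon are illustrative assumptions.

```python
import random

def thompson_sampling(true_probs, horizon, seed=0):
    """Bernoulli multi-armed bandit solved by Thompson sampling.

    Each arm keeps a Beta(successes + 1, failures + 1) posterior; at each
    step we draw one sample from every posterior, pull the arm with the
    highest sample, and update that arm's counts with the observed reward.
    """
    rng = random.Random(seed)
    k = len(true_probs)
    wins, losses, pulls = [0] * k, [0] * k, [0] * k
    for _ in range(horizon):
        samples = [rng.betavariate(wins[a] + 1, losses[a] + 1) for a in range(k)]
        arm = max(range(k), key=samples.__getitem__)
        reward = 1 if rng.random() < true_probs[arm] else 0
        wins[arm] += reward
        losses[arm] += 1 - reward
        pulls[arm] += 1
    return pulls

# Over time, pulls concentrate on the best arm (p = 0.8 here).
counts = thompson_sampling([0.2, 0.5, 0.8], horizon=2000)
```

The Beta posterior makes exploration automatic: poorly sampled arms have wide posteriors and occasionally produce the highest draw, so they keep being tested until the evidence against them accumulates.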

There are two major challenges in developing the theory of modern machine learning algorithms. The first relates to the high dimensionality of the data, the huge number of samples, and the very large number of parameters to estimate. To study physical systems formed by a very large number of degrees of freedom (particles, spins, ...), statistical physicists have developed a large body of tools, methods, and ideas for studying and characterizing high-dimensional probability distributions and stochastic processes. One of the major research lines in the PRAIRIE groups working at the interface of statistical physics and machine learning consists in leveraging these statistical-physics methods to analyze the performance and the training dynamics of machine learning algorithms in the high-dimensional limit.

The second challenge is to characterize the structure of the data and its interplay with the architecture of neural networks, which makes it possible to circumvent the curse of dimensionality and obtain good performance. PRAIRIE groups combine methods and ideas developed in computer science (wavelet theory) and in physics (the renormalization group) to tackle this major problem. A third line of research consists in applying modern machine learning methods to statistical-physics problems, in particular to accelerate dynamical simulations and to characterize complex physical systems such as glasses and amorphous solids.

# VESTERGAARD Christian

Computational Neuroscience

christian.vestergaard [at] pasteur.fr

## Short bio

Christian Vestergaard is a CNRS researcher in the Decision and Bayesian Computation lab at the Pasteur Institute. He holds a PhD in theoretical physics and biophysics from the Technical University of Denmark (2012).

## Topics of interest

Network modeling, neuroscience, graph learning, statistical physics.

## Project in Prairie

Christian Vestergaard’s research focuses on linking the complex topology of the neural networks that make up an animal’s brain to how it computes and generates behavior. He develops statistical and computational methods drawing inspiration from graph theory, statistical physics, information theory, and machine learning. This project aims to provide new approaches to artificial neural network design and optimization.

## Quote

Universal coding theorems show that a multitude of different neural architectures can be used to represent any function. Thus, the intricate architecture of biological neural networks probably determines not what they can learn, but rather how they encode information in order to provide good inductive biases that enable robust and efficient learning. Focusing on small animals, such as the *Drosophila* larva, whose neural wiring has been mapped at full resolution and whose neurons can be individually controlled in freely behaving animals, will allow us to link the structures of neural microcircuits to their functions. This will help uncover how biological neural networks differ from artificial neural networks and may provide inspiration for more efficient deep learning architectures.

## Team


BARBIER-CHEBBAH Alex

Postdoctoral researcher

# BARRÉ Chloé

chloe.barre [at] pasteur.fr

## Short bio

PhD, LPTMC (Laboratoire de Physique Théorique de la Matière Condensée), Sorbonne University, Paris

## Research project

Bayesian induction of the behavior of the larva.

## Short abstract

Making decisions is a fundamental feature of animal behavior. Nevertheless, there remains a large knowledge gap in linking neural architecture and behavioral response. To bridge this gap, targeting individual neurons and having a simple read-out of their activity is crucial, and *Drosophila* larvae are ideal organisms for such an approach. My work is part of a larger project to explore the relationship between neural network dynamics and decision making in *Drosophila* larvae. I use Bayesian induction techniques and physical modeling to understand this relationship.

By combining video measurement experiments of larval behavior with advances in modern optogenetics that allow the activation/inactivation of individual neurons, a database of millions of larvae responding to the activation of single neurons has been constructed. Although a machine learning approach that projects larval videos into complex behavioral dictionaries has been developed, some images remain ambiguous and the corresponding behavior is therefore poorly detected. To improve behavior detection, we describe the shape of the larva using insights from solid mechanics. Using this physical model, we perform Bayesian induction to find parameters that describe the behavior of the larvae in a more robust way.

Once the behaviors are properly detected and quantified, we want to detect all possible responses and modulations induced by the activation or inactivation of a neuron. We have written a simplified model that describes the dynamics and sequences of behaviors. With Bayesian inference I learn the parameters of my model, and with a generative model and these parameters I can recreate virtual larvae. These virtual larvae make it possible to separate neural responses into those provoking simple and immediate actions and those generating complex behaviors. It is thus possible to group neurons in terms of their responses.
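The fit-then-generate loop described above can be sketched in miniature. The behavioral states below are hypothetical placeholders, and the model is deliberately reduced to a first-order Markov chain over discrete behaviors; the Dirichlet pseudocount stands in for the Bayesian prior, so the learned transition probabilities are posterior means rather than raw frequencies.

```python
import random

STATES = ["crawl", "turn", "stop"]  # hypothetical behavioral dictionary

def fit_transitions(sequences, pseudocount=1.0):
    """MAP-style estimate of a Markov transition matrix over behaviors.

    A symmetric Dirichlet(pseudocount) prior on each row gives the
    posterior-mean estimate (count + pseudocount) / row_total, which
    avoids zero probabilities for transitions never observed.
    """
    counts = {s: {t: pseudocount for t in STATES} for s in STATES}
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {s: {t: counts[s][t] / sum(counts[s].values()) for t in STATES}
            for s in STATES}

def generate(trans, start, length, seed=0):
    """Sample a 'virtual larva': a behavior sequence from the fitted model."""
    rng = random.Random(seed)
    seq = [start]
    for _ in range(length - 1):
        row = trans[seq[-1]]
        seq.append(rng.choices(STATES, weights=[row[t] for t in STATES])[0])
    return seq
```

Comparing statistics of such generated sequences before and after a simulated neural perturbation is one simple way to ask whether an intervention changes immediate actions or longer behavioral programs.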

By combining the techniques of biologists with probabilistic analysis techniques (including Bayesian inference), we can identify behavioral changes due to the activation/inactivation of neurons. This will allow us to infer causal relationships between neural activity and behavioral patterns, and to uncover how behavior emerges from activity in the connectome.

# PELLEGRINI Franco

École Normale Supérieure - PSL

franco.pellegr [at] gmail.com

## Short bio

PhD in Condensed Matter physics from SISSA, Trieste, Italy

## Research Project

Theory of neural network learning dynamics.

## Short abstract

We want to describe the general mechanism that makes neural networks so effective in solving real life problems. We aim to use methods from statistical physics to build a model describing the dynamics of neural network parameters during training. We hope this insight will allow us to develop new training algorithms leading to improved networks.

# OZAWA Misaki

École Normale Supérieure - PSL

misaki.ozawa2045 [at] gmail.com

## Short bio

PhD University of Tsukuba

## Research project

Multiscale physics and wavelet transform.

## Short abstract

In physics, multiscale phenomena are captured by the renormalization group. Wavelet transform is useful in analyzing multiscale behaviors. We investigate the relation between the renormalization group and the wavelet transform. Then we wish to obtain insight into how features are extracted hierarchically in neural networks.
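The analogy between wavelets and renormalization-group coarse-graining can be illustrated with the simplest wavelet, the Haar transform: pairwise averages give the coarse-grained signal (one blocking step), while pairwise differences record the detail discarded at that scale. This is a standard construction, shown here only as a sketch of the multiscale decomposition the abstract refers to.

```python
def haar_step(signal):
    """One level of the Haar wavelet transform.

    Averages of neighboring pairs give the coarse-grained signal
    (analogous to one renormalization-group blocking step); the
    differences store the detail lost at this scale.
    """
    approx = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def haar_decompose(signal, levels):
    """Iterate the blocking step to build a multiscale hierarchy."""
    details = []
    for _ in range(levels):
        signal, d = haar_step(signal)
        details.append(d)
    return signal, details

coarse, details = haar_decompose([1, 3, 2, 2, 4, 0, 1, 1], levels=3)
```

Each iteration halves the resolution, so the detail coefficients at successive levels describe fluctuations at successively larger scales, much like successive RG steps.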

# BENICHOU Alexis

alexis.benichou [at] pasteur.fr

## Short bio

Master in Physics of Complex Systems at Université de Paris

## Thesis title

Structural Basis of Neural Computation and Behavior.

## Short abstract

Behavior and decision-making are determined by physical processes taking place in the complex environment of the nervous system. During the last 3-5 years, breakthroughs in experimental techniques have made it possible to map the wiring diagram (the physical connectome) of the full central nervous system of simple model organisms, and of portions of the brain for higher animals, at the level of single synapses. Moreover, we can now control the activity of individual neurons in live animals through optogenetic stimulation and record the resulting behavior. Together, this offers the occasion to reverse engineer the physical basis of behavior. The current project aims to leverage simultaneous access to the wiring of the central nervous system of the Drosophila melanogaster larva and to existing data from large-scale behavior screens, which record the effect of individually activating or silencing the majority of the larva’s neurons, in order to address the following question: What constraints does the structure of the connectome impose on an organism’s capability to process information and encode behavior? Specifically, it will combine modern methods from machine learning and network science to investigate how the circuitry of the Drosophila larva’s nervous system influences the way it encodes behavior.
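The simplest structural constraint a wiring diagram imposes is reachability: a neuron can only influence behaviors driven by neurons it can reach through synapses. The toy connectome and neuron names below are entirely hypothetical, meant only to illustrate this kind of network-science query on a directed wiring diagram.

```python
from collections import deque

# Hypothetical toy wiring diagram: neuron -> downstream synaptic partners.
CONNECTOME = {
    "sensory_A": ["inter_1"],
    "sensory_B": ["inter_1", "inter_2"],
    "inter_1": ["motor_X"],
    "inter_2": ["inter_1"],
    "motor_X": [],
}

def downstream(connectome, source):
    """Neurons reachable from `source` by following synapses (BFS).

    If a motor neuron is not in this set, no pattern of activity in
    `source` can drive it through the wiring alone.
    """
    seen, queue = {source}, deque([source])
    while queue:
        for nxt in connectome[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {source}
```

Richer questions, such as how many independent pathways link a sensory neuron to a motor output, build on the same directed-graph representation.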

# BIROLI Giulio

giulio.biroli [at] ens.fr

## Short bio

Professor of theoretical physics at ENS Paris (2018-). Research director at IPhT CEA (2002-2018). Associate professor at the École Polytechnique (2010-2015). Director of the ICFP Master (2019-), director of the Beg Rohu Summer School (2008-). Editor in Chief of the Journal of Statistical Physics. PI of the Simons collaboration «Cracking The Glass Problem», ERC Consolidator Grant 2011, Prix d’Aumale 2018, Young Scientist award in statistical physics 2007.

## Topics of interest

Physics, machine learning, complex systems

## Project in Prairie

Giulio Biroli will create and promote a strong synergetic activity combining physics and machine learning. He will develop a physically based theoretical approach to deep learning and apply machine learning methods to analyse complex dynamics of physical systems. He will organize a reference annual summer school-workshop on machine learning and physics.

## Quote

Methodological and conceptual progress made in recent decades in statistical physics and probability theory, in particular on the analysis of high-dimensional random landscapes, on high-dimensional out-of-equilibrium dynamics, and on disordered systems, provides theoretical frameworks to tackle difficult challenges in AI. At the same time, AI offers new ways of studying physical systems. This is just the right time to build on these concurrent opportunities and develop a strong synergy combining physics and machine learning.

## Team


OZAWA Misaki

Postdoctoral researcher

PhD University of Tsukuba

D’ASCOLI Stéphane

PhD student

Master in Theoretical Physics, ENS Paris

BACHTIS Dimitrios

Postdoctoral researcher

PhD, Swansea University