SCHAIPP Fabian

Postdoctoral researcher

Inria

fabian.schaipp [at] inria.fr

Short bio

PhD, Technical University of Munich

Research project

Robust and adaptive training algorithms.

Short abstract

Training machine learning models amounts to solving stochastic optimization problems at scale. My research focuses on understanding and improving optimization algorithms, particularly with the aim of designing robust and easily tunable methods.

MARTIN Simon

PhD student

Inria

simon.martin [at] inria.fr

Short bio

  • Master in Probability and Statistics (Université d’Orsay)
  • Master in Soft Matter Physics (ICFP)

Thesis title

Energy Landscapes and Dynamics of Deep Neural Networks.

Short abstract

Despite the breakthroughs of machine learning in the past decades, the theory behind neural networks and their learning dynamics remains poorly understood compared to their practical achievements in various domains. One promising approach is to rely on the strong analogy between the behaviors of physical disordered systems and deep neural networks. The goal of this PhD is to combine tools from applied mathematics and statistical physics in order to solve high-dimensional optimization problems. More precisely, the student will focus on the learning dynamics of deep neural networks using methods inspired by spin-glass theory.

ZARHALI Othmane

PhD student

Université Paris Dauphine – PSL | CNRS

othmane.zarhali [at] dauphine.eu

Short bio

  • Ecole des Mines de Nancy, Engineering degree, major: Applied mathematics
  • Ecole Polytechnique / Sorbonne Université, MSc in Probability and Finance (formerly DEA El Karoui)

Thesis title

Multifractal and rough volatility processes in statistical finance. Link with market microstructure.

Short abstract

The log S-fBm model, which reconciles rough volatility models and multifractal volatility models, has laid the foundations for estimation methods common to both universes. Its applications and challenges are the focus of this thesis.
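
As a schematic illustration (the notation is illustrative and not the exact parametrization studied in the thesis), such models couple a price process with a log-Gaussian volatility,

    \[
      dX_t = \sigma_t \, dB_t, \qquad \sigma_t^2 = \sigma^2 \, e^{\omega_t},
    \]

where $\omega$ is a stationary Gaussian process behaving at small scales like a fractional Brownian motion with Hurst exponent $H$: a small $H > 0$ yields rough volatility, while the limit $H \to 0$ (with a suitable rescaling of the variance) recovers log-normal multifractal volatility.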

DUBOIS-TAINE Benjamin

PhD student

Inria

benjamin.paul-dubois-taine [at] inria.fr

Short Bio

  • Bachelor in Computer Science – McGill University
  • Master 2 – Université Paris-Saclay

Thesis title

Continuous Optimization and applications.

Short abstract

We study continuous optimization problems in various forms, from smooth first-order methods to relaxations of combinatorial problems, with a strong focus on theoretical guarantees. Applications to satellite imagery are also explored.

BONNAIRE Tony

Machine learning and statistical physics

École Normale Supérieure - PSL

tony.bonnaire [at] ens.fr

Short bio

PhD in Astrophysics at Université Paris-Saclay

Research project

Machine learning and statistical physics.

Short abstract

My current research focuses on understanding the dynamics of simple neural networks, and in particular how gradient descent can achieve good generalization in high-dimensional, rough, non-convex landscapes, especially when initialized randomly. For this purpose, I use methods from theoretical physics, and more precisely from the statistical physics of disordered systems, to obtain asymptotic success conditions for these methods and to study the topological properties of the random landscapes.

Machine learning and optimization

Machine learning is the core algorithmic component behind recent successes in artificial intelligence, relying on training models with vast amounts of data. All major subareas of machine learning are represented within the Prairie Institute, including supervised, unsupervised, and reinforcement learning. Our research extends to new algorithm development and the analysis of their theoretical guarantees, as well as representational issues specific to various data types like text and images.

Within machine learning, optimization plays a critical role, as most modern formulations culminate in optimization problems. Our research prioritizes convex optimization algorithms, with particular emphasis on stochastic and distributed algorithms. We also delve into non-convex optimization, pertinent to large-scale models such as neural networks, as well as challenges where the curse of dimensionality is inevitable.
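
As a concrete, minimal illustration of the stochastic algorithms mentioned above (a generic sketch, not code from any Prairie project), mini-batch stochastic gradient descent on a least-squares objective looks as follows:

    import numpy as np

    def sgd_least_squares(X, y, lr=0.1, batch_size=32, epochs=20, seed=0):
        """Minimal mini-batch SGD for min_w (1/2n) * ||Xw - y||^2."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(epochs):
            perm = rng.permutation(n)
            for start in range(0, n, batch_size):
                idx = perm[start:start + batch_size]
                grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)  # stochastic gradient
                w -= lr * grad
        return w

    # Tiny synthetic check: the estimate should be close to [1, 2, 3, 4, 5]
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 5))
    y = X @ np.arange(1.0, 6.0) + 0.01 * rng.standard_normal(200)
    print(sgd_least_squares(X, y))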

BRIANCEAU Camille

Engineer

ICM Institute

camille.brianceau [at] icm-institute.org

Short bio

  • Master’s degree (Diplôme d’ingénieur), Institut d’Informatique d’Auvergne (ISIMA)
  • Master’s degree in imaging and technology for medicine (Université Clermont Auvergne)

Research project

ClinicaDL

Short abstract

ClinicaDL is an open-source software package for deep learning on neuroimaging data. My work consists of extending this software with new features and with the community’s standard deep learning tools, and of providing support to PhD students and researchers.

MAIER Jakob

PhD student

Inria

jakob.maier [at] inria.fr

Short bio

Bachelor of Science in Mathematics: Technical University of Munich

Master of Science in Statistics and Probability: Ecole Polytechnique (M1) and Université Paris Saclay (M2)

Thesis topic

Efficient algorithms for information extraction on graphs.

Short abstract

We examine algorithms that extract information from a given graph, which may come from an application or from random sampling. The information is typically obtained in the form of statistical, algebraic, or combinatorial invariants and serves several applications: detection of a latent geometric structure, alignment of two graphs, or community identification. The main objective is to obtain theoretical guarantees for these algorithms while ensuring that they can be executed efficiently.
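
As one illustration of this family of algorithms (a generic sketch, not the methods developed in the thesis), spectral partitioning recovers two communities from the second eigenvector of the adjacency matrix:

    import numpy as np

    def spectral_two_communities(A):
        """Split an undirected graph (adjacency matrix A) into two groups using
        the sign of the eigenvector of the second largest eigenvalue."""
        _, eigvecs = np.linalg.eigh(A)      # symmetric eigendecomposition, ascending eigenvalues
        v2 = eigvecs[:, -2]                 # eigenvector of the second largest eigenvalue
        return (v2 > 0).astype(int)         # community labels in {0, 1}

    # Toy stochastic block model: two blocks of 50 nodes, denser inside than across
    rng = np.random.default_rng(0)
    n, p_in, p_out = 50, 0.5, 0.05
    P = np.block([[np.full((n, n), p_in), np.full((n, n), p_out)],
                  [np.full((n, n), p_out), np.full((n, n), p_in)]])
    A = (rng.random((2 * n, 2 * n)) < P).astype(float)
    A = np.triu(A, 1); A = A + A.T          # symmetrize, remove self-loops
    labels = spectral_two_communities(A)
    print(labels[:n].mean(), labels[n:].mean())   # close to 0 and 1 (or 1 and 0) on success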

BEUGNOT Gaspard

PhD student

Inria

gaspard.beugnot [at] inria.fr

Short bio

Ecole Polytechnique – MVA

Thesis topic

Non-convex optimization and learning theory with kernel methods.

Short abstract

Kernel methods are a versatile tool for studying the statistical properties of a vast category of learning algorithms. On the one hand, we aim at understanding the generalization properties of neural networks, which enables the design of new and more efficient learning routines. On the other hand, we tackle non-convex optimization problems through kernel sums-of-squares.
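
For readers unfamiliar with kernel methods, here is a minimal kernel ridge regression sketch (illustrative only, and unrelated to the kernel sum-of-squares machinery of the thesis):

    import numpy as np

    def rbf_kernel(X1, X2, gamma=1.0):
        """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
        sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq_dists)

    def kernel_ridge_fit_predict(X_train, y_train, X_test, lam=1e-3, gamma=1.0):
        """Solve (K + lam * n * I) alpha = y, then predict with the kernel evaluations."""
        n = len(X_train)
        K = rbf_kernel(X_train, X_train, gamma)
        alpha = np.linalg.solve(K + lam * n * np.eye(n), y_train)
        return rbf_kernel(X_test, X_train, gamma) @ alpha

    # Toy 1-D regression: predictions should approximate sin at the test points
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
    X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
    print(kernel_ridge_fit_predict(X, y, X_test))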

NOUIRA Asma

PhD student

Mines ParisTech

asma.nouira [at] mines-paristech.fr

Short bio

  • Master’s degree, National Engineering School of Sousse, Tunisia
  • Engineering degree, National Engineering School of Sousse, Tunisia

Thesis topic

Stable feature selection in multi-locus Genome Wide Association Studies.

Short abstract

Our main goal is to provide a stable framework for Genome-Wide Association Studies using machine learning, essentially feature selection models that can deal with high-dimensional data. Many challenges lie ahead, such as genetic population stratification, clustering of linkage disequilibrium patterns, the stability of the selection, and computational complexity. We aim to address these issues by developing efficient algorithms applied to real data from case-control studies, such as breast cancer.
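
As a hedged sketch of what stability of the selection can mean in practice (an illustration built on scikit-learn's Lasso, not the methods developed in the thesis), one can refit a sparse model on random subsamples and keep the features that are selected consistently:

    import numpy as np
    from sklearn.linear_model import Lasso

    def selection_frequencies(X, y, alpha=0.05, n_subsamples=50, frac=0.5, seed=0):
        """Fraction of random subsamples on which each feature gets a nonzero Lasso weight."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        counts = np.zeros(d)
        for _ in range(n_subsamples):
            idx = rng.choice(n, size=int(frac * n), replace=False)
            coef = Lasso(alpha=alpha, max_iter=10000).fit(X[idx], y[idx]).coef_
            counts += (coef != 0)
        return counts / n_subsamples

    # Toy example: only the first 3 of 50 features are truly associated with y
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 50))
    y = X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2] + 0.5 * rng.standard_normal(200)
    freq = selection_frequencies(X, y)
    print(np.where(freq > 0.8)[0])   # stable features, ideally [0, 1, 2]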

GOYENS Florentin

Postdoctoral researcher

Dauphine-PSL

florentin.goyens [at] dauphine.psl.eu

Short bio

PhD in mathematics at the University of Oxford

Research topic

Continuous optimization.

Short abstract

Most of my research is related to continuous nonconvex optimization. I am particularly interested in constrained optimization problems, such as optimization over smooth manifolds, and in second-order methods. I consider applications in numerical analysis and machine learning.

VINCENT Louis

PhD Student

Université de Paris / Inria / Inserm / Implicity (CIFRE thesis)

louis.vincent [at] implicity.fr

Short bio

  • Master 2 – Mathématiques, Vision & Apprentissage (ENS Paris-Saclay)
  • Master 2 – Statistiques (Sorbonne Université – Campus Pierre et Marie Curie)

Thesis title

Longitudinal data encoding applied to medical decision support in telecardiology.

Short abstract

In telecardiology, as in many other fields of modern medicine, we have at our disposal large amounts of data describing the evolution of a patient. These data are often missing or corrupted, and data from several sources can be of different natures, which makes their exploitation difficult.
My goal is to develop a model capable of synthesizing different types of temporal data via auto-encoders in order to infer the state of a patient. In the context of telecardiology, this could for instance allow us to predict deteriorations of a patient’s health status, and thus to anticipate and prevent more serious complications.
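
A minimal sketch of the kind of auto-encoder involved (a generic PyTorch example with hypothetical dimensions, not the actual model developed in the thesis): an encoder compresses a fixed-length window of measurements into a low-dimensional state, and a decoder reconstructs the window from it.

    import torch
    import torch.nn as nn

    class WindowAutoencoder(nn.Module):
        """Encode a window of T time steps with F features into a small latent state."""
        def __init__(self, n_timesteps=30, n_features=4, latent_dim=8):
            super().__init__()
            d_in = n_timesteps * n_features
            self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(d_in, 64), nn.ReLU(),
                                         nn.Linear(64, latent_dim))
            self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                         nn.Linear(64, d_in))

        def forward(self, x):                    # x: (batch, T, F)
            z = self.encoder(x)                  # latent "patient state"
            recon = self.decoder(z).view_as(x)   # reconstructed measurement window
            return recon, z

    model = WindowAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(16, 30, 4)                   # dummy batch of measurement windows
    for _ in range(5):                           # a few reconstruction steps on the dummy batch
        recon, _ = model(x)
        loss = nn.functional.mse_loss(recon, x)
        optimizer.zero_grad(); loss.backward(); optimizer.step()
    print(loss.item())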

MISHCHENKO Konstantin

Postdoctoral researcher

Inria

konsta.mish [at] gmail.com

Short bio

PhD from KAUST, supervised by Peter Richtarik

Research topic

Optimization for machine learning.

Short abstract

I design new optimization algorithms for machine learning and study their convergence. I am particularly interested in stochastic methods, adaptivity, distributed training, and federated learning.

COHEN-SOLAL Quentin

Postdoctoral researcher

Université Paris Dauphine-PSL

quentin.cohen-solal [at] dauphine.psl.eu

Short bio

PhD at the University of Caen

Research topic

Reinforcement learning in games.

Short abstract

This postdoc focuses on the study and improvement of learning and planning algorithms in games.

EVEN Mathieu

PhD student

Inria / ENS

mathieu.even [at] inria.fr

Short bio

M2 Orsay/Paris-Saclay

Thesis title

On Federated and Distributed Learning Problems.

Short abstract

We study federated and distributed learning problems, with a strong emphasis on theoretical guarantees. A (possibly large) number of agents aim at making predictions (supervised or unsupervised learning). To what extent can they benefit from collaborating with each other, depending on the learning problem and the communication constraints?
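
To make the collaboration question concrete, here is a minimal federated-averaging sketch on a least-squares problem (the setup is illustrative and not that of the thesis): each agent runs a few local gradient steps, and a server averages the resulting models at every communication round.

    import numpy as np

    def local_update(w, X, y, lr=0.05, local_steps=10):
        """One agent refines the shared model on its own data."""
        w = w.copy()
        for _ in range(local_steps):
            w -= lr * X.T @ (X @ w - y) / len(y)
        return w

    def fedavg(datasets, d, rounds=50):
        """The server averages the locally updated models after every round."""
        w = np.zeros(d)
        for _ in range(rounds):
            local_models = [local_update(w, X, y) for X, y in datasets]
            w = np.mean(local_models, axis=0)    # aggregation step
        return w

    # Toy setup: 5 agents share the same ground-truth model, each with few samples
    rng = np.random.default_rng(0)
    w_true = rng.standard_normal(10)
    datasets = []
    for _ in range(5):
        X = rng.standard_normal((20, 10))
        datasets.append((X, X @ w_true + 0.1 * rng.standard_normal(20)))
    print(np.linalg.norm(fedavg(datasets, d=10) - w_true))   # small if collaboration helps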

SIMSEKLI Umut

Inria

umut.simsekli [at] inria.fr

Short bio

I am a Research Faculty member (Chargé de Recherche) in the Inria SIERRA team and the Computer Science Department of École Normale Supérieure. I received my PhD from the Computer Engineering Department of Boğaziçi University in 2015. From 2016 to 2020, I was an associate professor at Télécom Paris, Institut Polytechnique de Paris, and in 2019-2020 I spent one year as a visiting faculty member in the Department of Statistics at the University of Oxford.

Topics of interest

Theory of Deep Learning, Markov Chain Monte Carlo.

Project in Prairie

My main research area is machine learning, including theory, algorithms, and applications. My ultimate goal has been to develop theory that closely follows practice and leads to methods that have impact on real problems. In particular, I have been interested in (i) analyzing deep learning methods from the lens of heavy-tailed dynamical systems theory, and (ii) implicit generative modeling algorithms by using tools from computational optimal transport.

Quote

Machine learning is a fascinating field, which continuously generates exciting and quite nontrivial theoretical, practical, and even societal/ethical questions. Attacking the questions from all these aspects simultaneously, the partial solutions offered by machine learning researchers have only posed additional questions and revealed many interesting and sometimes surprising “phenomena”. With such puzzling observations, I believe the machine learning of today should be treated as a natural science, rather than an engineering science, with many mysteries to be discovered and with many potential outcomes that might outreach its apparent scope.

Team

SCHAIPP Fabian, Postdoctoral researcher

ROYER Clément

Dauphine - PSL

clement.royer [at] dauphine.psl.eu

Short bio

Clément Royer is an associate professor of computer science at Université Paris Dauphine-PSL and a researcher in the MILES team at LAMSADE. From 2016 to 2019, he was a postdoctoral research associate at the Wisconsin Institute of Discovery, University of Wisconsin-Madison, USA. He received his Ph.D. in applied mathematics from the University of Toulouse, France, in 2016. Clément is a recipient of the COAP Best Paper Prize for 2019.

Topics of interest

Numerical optimization, Optimization for machine learning, Randomized algorithms.

Project in Prairie

As the amount of data available and the complexity of the models keep increasing, a number of issues arise in deploying optimization techniques for artificial intelligence at scale. Such challenges have long been integrated in high-performance computing, where the combination of optimization with other fields from numerical linear algebra to differential equations has led to powerful algorithms. This project aims at adopting a similar approach with optimization methods for data science.

Quote

My research aims at developing optimization methods for artificial intelligence that leverage existing methodology and advances from scientific computing along two axes. On one hand, we motivate the use of standard algorithmic frameworks for scientific computing in modern learning tasks by proposing practical schemes with complexity guarantees. Our research will aim at analyzing the complexity of classical second-order methods used in scientific computing so as to design frameworks with theoretical grounds and practical appeal for artificial intelligence. On the other hand, we develop derivative-free algorithms for automated parameter tuning of complex data science models. Our setting will be that of expensive, black-box systems for which a number of parameters require calibration.
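
As one hedged illustration of derivative-free parameter tuning (using SciPy's Nelder-Mead method rather than the algorithms developed in this project), the hyperparameters of a black-box system can be calibrated from function values alone:

    import numpy as np
    from scipy.optimize import minimize

    def validation_error(log10_params):
        """Stand-in for an expensive black-box system: only function values are available."""
        log_lr, log_reg = log10_params
        return (log_lr + 2.0) ** 2 + (log_reg + 4.0) ** 2 + 0.1

    x0 = np.array([-1.0, -2.0])                  # log10 of the initial (learning rate, regularization)
    result = minimize(validation_error, x0, method="Nelder-Mead")
    print(10.0 ** result.x)                      # close to (1e-2, 1e-4) for this toy objective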

Team

GOYENS Florentin, Postdoctoral researcher


DO Virginie

PhD Student

Dauphine - PSL

virginie.do [at] dauphine.eu

Short bio

MSc in Applied Mathematics / Diplôme d’Ingénieur – Ecole Polytechnique

MSc in Social Data Science – University of Oxford

Thesis title

Fairness in machine learning: insights from social choice.

Short abstract

Designing fair algorithms has recently emerged as a major issue in machine learning, and more generally in AI, whereas it has long been studied in economics, especially in social choice theory. My goal is to bring together the notions of fairness from the two communities, and to leverage the concepts and mathematical tools of social choice to address the new challenges of fairness in machine learning.

MISHRA Shrey

PhD student

École Normale Supérieure - PSL

Shrey.Mishra [at] ens.fr

Short bio

  • Manipal University (India, BTech)
  • CESI School of Engineering (engineering degree, software major)
  • Munster Technological University (MSc Artificial Intelligence)

Thesis title

Extracting information from published scientific articles and building a knowledge base from it, using various AI and machine learning techniques.

Short abstract

Every year, thousands of scientific papers are published in academia, covering scientific proofs, theorems, and relations, usually in the form of a PDF document. I work on TheoremKb (a project led by Pierre Senellart) to extract information from scientific articles, training machine learning models to identify and relate documents based on the information expressed in each article (including mathematical proofs).

ZHOU Anqi

PhD Student

Institut Pasteur

anqi.zhou [at] pasteur.fr

Short bio

  • BSc in Applied Mathematics, BA in Neuroscience
  • MSc in Biotechnology, Brown University, USA

Thesis title

Rapidly identifying therapeutics for Alzheimer’s Disease using millions of Drosophila larvae and amortized inference.

Short abstract

Alzheimer’s Disease (AD) affects millions of people worldwide, yet the limited available treatments address only the physiological symptoms rather than the cause of pathogenesis. The goal of this PhD project is to establish a new pipeline for measuring AD phenotypes that leverages the advantages of Drosophila as a model system for circuit studies and links probabilistic behavior to disease progression. The pipeline builds on automated machine learning to rapidly analyze data from millions of larvae.