Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

Abstract

The aim of this opening lecture is to raise awareness of the mathematical modeling of complex phenomena, and to explain why we simulate such phenomena and what we can expect from these simulations. The notion of using mathematical models to represent certain phenomena is introduced as early as high school, but the scope and applications of these models are often poorly understood. In high school, teaching is organized around two main objectives:

  • Express in mathematical language, under clearly stated hypotheses, what observers describe in sentences.
  • Examine the validity of these hypotheses by comparing experimental results with model predictions (a minimal sketch follows this list).
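
To make these two objectives concrete, here is a minimal, purely illustrative sketch, not taken from the lecture itself: the verbal hypothesis "a body cools at a rate proportional to its temperature excess over the ambient air" becomes Newton's law of cooling, dT/dt = -k(T - T_env), which can then be confronted with measurements. All numerical values below are invented for illustration.

```python
import numpy as np

# Hypothesis stated in words: "a body cools at a rate proportional to
# the gap between its temperature and the ambient temperature".
# The same hypothesis in mathematical language:  dT/dt = -k (T - T_env),
# whose analytical solution is  T(t) = T_env + (T0 - T_env) * exp(-k t).
T_env, T0, k = 20.0, 90.0, 0.05          # illustrative values
model = lambda t: T_env + (T0 - T_env) * np.exp(-k * t)

# Stand-in for experimental measurements (here, the model plus noise;
# in practice these would come from a real experiment).
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 60.0, 13)
T_obs = model(t_obs) + rng.normal(0.0, 0.5, t_obs.size)

# Second objective: confront the model's predictions with the observations.
discrepancy = np.max(np.abs(T_obs - model(t_obs)))
print(f"largest model/measurement discrepancy: {discrepancy:.2f} °C")
```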

We soon realize that most of the models we use in our daily work, especially in companies, have no explicit analytical solution. They must therefore be simulated numerically: like actors realistically acting out a scene, computer systems are tasked with embodying each parameter, tracking the evolution of each variable over time, and virtually recreating the behavior of the phenomenon under study. These digital "actors" are capable of performing up to 10²¹ floating-point operations per second.
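
As a small illustration of what "simulating numerically" means (a toy sketch under simple assumptions, not the lecture's own material): the full pendulum equation θ'' = -(g/L) sin θ has no elementary closed-form solution, yet a classical fourth-order Runge-Kutta scheme lets a computer act out its motion step by step.

```python
import numpy as np

# The non-linearized pendulum  theta'' = -(g/L) sin(theta)  has no
# elementary closed-form solution; we integrate it step by step with
# a classical fourth-order Runge-Kutta scheme.
g, L = 9.81, 1.0

def rhs(y):
    theta, omega = y
    return np.array([omega, -(g / L) * np.sin(theta)])

def rk4_step(y, dt):
    k1 = rhs(y)
    k2 = rhs(y + 0.5 * dt * k1)
    k3 = rhs(y + 0.5 * dt * k2)
    k4 = rhs(y + dt * k3)
    return y + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

y, dt = np.array([1.0, 0.0]), 1e-3        # initial angle 1 rad, at rest
for _ in range(int(5.0 / dt)):            # simulate 5 seconds of motion
    y = rk4_step(y, dt)
print(f"angle after 5 s: {y[0]:.4f} rad")
```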

However, even at such a colossal rate of computation, the "actors" may prove incapable of preserving the fidelity of models in highly complex domains, whether for the weather forecasts entrusted to Météo-France or for industrial optimization problems.

The aim of this lecture is therefore to present approaches to complexity reduction that make it possible to:

  • take full advantage of existing computing capacities,
  • scrupulously respect the model's "scenario",
  • and provide real-time results, paving the way for digital twins (one such approach is sketched below).
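
As a hedged sketch of one family of complexity-reduction techniques (proper orthogonal decomposition, a relative of reduced-basis methods; the toy solution family below is invented for illustration): a large set of expensive "full" solutions is compressed into a handful of modes, after which new solutions can be approximated in real time within that tiny basis.

```python
import numpy as np

# Toy full-order solutions: u(x; mu) = exp(-mu * x) for many parameters mu.
x = np.linspace(0.0, 1.0, 500)
mus = np.linspace(0.5, 5.0, 100)
snapshots = np.array([np.exp(-mu * x) for mu in mus]).T   # 500 x 100 matrix

# SVD of the snapshot matrix: the singular values decay quickly when the
# solution family is "reducible", so a few modes capture almost everything.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 5                                   # keep only 5 modes out of 100
basis = U[:, :r]

# A new solution (mu = 3.3) approximated by orthogonal projection onto
# the reduced basis: a real-time operation once the basis is built.
u_new = np.exp(-3.3 * x)
u_red = basis @ (basis.T @ u_new)
print(f"relative error with {r} modes: "
      f"{np.linalg.norm(u_new - u_red) / np.linalg.norm(u_new):.2e}")
```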

This complexity reduction draws on approximation theory, functional and numerical analysis, discretization, the design of numerical simulation algorithms, data assimilation and, more recently, the use of neural networks and machine learning.
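
To give data assimilation a concrete face (a bare-bones scalar sketch; operational systems such as those at Météo-France use far richer algorithms, and every number below is invented): a model forecast and an observation are blended, each weighted by its estimated reliability.

```python
# Scalar Kalman-filter update: combine an uncertain model forecast with
# an uncertain observation, trusting the more reliable source more.
forecast, var_f = 14.2, 4.0    # model predicts 14.2 °C, variance 4.0
obs, var_o = 16.0, 1.0         # a station measures 16.0 °C, variance 1.0

gain = var_f / (var_f + var_o)                # weight given to the observation
analysis = forecast + gain * (obs - forecast) # corrected ("assimilated") state
var_a = (1.0 - gain) * var_f                  # reduced uncertainty
print(f"analysis: {analysis:.2f} °C (variance {var_a:.2f})")
```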