
Finding first moments in data assimilation

Problem statement:

Data assimilation procedures, and notably variational ones, are only beginning to appear in atmospheric chemistry, where they are applied to nonlinear systems of very large dimension. Typical applications include air quality forecasting, the construction or improvement of emission inventories (for example, road traffic or carbon emissions), and the interpretation of satellite data to map the atmosphere at the global scale.

In this theme, we use and develop classical approaches for determining the first two moments (mean and/or mode, variance) of the probability distributions of inverted parameters or states.

Objectives:

The objectives will be pursued through the use of variational techniques for chemistry-transport models (4D-Var). At the methodological level, the wide range of time scales in the processes involved leads to an ill-posed problem, and the distinction between slow and fast processes is necessary to set up approaches based on inverse modelling. Moreover, many current chemistry-transport models omit numerous physical processes (for example, segregation effects related to the coupling between chemistry and turbulence). Accounting for a ``model error'' is a powerful tool in data assimilation, and its representation (beyond a purely statistical one) remains a challenge. The specification of the error covariance matrices (observation and background) similarly influences the efficiency of an observation system. A possible approach is to apply inverse modelling to these error parameters.
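As a minimal illustration of the first two moments in the linear-Gaussian case, the sketch below computes the posterior mean (which coincides with the mode here) and covariance of an analysis step involving a background error covariance B and an observation error covariance R. All matrices and numerical values are hypothetical toy choices, not taken from any operational system.

```python
import numpy as np

def analysis(xb, B, H, y, R):
    """Posterior mean x_a and covariance A for a linear-Gaussian analysis.

    xb: background state, B: background error covariance,
    H: linear observation operator, y: observations, R: observation
    error covariance. x_a is both the mean and the mode of the posterior.
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    xa = xb + K @ (y - H @ xb)                     # posterior mean/mode
    A = (np.eye(len(xb)) - K @ H) @ B              # posterior covariance
    return xa, A

# Toy two-variable example: only the first component is observed.
xb = np.array([1.0, 2.0])            # background state
B = np.diag([0.5, 0.5])              # background error covariance
H = np.array([[1.0, 0.0]])           # observation operator
y = np.array([1.4])                  # observation
R = np.array([[0.1]])                # observation error covariance

xa, A = analysis(xb, B, H, y, R)
# The observed component is pulled toward y, and its posterior
# variance A[0, 0] drops below the background variance B[0, 0].
```

The unobserved second component keeps its background value and variance, since B is diagonal and carries no cross-correlation to spread the information.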

An increasingly important theme is the design of observation networks dedicated to environmental applications. In meteorology, for example, the problem is to decide in real time where to deploy mobile observation sensors (sondes, aircraft) so as to complement an existing fixed network of measurements, the goal being a better forecast of a developing phenomenon (for example, cyclogenesis). Such approaches (``targeting'' or ``adaptive observations'' in the classical terminology) were originally developed in the context of sequential data assimilation (Kalman filtering). Their extension to a variational framework in an applied context remains largely open, notably through so-called ``second-order'' approaches (as introduced by F.X. Le Dimet, Idopt project).
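One simple caricature of the targeting idea, in the sequential setting mentioned above: among candidate observation sites, select the one whose single extra observation most reduces the trace of the analysis-error covariance. The grid, variances, and selection criterion below are hypothetical assumptions for illustration, not an operational targeting scheme.

```python
import numpy as np

def posterior_trace(B, H, R):
    """Trace of the analysis-error covariance A = (I - K H) B."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return np.trace((np.eye(B.shape[0]) - K @ H) @ B)

# Toy setup: three sites with different background error variances,
# and one mobile sensor of known error variance to deploy.
B = np.diag([1.0, 4.0, 2.0])         # background error covariance
R = np.array([[0.1]])                # error variance of the new sensor

# Candidate operators: observe site i directly (one row of the identity).
candidates = [np.eye(3)[[i]] for i in range(3)]

scores = [posterior_trace(B, H, R) for H in candidates]
best = int(np.argmin(scores))        # deploy where uncertainty drops most
```

With a diagonal B the winner is simply the site with the largest background variance; richer criteria (forecast-sensitivity, second-order adjoint information) change this ranking.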

In the case of linear tracers (for example, radionuclides), variational methods are not essential, and so-called ``retroplume'' methods can be formulated to identify the sources of transport models. Source localization and reconstruction remain largely open problems in the case of diffuse sources (for example, carbon sinks and sources).
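For a linear tracer, the concentrations at the receptors depend linearly on the source term, so in an idealised, noise-free sketch the source reconstruction reduces to a least-squares problem. The transport matrix M below is a hypothetical random stand-in for the source-receptor sensitivities that a dispersion (or retroplume) computation would provide.

```python
import numpy as np

# Toy linear-tracer setting: c = M s, with M the (hypothetical)
# source-receptor matrix, s the source strengths, c the concentrations.
rng = np.random.default_rng(0)
M = rng.random((6, 3))               # 6 receptors, 3 candidate sources
s_true = np.array([2.0, 0.0, 1.0])   # "unknown" source strengths
c_obs = M @ s_true                   # noise-free synthetic observations

# Reconstruct the sources by ordinary least squares.
s_hat, *_ = np.linalg.lstsq(M, c_obs, rcond=None)
```

With more receptors than sources and noise-free data, the reconstruction is exact; real problems add noise, regularization, and the diffuse-source difficulties mentioned above.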

Lastly, from a numerical point of view, variational methods require the construction of adjoint models. Automatic differentiation can make this a systematic process: for example, the POLAIR code (ENPC) was differentiated with the Odyssée software (INRIA). A cooperation with the INRIA Tapenade project on these questions is planned.
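A standard way to validate an adjoint model, whether hand-written or produced by automatic differentiation, is the adjoint (dot-product) test: for a linear model step M, check that (Mx, y) equals (x, M*y) for random vectors. The sketch below applies it to a hypothetical toy upwind advection step, not to the POLAIR model or its Odyssée-generated adjoint.

```python
import numpy as np

def step(c, cfl=0.5):
    """One explicit upwind advection step on a periodic grid (linear in c)."""
    return c - cfl * (c - np.roll(c, 1))

def step_adjoint(g, cfl=0.5):
    """Adjoint of `step`: the transpose of its (constant) Jacobian."""
    return g - cfl * (g - np.roll(g, -1))

# Adjoint test: (M x, y) must equal (x, M* y) up to round-off.
rng = np.random.default_rng(1)
x = rng.random(8)
y = rng.random(8)
lhs = np.dot(step(x), y)
rhs = np.dot(x, step_adjoint(y))
```

For a nonlinear model the same test is applied to the tangent-linear model at a given trajectory; a mismatch beyond round-off indicates a bug in the adjoint code.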


Christine Anocq 2004-11-23