Detection of multiplets and application to the Turkey-Syria seismic crisis of February 2023

The correlation technique, or template matching, applied to the detection and analysis of seismic events has demonstrated its performance and usefulness in the processing chain of the CEA/DAM National Data Center. Unfortunately, the method suffers from limitations that restrict its effectiveness and its operational use, linked on the one hand to the computational cost of massive data processing, and on the other hand to the rate of false detections that low-level processing can generate. The use of denoising methods upstream of the processing (e.g. DeepDenoiser, Zhu et al., 2020) could further increase the number of erroneous detections. The first part of the research project consists of providing a methodology aimed at improving the processing-time performance of the multiplet detector, in particular by using data-indexing techniques developed in collaboration with LIPADE (the L-MESSI method: Botao Peng, Panagiota Fatourou, Themis Palpanas, "Fast Data Series Indexing for In-Memory Data", International Journal on Very Large Data Bases (VLDBJ), 2021). The second part of the project concerns the development of an autoencoder-type "filtering" tool for false detections, built using machine learning. The Turkey-Syria seismic crisis of February 2023, dominated by two earthquakes of magnitude greater than 7.0, will serve as the training dataset for this study.
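As an illustration of the baseline technique, a minimal sketch of a normalized cross-correlation (template-matching) detector is given below. The function name, windowing, and threshold are assumptions for illustration, not the operational NDC implementation.

```python
import numpy as np

def match_template(trace, template, threshold=0.8):
    """Normalized cross-correlation multiplet detector (minimal sketch).

    Slides `template` along `trace`, computes the normalized correlation
    coefficient at every offset, and returns (cc, peaks), where `peaks`
    are the offsets exceeding `threshold` that are local maxima."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n]
        s = w.std()
        # Guard against flat (zero-variance) windows
        cc[i] = 0.0 if s == 0 else float(np.dot(t, (w - w.mean()) / s))
    peaks = [i for i, v in enumerate(cc)
             if v >= threshold and v == cc[max(0, i - n):i + n].max()]
    return cc, peaks
```

The quadratic cost of this naive scan over long continuous records is precisely the bottleneck that data-series indexing methods such as MESSI are meant to alleviate.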

Functional renormalization group for nuclear structure theory

The atomic nucleus is the epitome of complexity: it is a strongly correlated system of nucleons (which are themselves composite degrees of freedom), coupled via the strong and electroweak interactions, that features a wealth of emergent behaviors (deformation, superfluidity, clustering, ...). The long-term endeavor of nuclear structure theory is to understand and predict how an arbitrary number of nucleons self-organize in nuclei, and how that organization breaks down. Among the various theoretical frameworks, the energy density functional (EDF) method, close to, yet distinct from, density functional theory, provides the best compromise between the robustness of the description and its numerical cost. However, the phenomenological ingredients entering the formulation of standard EDFs limit their predictive power.
The postdoctoral project aims at formulating the EDF approach from first principles, in order to benefit from a theoretical framework with both maximal predictive power and a favorable numerical cost. The supervising team has identified the functional renormalization group (FRG) as the most relevant language for such a non-empirical reformulation of the EDF method.
In short, the present project aims at formulating the EDF method from first principles via the FRG.
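As background, the central tool of any FRG formulation is the exact flow (Wetterich) equation; in standard notation, with $\Gamma_k$ the scale-dependent effective action, $R_k$ the infrared regulator, and $\Gamma_k^{(2)}$ the second functional derivative of $\Gamma_k$:

```latex
\partial_k \Gamma_k = \frac{1}{2}\,\mathrm{Tr}\!\left[\left(\Gamma_k^{(2)} + R_k\right)^{-1}\partial_k R_k\right]
```

An EDF reformulation would then amount to choosing the relevant density-like variables for $\Gamma_k$ and a suitable truncation scheme for this equation.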

ML assisted RF filter design

Experimental investigation of reacting flow on palladium hydrides

Evolution of the ISAAC and Xpn codes for an extension of the QRPA method to the complete treatment of odd nuclei; towards an interpolation-free database for odd nuclei

The treatment of odd nuclei in microscopic approaches is currently limited to the so-called "blocking" approximation. In the Hartree-Fock-Bogoliubov (HFB) approach, the ground state of an odd-mass nucleus is described as a one-quasiparticle (qp) excitation on top of its reference vacuum. Thus, in the QRPA approach, where the basic excitations are two-quasiparticle states, the blocked qp is excluded from the valence space by the Pauli exclusion principle. As a result, the chosen qp is a spectator and does not take part in the QRPA collective states. If the unpaired nucleon should contribute significantly, some levels will not be reproduced. The development, in the QRPA codes (ISAAC and Xpn), of a procedure that allows all nucleons to participate in collective states is therefore mandatory for a microscopic description of odd nuclei. Moreover, recent Xpn developments have allowed the description of first-forbidden β− decays, improving the estimation of the half-lives of fission fragments. This could be extended to address β+ decay and electron capture, and could be adapted to the large-scale calculations useful for nuclear astrophysics.
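For background, the QRPA problem solved by codes such as ISAAC and Xpn can be written, in standard notation, as the eigenvalue equation

```latex
\begin{pmatrix} A & B \\ -B^{*} & -A^{*} \end{pmatrix}
\begin{pmatrix} X^{\nu} \\ Y^{\nu} \end{pmatrix}
= \omega_{\nu}
\begin{pmatrix} X^{\nu} \\ Y^{\nu} \end{pmatrix},
```

where the matrices $A$ and $B$ are built on two-quasiparticle configurations $|kk'\rangle$. In the blocking approximation, every configuration involving the blocked quasiparticle is removed from this basis; this is the restriction that the proposed developments aim to lift.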

Crystal plasticity in classical molecular dynamics and mesoscopic upscaling

Thanks to new supercomputer architectures, classical molecular dynamics simulations will soon enter the realm of a thousand billion (10^12) atoms, a scale never before achieved, thus becoming capable of representing the plasticity of metals at the micron scale. However, such simulations generate a considerable amount of data, and the difficulty now lies in exploiting them to extract the statistical ingredients relevant to the scale of "mesoscopic" plasticity (the scale of continuum models).
The evolution of a material is complex, as it depends on lines of crystalline defects (dislocations) whose motion is governed by numerous mechanisms. In order to feed models at higher scales, the quantities to be extracted are the velocities and lengths of the dislocations, as well as their evolution over time. These data can be extracted using specific analysis techniques based on a characterization of the local atomic environment ('distortion score', 'local deformation'), either a posteriori or in situ during the simulation. Finally, machine learning tools can be used to analyze the resulting statistics and to extract and synthesize (by model reduction) a minimal description of plasticity for higher-scale models.
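One concrete way a per-atom 'distortion score' can be defined is the residual of a best-fit rotation between an atom's observed neighbor shell and the ideal lattice shell. This is a hedged sketch: the metric and the function name are illustrative, not those of the project's actual analysis tools.

```python
import numpy as np

def distortion_score(neighbors, reference):
    """RMS residual after fitting the best rotation (Kabsch algorithm)
    that maps the ideal neighbor shell onto the observed one.

    `neighbors` and `reference` are (N, 3) arrays of relative positions
    of the N nearest neighbors (both shells centered on the atom)."""
    # Cross-covariance of the two shells, then SVD for the optimal rotation
    H = reference.T @ neighbors
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # avoid improper rotations
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    residual = neighbors - reference @ R.T
    return np.sqrt((residual ** 2).sum() / len(neighbors))
```

A perfectly rotated shell scores zero, while elastic strain or a defect in the neighborhood raises the score, which makes such a quantity usable as a per-atom field for in situ filtering.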

Design of a high-energy phase contrast radiography chain

As part of hydrodynamic experiments carried out at CEA-DAM, the laboratory seeks, using pulsed X-ray imaging, to radiograph thick objects (several tens of mm) made of low-density materials (around 1 g/cm3), inside which shock waves propagate at very high speeds (several thousand m/s). For this type of application, energetic X-ray sources (beyond 100 keV) are required. Conventional X-ray imaging, whose contrast stems from variations in absorption cross sections, proves insufficient to capture the small density variations expected during the passage of the shock wave. A theoretical study recently carried out in the laboratory showed that the complementary exploitation of the information contained in the X-ray phase should enable better detectability. The aim of the post-doctorate is to provide an experimental proof of concept for this theoretical study. For ease of implementation, the work will mainly focus on the dimensioning of a static X-ray chain, where the target is stationary and the source emits continuous X-ray radiation. First, the candidate will characterize in detail the spectrum of the selected X-ray source as well as the response of the associated detector. In a second step, they will design and have manufactured interference gratings adapted to high-energy phase measurements, as well as a representative mock-up of the future moving objects to be characterized. Finally, the candidate will carry out radiographic measurements and compare them with predictive simulations. The candidate should have a good knowledge of radiation-matter interaction and/or physical and geometric optics. Proficiency in object-oriented programming and/or the Python and C++ languages would be a plus.
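The advantage of phase contrast at high energy can be illustrated numerically: writing the refractive index as n = 1 − δ + iβ, the phase decrement δ of light materials exceeds the absorption index β by orders of magnitude. The sketch below compares the two signals for a small density jump; the optical constants used in the test are assumed, purely illustrative values, not measured data for any project material.

```python
import numpy as np

def contrast_figures(delta, beta, thickness_m, energy_keV, drho_over_rho):
    """Compare the absorption signal and the phase signal produced by a
    small relative density change `drho_over_rho` across a slab.

    Both delta and beta scale linearly with density, so a density jump
    changes them proportionally."""
    wavelength = 1.2398e-9 / energy_keV           # hc ~ 1.2398 keV.nm -> m
    k = 2.0 * np.pi / wavelength                  # wavenumber (1/m)
    mu = 2.0 * k * beta                           # linear attenuation (1/m)
    absorption_signal = mu * thickness_m * drho_over_rho    # change in ln(I)
    phase_signal = k * delta * thickness_m * drho_over_rho  # phase change (rad)
    return absorption_signal, phase_signal
```

The ratio of the two signals reduces to δ/(2β), independent of thickness, which is why grating-based phase measurement is attractive for weakly absorbing targets.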

Fusion of 3D models derived from optical and radar images

Thanks to satellite and airborne imagery, 3D reconstruction of the Earth's surface is possible. Optical imagery exploits stereoscopic acquisitions and photogrammetry to retrieve the 3D surface, whereas interferometry is used for radar imagery. These techniques are complementary: radar images allow the retrieval of fine metallic objects such as pylons, while optical imagery is more robust but cannot preserve such fine details because of smoothing. One objective of the post-doctorate is to detect such fine objects.
The complementarity between the 3D point clouds retrieved from satellite optical imagery on the one hand, and from satellite and airborne radar imagery on the other, should lead to a 3D product combining objects principally detected by radar with the surface reconstruction derived from optical imagery.
The post-doctorate will begin with a state-of-the-art review of 3D reconstruction from optical and radar imagery, as well as of point cloud fusion. Different 3D reconstruction processing chains should be applied to airborne and satellite images. A precise registration algorithm and a fusion algorithm for the point clouds should then be developed, enabling the identification of points detected only by radar. For this step, deep learning techniques could be useful. The results will be compared to very-high-resolution 3D Lidar acquisitions to quantify the quality of the proposed algorithm.
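The registration step could start from a classical rigid iterative-closest-point (ICP) scheme before any learned refinement. Below is a minimal sketch under that assumption; it uses brute-force correspondences, which is only viable for small clouds (a KD-tree would be needed at real scale).

```python
import numpy as np

def icp(source, target, n_iter=20):
    """Rigid ICP alignment of `source` onto `target` (both (N, 3) arrays).

    Each iteration matches every source point to its nearest target
    point, then applies the best rigid transform (Kabsch fit).
    Returns (aligned_source, R_total, t_total)."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(n_iter):
        # Brute-force nearest target point for each source point
        d = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        matched = target[d.argmin(axis=1)]
        # Best rigid transform between the matched pairs (Kabsch)
        cs, cm = src.mean(0), matched.mean(0)
        H = (src - cs).T @ (matched - cm)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = cm - R @ cs
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return src, R_total, t_total
```

After registration, radar-only points can be flagged as those whose nearest optical neighbor lies beyond a distance threshold, which is exactly where a learned classifier could replace the threshold.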
This post-doctorate will take place in labs specialized in satellite and radar image processing through a collaboration between CEA-DAM and Onera.

Modeling of laser-matter interaction for hypervelocity impact simulation

Hypervelocity impacts (HVI) are an important issue for various aerospace, geophysical or large laser facility protection applications such as the Laser Megajoule. In these applications, impact speeds can range from a few km/s to tens of km/s. Below 10 km/s, gas or powder guns can be used to launch projectiles at representative speeds. For higher velocities (10 to 50 km/s), the use of laser-generated shocks is an interesting alternative.
However, the analogy between HVI and laser shocks relies on accurate modeling of the laser-matter interaction mechanisms, in particular the 2D effects that affect the pressure field at the target surface.
The objective of this postdoctoral fellowship is to study the laser-matter interaction with numerical tools developed at the CEA, in particular the 1D code Esther and the 2D/3D code Troll. The simulations will be validated by comparison with experimental data and will then be used to conduct parametric studies on the spatial and temporal profiles of the laser beam.
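For orientation, the pressure reached in a planar hypervelocity impact can be estimated by impedance matching with linear Hugoniots u_s = c + s·u_p. The sketch below solves for the interface state; the material constants used in the test are typical handbook values for aluminium, given purely for illustration.

```python
def impact_pressure(v, rho1, c1, s1, rho2, c2, s2):
    """Planar-impact pressure by impedance matching (SI units).

    Flyer (rho1, c1, s1) hits target (rho2, c2, s2) at velocity v.
    Solves rho1*(c1 + s1*(v-u))*(v-u) = rho2*(c2 + s2*u)*u for the
    interface particle velocity u by bisection, returns (u, P)."""
    def f(u):
        return rho1 * (c1 + s1 * (v - u)) * (v - u) - rho2 * (c2 + s2 * u) * u
    lo, hi = 0.0, v          # f(0) > 0, f(v) < 0: root is bracketed
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    u = 0.5 * (lo + hi)
    return u, rho2 * (c2 + s2 * u) * u
```

For a symmetric impact the interface velocity is exactly v/2, which provides a convenient sanity check; laser-driven experiments must reproduce such pressure histories without a physical flyer, hence the need for validated interaction modeling.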

Slope stability analysis of the Mururoa atoll by a probabilistic approach, and construction of a weighted database of models of tsunamis of gravitational origin for the Nice region