Development and optimization of adaptive mesh refinement methods for fluid/structure interaction problems in a high-performance computing context
A new simulation code for structural and compressible fluid mechanics, named Manta, is currently under development at the French CEA. This code aims both to unify the features of CEA's legacy implicit and explicit codes and to be natively HPC-oriented. With its many numerical methods (finite elements, finite volumes, hybrid methods, phase field, implicit or explicit solvers, ...), Manta enables the simulation of various kinds of static or dynamic mechanical problems involving fluids, structures, or fluid-structure interaction.
When seeking to optimize computation time, Adaptive Mesh Refinement (AMR) is a standard method for increasing numerical accuracy while keeping the computational load under control.
This postdoctoral position aims to define and implement parallel AMR algorithms for fluid/structure interaction problems in a high-performance computing context.
As a preliminary step, the functionalities required for hierarchical AMR, such as cell refinement and coarsening, field transfer from parent to child cells, refinement criteria, and hanging-node management, will be integrated into Manta. This first work will likely rely on external libraries, which will have to be identified.
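As an illustration of the bookkeeping these functionalities involve, the refinement and coarsening operations of a hierarchical mesh can be sketched in a few lines of Python. All names here are illustrative and do not reflect Manta's API: refinement replaces a cell by two children that inherit the parent value (constant prolongation), and coarsening restores the parent by averaging its children (a conservative restriction).

```python
class Cell:
    """One cell of a 1D hierarchical (binary-tree) mesh."""
    def __init__(self, x0, x1, value, level=0):
        self.x0, self.x1, self.value, self.level = x0, x1, value, level
        self.children = None

    def refine(self):
        """Split into two children inheriting the parent value."""
        xm = 0.5 * (self.x0 + self.x1)
        self.children = [Cell(self.x0, xm, self.value, self.level + 1),
                         Cell(xm, self.x1, self.value, self.level + 1)]

    def coarsen(self):
        """Merge children back, averaging their values (conservative)."""
        self.value = sum(c.value for c in self.children) / len(self.children)
        self.children = None

    def leaves(self):
        """Active (unrefined) cells of the hierarchy."""
        if self.children is None:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]

def adapt(root, criterion):
    """Refine every leaf flagged by the user-supplied criterion."""
    for leaf in root.leaves():
        if criterion(leaf):
            leaf.refine()

root = Cell(0.0, 1.0, value=1.0)
adapt(root, lambda c: c.x0 < 0.5)  # refines the root, whose interval reaches x < 0.5
```

In a real code, the criterion would typically be a gradient or error indicator evaluated on the fields, and hanging-node constraints would be enforced at coarse/fine interfaces in 2D and 3D.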
In a second step, the distributed-memory parallel performance will be optimized. In particular, load-balancing strategies between MPI processes will be studied, with a focus on fluid/structure interaction problems.
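One common load-balancing idea for hierarchical AMR, sketched below under simplifying assumptions (1D ordering, illustrative names, no actual MPI calls), is to order the cells along a space-filling curve and cut the ordered weight sequence into contiguous blocks of nearly equal total work, one block per rank; refined regions carry heavier weights and therefore end up with fewer cells per rank.

```python
import numpy as np

def partition(weights, n_ranks):
    """Assign each cell (assumed already ordered along a space-filling
    curve) to a rank so that cumulative work is split into contiguous,
    nearly equal blocks."""
    total = float(np.sum(weights))
    targets = np.cumsum(weights) / total       # work fraction up to each cell
    ranks = (targets * n_ranks).astype(int)    # block index per cell
    return np.minimum(ranks, n_ranks - 1)      # clamp the last cell

# Refined cells cost more work than coarse ones:
weights = np.array([1, 1, 4, 4, 1, 1, 2, 2], dtype=float)
ranks = partition(weights, n_ranks=2)
```

Because the cut is made on a 1D ordering, each rank receives a contiguous chunk of the curve, which tends to keep communication volumes low.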
Finally, for explicit-in-time computations in particular, spatially adapted time stepping will have to be defined and implemented to cope with the several refinement levels and the different wave propagation velocities.
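The idea can be made concrete with a small sketch (names are illustrative): with factor-2 refinement, cells at level l are 2**l times smaller, so under a CFL condition an explicit scheme may advance them with a time step 2**l times smaller, subcycling each finer level twice per coarse step.

```python
def stable_dt(dx, wave_speed, cfl=0.9):
    """Largest stable explicit time step for a cell of size dx
    under a CFL condition."""
    return cfl * dx / wave_speed

def level_dts(dx_coarse, wave_speed, n_levels, cfl=0.9):
    """Time step per refinement level (level 0 = coarsest),
    assuming factor-2 refinement between levels."""
    return [stable_dt(dx_coarse / 2**l, wave_speed, cfl)
            for l in range(n_levels)]

# Example: acoustic waves at 340 m/s on a 1 m coarse cell, 3 levels.
dts = level_dts(dx_coarse=1.0, wave_speed=340.0, n_levels=3)
```

Each finer level then takes exactly two substeps per step of the level above it, which is what makes the time synchronization between levels tractable.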
These last two points are expected to give rise to publications in specialized scientific journals.
Use and extension of the Alien solver library with the proto-application Helix
First, the post-doc candidate will integrate the Alien solver library into Helix to carry out performance and usability assessments in iterative and direct solver configurations. These assessments will be done on different computer architectures, from desktop computers to national supercomputers with thousands of cores.
In a second phase, the candidate will investigate adding new functionalities to the Alien library to solve nonlinear systems combining equations and inequalities, in order to handle, in an HPC context, mechanical problems such as phase-field or contact problems, which largely remain open problems in the community. The results will be compared to classical test cases and state-of-the-art benchmarks in the domain.
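To illustrate why systems mixing equations and inequalities need dedicated solvers (independently of Alien's actual interfaces), a frictionless contact condition can be cast as a linear complementarity problem, x >= 0, A x - b >= 0, x.(A x - b) = 0, which a projected Gauss-Seidel iteration solves by clamping each unknown to the constraint after its Gauss-Seidel update. The example below is a minimal sketch on a tiny symmetric positive definite system.

```python
import numpy as np

def projected_gauss_seidel(A, b, iters=200):
    """Solve the LCP: x >= 0, A x - b >= 0, x.(A x - b) = 0,
    by Gauss-Seidel sweeps with projection onto x_i >= 0."""
    x = np.zeros_like(b)
    for _ in range(iters):
        for i in range(len(b)):
            # Gauss-Seidel residual for unknown i, excluding its own term:
            r = b[i] - A[i] @ x + A[i, i] * x[i]
            x[i] = max(0.0, r / A[i, i])   # projection enforces x_i >= 0
    return x

A = np.array([[4.0, -1.0],
              [-1.0, 4.0]])
b = np.array([1.0, -2.0])
x = projected_gauss_seidel(A, b)
```

Here the unconstrained solution has a negative second component, so the LCP solution activates the constraint (x[1] = 0) and the complementary residual (A x - b)[1] becomes nonnegative, exactly the structure of a contact gap/pressure pair.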
The candidate will join the Helix development team, currently composed of three to four developers, in the LM2S laboratory (15 people). The post-doc is funded by a transversal program between CEA directions, and the candidate will collaborate with the Alien library developers at CEA's DAM.
Study of a transient regime of helium dispersion to simulate an accidental release of hydrogen from a fuel cell.
CEA and industrial partners want to improve their knowledge, models, and risk mitigation means regarding the consequences of an accidental release of hydrogen from an H2 fuel cell. The dispersion of helium, used as a substitute for hydrogen, will take place in a private garage, and the transient regime will be studied. Different release scenarios are considered: first from an idealized cubic fuel cell, then with different aspect ratios, and finally with a varying main dimension, the goal being to study scaling effects. For the first case, helium concentration will be measured with katharometers and, possibly, velocity fields with PIV methods. Mitigation processes will then be tested. Finally, comparisons with models and numerical simulations will be performed.
Simulation of supercritical helium flows in the cooling circuits of tokamaks
Future fusion reactors such as tokamaks (ITER, DEMO) will have to demonstrate the safety of their systems, validated by thermal-hydraulic codes. To meet this requirement, the CATHARE code has been chosen as the scientific computing tool. The work will consist of adapting the CATHARE code to helium at low temperatures and then benchmarking it against other thermal-hydraulic codes used by the DRF (Direction de la Recherche Fondamentale), as well as against experimental data available at CEA Grenoble.
The study will be threefold. The first phase will be dedicated to a literature survey on the thermal hydraulics of helium, focusing on closure laws for single-phase helium (friction and heat transfer coefficients). In a second step, the engineer will implement these laws in the code and perform validation tests. The last part will focus on a benchmark based on three applications: the study of a cryo-pump, the study of a supercritical helium discharge, and the study of a superconducting cable.
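As an example of the kind of single-phase closure laws involved, here are two classical textbook correlations (given for illustration; they are not necessarily the forms retained in CATHARE): a Blasius-type friction factor and a Dittus-Boelter-type Nusselt number, both functions of the Reynolds and Prandtl numbers.

```python
def blasius_friction(re):
    """Darcy friction factor for smooth turbulent pipe flow
    (Blasius correlation, valid roughly for 4e3 < Re < 1e5)."""
    return 0.316 * re ** -0.25

def dittus_boelter_nusselt(re, pr):
    """Nusselt number for fully developed turbulent heating
    (Dittus-Boelter correlation, exponent 0.4 for heating)."""
    return 0.023 * re ** 0.8 * pr ** 0.4
```

In the code, such laws enter the momentum and energy balances as wall friction and wall heat transfer coefficients, with helium-specific properties evaluated at the local thermodynamic state.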
Numerical quality analysis of simulation codes with CADNA, Verificarlo and Verrou
Numerical codes rely on floating-point arithmetic to represent real numbers and the operations applied to them. In general, however, real numbers cannot be represented exactly by floating-point numbers, and the finite precision of floating-point arithmetic leads to round-off errors that may accumulate. With increasing computational power, the growing complexity of algorithms, and the coupling of numerical codes, it is crucial to quantify the numerical robustness of an application or an algorithm.
CADNA [1], Verificarlo [2] and Verrou [3] are dedicated tools for estimating the propagation of round-off errors and measuring the numerical accuracy of computed results. The objective of this work is to apply these three tools to GYSELA [4, 5], a simulation code used to characterize plasma dynamics in tokamaks, and to PATMOS [6], a mini-app representative of a Monte Carlo neutron transport code. The analysis will assess the numerical robustness of these two applications or of some of their algorithms. Beyond the numerical quality analysis, these tools will also be used to investigate whether the precision of some algorithms can be lowered (single or even half precision instead of double), thus reducing the memory footprint and/or improving performance (vectorization, communications). Beyond the lessons learnt on the two analyzed codes, a second objective will be the elaboration of a methodology generic enough to be applied more broadly to other codes.
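A small self-contained example, independent of the three tools, of the round-off accumulation they diagnose: summing many small terms in single precision drifts noticeably, while compensated (Kahan) summation recovers most of the accuracy at the same precision, which is one reason precision lowering is sometimes possible when the algorithm is adapted.

```python
import numpy as np

def naive_sum(values):
    """Straight accumulation in float32: errors pile up once the running
    sum dwarfs the individual terms."""
    s = np.float32(0.0)
    for v in values:
        s += v
    return s

def kahan_sum(values):
    """Compensated summation in float32: a second variable tracks the
    low-order bits lost at each addition."""
    s = np.float32(0.0)
    c = np.float32(0.0)          # running compensation
    for v in values:
        y = v - c
        t = s + y
        c = (t - s) - y          # recovers the part of y lost in s + y
        s = t
    return s

values = np.full(10**6, np.float32(0.1))
```

Summing one million copies of 0.1 should give 100000; the naive float32 loop misses by hundreds, while the Kahan loop stays within a fraction of a unit.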
Development of multiphysics tools dedicated to the modeling of FSR and associated studies.
The sodium group of DM2S (a department of CEA Saclay) develops numerical coupling tools for accident studies (fast transients). The physical domains concerned are neutronics, thermal-hydraulics, and mechanics. The subject of this post-doc falls within this framework.
The aim is to carry out several studies: integrating a coupling within the CORPUS platform; carrying out studies to test (and introduce into the coupling) the impact of temperature-induced assembly deformation on the liquid sodium flow; using the neutronic cross sections generated by the APOLLO3 code; studying other accident cases; and extending the modeling to the subchannel and pin scales.
Development of a computational framework dedicated to model order reduction by certified reduced basis method.
Many engineering fields require numerically solving partial differential equations (PDEs) that model physical phenomena.
When focusing on a mathematical model that describes the physical behavior of a system through one or more parametrized PDEs (with geometrical or physical parameters), it may be desirable to evaluate the model output (quantity of interest) rapidly and reliably for different parameter values.
Real-time contexts, needed for command-control, and many-query contexts (typically optimization methods or uncertainty and sensitivity analysis) lend themselves perfectly to this approach.
The certified reduced basis method is an intrusive reduction method because, unlike non-intrusive methods, the reduction is based on the projection of the operators associated with the PDEs of the physical model.
The method makes it possible to rapidly obtain, for a given set of parameter values, an approximation of the model output.
One of the strengths of the method is its "certified" aspect: the approximation error of the model output evaluation can be estimated.
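The offline/online split at the heart of the method can be sketched on a toy parametrized system with affine parameter dependence, A(mu) = A0 + mu*A1 (illustrative code, not based on the TRUST platform): offline, full-order solutions at a few training parameters are orthonormalized into a basis V; online, only the small projected system V^T A(mu) V y = V^T b is solved, and the result is lifted back to the full space.

```python
import numpy as np

n = 50
rng = np.random.default_rng(0)
# Toy symmetric positive definite operators with affine mu-dependence:
A0 = np.diag(np.linspace(1.0, 2.0, n))
A1 = np.diag(np.linspace(0.5, 1.5, n))
b = rng.standard_normal(n)

def full_solve(mu):
    """Full-order (n x n) solve, used offline and as a reference."""
    return np.linalg.solve(A0 + mu * A1, b)

# Offline stage: snapshots at training parameters, orthonormalized by QR.
snapshots = np.column_stack([full_solve(mu) for mu in (0.1, 0.5, 1.0)])
V, _ = np.linalg.qr(snapshots)

def reduced_solve(mu):
    """Online stage: solve only the 3 x 3 projected system."""
    Ar = V.T @ (A0 + mu * A1) @ V
    yr = np.linalg.solve(Ar, V.T @ b)
    return V @ yr                      # lift back to the full space

mu_test = 0.7
err = np.linalg.norm(reduced_solve(mu_test) - full_solve(mu_test))
```

The certification machinery of the actual method adds to this sketch an a posteriori error bound, computable online, that controls `err` without ever forming the full-order solution.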
The goal of the post-doctorate is to develop a computational framework for the certified reduced basis method. This framework should be based on the TRUST platform (https://sourceforge.net/projects/trust-platform/) developed at CEA, and must be generic enough to handle different types of problems (linear or not, stationary or not, coercive or not, ...).
The framework will be used in the case of a two-fluid mixing model.
Continuum models calibration strategy based on a 3D discrete approach
In order to develop an identification strategy for continuum constitutive models devoted to quasi-brittle materials, suited for structural analysis (a calibration that is often carried out arbitrarily), a model based on the discrete element method has been formulated. The discrete model is used to compensate for the lack of experimental data required to calibrate the continuum model. Thanks to its intrinsic ability to represent fracture mechanisms, the discrete model is easy to use, and its efficiency has been demonstrated. However, only 2D simulations have been undertaken so far, mostly due to computational cost limitations.
A 2D framework severely limits the analysis possibilities of such a model, in particular for reinforced structures where 3D effects are predominant. The purpose of the present post-doctoral work is to extend to 3D the discrete approach already developed in 2D. The developments will be integrated into the FEA code CAST3M-CEA developed by DEN/DANS/DM2S/SEMT. At the same time, the discrete model will be optimized using tools, such as solvers, available in the CAST3M-CEA environment. Depending on the computational cost improvements, even simulations of complete structures might be considered.
At the end of this work, the developed numerical tool will make it possible to extend the identification strategy to constitutive models including 3D effects, such as steel/concrete interface models (confinement) and concrete models (dilatancy).
Nonlinear dynamic analysis of a reinforced concrete structure subjected to seismic loadings: Deterministic and probabilistic study of response spectra
The proposed work is based on the experimental campaign of the ENISTAT project and is composed of three parts:
1. Calibration and enhancement of the numerical model (5 months)
Starting from the nonlinear numerical model built at CEA, the applicant will compare its results to those provided by the experimental campaign. Potential discrepancies will be interpreted, and the model will be calibrated (and/or enhanced) to ensure satisfactory agreement with the experimental results and observations.
2. Deterministic and probabilistic analysis of response spectra (5 months)
Based on the calibrated numerical model, the response spectra will be computed at given points. They will be compared to the demand spectra prescribed by design rules such as EC8. Using probabilistic methods developed at CEA for seismic applications, the uncertainties of both the input parameters and the seismic signals will be taken into account. The induced variability of the response spectra will be quantified and discussed. Knowledge of these data is particularly valuable since design rules in seismic engineering are based on them.
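For reference, the construction of an acceleration response spectrum can be sketched as follows (a synthetic signal stands in for a recorded accelerogram, and the integration scheme and names are illustrative): for each natural period, a damped single-degree-of-freedom oscillator is driven by the ground acceleration and its peak pseudo-acceleration is retained.

```python
import numpy as np

def response_spectrum(accel, dt, periods, damping=0.05):
    """Peak pseudo-acceleration of a damped SDOF oscillator per period,
    time-integrated with an explicit central-difference scheme."""
    spectrum = []
    for T in periods:
        wn = 2.0 * np.pi / T
        u_prev, u = 0.0, 0.0
        peak = 0.0
        for ag in accel:
            # Central difference for u'' + 2*z*wn*u' + wn^2*u = -ag
            v = (u - u_prev) / dt
            a = -ag - 2.0 * damping * wn * v - wn**2 * u
            u_next = 2.0 * u - u_prev + a * dt**2
            u_prev, u = u, u_next
            peak = max(peak, abs(wn**2 * u))   # pseudo-acceleration
        spectrum.append(peak)
    return np.array(spectrum)

dt = 0.01
t = np.arange(0.0, 10.0, dt)
accel = np.sin(2.0 * np.pi * 2.0 * t) * np.exp(-0.3 * t)  # synthetic 2 Hz record
periods = np.array([0.1, 0.5, 1.0, 2.0])
Sa = response_spectrum(accel, dt, periods)
```

With a 2 Hz input, the oscillator with a 0.5 s natural period resonates and dominates the spectrum, which is the mechanism the demand spectra of design rules encode.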
3. Study of the effect of the thermal brick elements
Thanks to the results, both experimental and numerical, a discussion on the effect of the thermal brick elements will be carried out, with the aim of drawing first conclusions on their effect on the overall structural behavior under seismic loading.