Development of artificial intelligence algorithms for narrow-band localization

Narrowband (NB) radio signals are widely used in low-power wide-area (LPWA) networks, a key component of the Internet of Things; NB-IoT is a prominent example. However, because of their limited bandwidth, such signals are not well suited to accurate localization, especially in complex environments such as high-rise districts or urban canyons, which create signal reflections and obstructions. One approach to overcome these difficulties is to use a 3D model of the city and its buildings in order to better predict signal propagation. Because this modelling is very complex, state-of-the-art localization algorithms cannot handle it efficiently, and new techniques based on machine learning and artificial intelligence should be considered to solve this very hard problem. The LCOI laboratory has deployed an NB-IoT network in the city of Grenoble and is currently building a very large database to support these studies.
Based on an analysis of the existing literature and using the knowledge acquired in the LCOI laboratory, the researcher will:
- Contribute to and supervise the ongoing data collection.
- Exploit the existing database to perform statistical analysis and modelling of NB-IoT signal propagation in various environments.
- Develop a toolchain to simulate signal propagation using 3D topology.
- Refine existing performance bounds through a more accurate signal modelling.
- Develop and implement real-time as well as offline AI-based localization algorithms using 3D topology (see the illustrative sketch after this list).
- Evaluate and compare the developed algorithms against state-of-the-art (SoTA) algorithms.
- Contribute to collaborative or industrial projects through this research work.
- Publish research papers in high quality journals and conference proceedings.
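As an illustration of the intended AI-based approach, the sketch below frames localization as supervised regression from per-base-station measurements (plus 3D-topology-derived features) to position. It is a minimal, hypothetical example: the feature set, network architecture and data layout are assumptions, not the LCOI pipeline, and random placeholders stand in for the Grenoble database.

```python
# Minimal sketch: NB-IoT fingerprint localization as supervised regression.
# Assumptions (not from the project): each uplink transmission yields
# per-base-station RSSI/TDOA features plus 3D-topology features (e.g.
# line-of-sight flags from the city model), with a GPS ground-truth position.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_features = 5000, 24                 # hypothetical sizes
X = rng.normal(size=(n_samples, n_features))     # placeholder for real features
y = rng.uniform(0, 1000, size=(n_samples, 2))    # placeholder (east, north) in metres

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0),
)
model.fit(X_train, y_train)

errors = np.linalg.norm(model.predict(X_test) - y_test, axis=1)
print(f"median localization error: {np.median(errors):.1f} m")
```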

Auto-adaptive neural decoder for clinical brain-spine interfacing

CEA/LETI/CLINATEC invites applications for a postdoctoral position to work on the HORIZON-EIC project. The project goal is to explore novel solutions for functional rehabilitation and/or compensation for people with severe motor disabilities using an auto-adaptive Brain-Machine Interface (BMI) / neuroprosthetics. Neuroprosthetics record and decode brain neuronal signals to activate effectors (exoskeleton, implantable spinal cord stimulator, etc.) directly, bypassing the physiological neural command pathway interrupted by spinal cord injury. A set of algorithms to decode neuronal activity recorded at the level of the cerebral cortex (electrocorticogram) using chronic WIMAGINE implants was developed at CLINATEC and tested in the framework of two clinical research protocols, in tetraplegic patients in Grenoble and in paraplegic patients in Lausanne. The postdoctoral fellow will contribute to the next highly ambitious scientific breakthroughs addressing the medical needs of patients. A crucial improvement in usability may be achieved by alleviating the need for constant BMI decoder recalibration, by introducing an auto-adaptive framework that trains the decoder during self-directed use of the neuroprosthetics. An auto-adaptive BMI (A-BMI) adds a supplementary loop that evaluates, from the neuronal data, the level of coherence between the user's intended motions and the effector actions. This loop can provide BMI task information (labels) for the data recorded during self-directed use, to be employed for real-time updates of the BMI decoder. The innovative A-BMI neural decoder will be explored and tested offline and in real time in the ongoing clinical trials.
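To make the auto-adaptive idea concrete, here is a minimal sketch of an online decoder update driven by a coherence estimate. The decoder, the coherence estimator and all signals are toy placeholders under stated assumptions; they do not reproduce the CLINATEC algorithms or the WIMAGINE data format.

```python
# Minimal sketch of an auto-adaptive decoder loop: a linear decoder is updated
# online only on samples that the supplementary loop judges coherent.
import numpy as np

rng = np.random.default_rng(0)

class AdaptiveLinearDecoder:
    """Linear decoder updated online by stochastic gradient descent."""
    def __init__(self, n_features, n_outputs, learning_rate=1e-3):
        self.W = np.zeros((n_features, n_outputs))
        self.lr = learning_rate

    def predict(self, x):
        return x @ self.W

    def update(self, x, target):
        error = self.predict(x) - target          # squared-error gradient step
        self.W -= self.lr * np.outer(x, error)

def supplementary_loop(neural_features, effector_action):
    """Placeholder A-BMI loop: estimate from neuronal data the coherence between
    intended motion and effector action, and return a task label."""
    coherence = rng.uniform()                     # stand-in for a real estimator
    label = effector_action + rng.normal(scale=0.1, size=effector_action.shape)
    return coherence, label

decoder = AdaptiveLinearDecoder(n_features=64, n_outputs=3)
for _ in range(1000):                             # self-directed use
    x = rng.normal(size=64)                       # placeholder ECoG feature vector
    command = decoder.predict(x)                  # drives the effector
    coherence, label = supplementary_loop(x, command)
    if coherence > 0.8:                           # only trust coherent samples
        decoder.update(x, label)                  # real-time decoder update
```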

Simulation of a porous medium subjected to high-speed impacts

The control of the dynamic response of complex materials (foam, ceramic, metal, composite) subjected to intense loading (energy deposition, hypervelocity impact) is a major issue for many applications developed and carried out by the French Atomic Energy Commission (CEA). In this context, CEA CESTA is developing mathematical models to describe the behavior of materials subjected to hypervelocity impacts. Thus, in the context of the ANR ASTRID project SNIP (Numerical Simulation of Impacts in Porous Media), in collaboration with IUSTI (Aix-Marseille Université), studies on the modeling of porous materials are conducted. They aim to develop innovative models that are more robust and overcome the theoretical shortcomings of existing methods (thermodynamic consistency, preservation of the entropy principle). In the context of this post-doc, the candidate will first carry out a literature review of the methods and models developed at IUSTI and CEA CESTA in order to understand their differences. The candidate will then study the compatibility between the model developed at IUSTI and the numerical resolution methods used in the hydrodynamics code of CEA CESTA. He or she will propose adaptations and improvements of this model to take into account all the physical phenomena to be captured (plasticity, shear stresses, presence of fluid inclusions, damage) and to make its integration into the computation code possible. After a development phase, the validation of this work will be carried out via comparisons with other existing models, as well as against experimental impact results from the literature and from the CEA database.

Computational statistics for post-flight analysis in atmospheric reentry

The post-doctorate takes place in the context of flight tests of an instrumented vehicle (space shuttle, capsule or probe) entering the atmosphere. The aim is to reconstruct, from measurements (inertial unit, radar, meteorological balloon, etc.), the trajectory and various quantities of interest, in order to better understand the physical phenomena and to validate the predictive models. We focus on Bayesian statistics, associated with Markov chain Monte Carlo (MCMC) methods. The post-doctoral fellow will develop and extend the proposed approach and will benefit from a scientific collaboration with Audrey Giremus, professor at the University of Bordeaux and a specialist in the field. We will in particular try to improve the performance of high-dimensional sampling. Special attention will be paid to the machine-learning issue of exploiting an aerological database. The final objective is to develop an evolving software prototype dedicated to the post-flight analysis of flight tests that exploits the various sources of information. The evaluations will be based on simulated and real data, with comparisons to existing tools. The collaboration will lead to scientific communications and publications.
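As a minimal illustration of the Bayesian/MCMC ingredient, the sketch below runs a random-walk Metropolis sampler to recover a single trajectory-related parameter from noisy synthetic measurements. The forward model, noise level and prior are hypothetical placeholders, not the CEA post-flight models.

```python
# Minimal random-walk Metropolis sketch for Bayesian reconstruction of one
# parameter (e.g. a drag-like coefficient) from noisy measurements.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, t):
    # Toy forward model standing in for the trajectory/measurement code.
    return np.exp(-theta * t)

t = np.linspace(0.0, 10.0, 50)
theta_true = 0.3
data = simulate(theta_true, t) + rng.normal(scale=0.02, size=t.size)

def log_posterior(theta, sigma=0.02):
    if theta <= 0.0:                               # positivity prior
        return -np.inf
    residual = data - simulate(theta, t)
    return -0.5 * np.sum((residual / sigma) ** 2)  # flat prior on theta > 0

samples, theta = [], 0.1
log_p = log_posterior(theta)
for _ in range(20000):
    proposal = theta + rng.normal(scale=0.02)      # random-walk proposal
    log_p_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < log_p_prop - log_p: # Metropolis acceptance
        theta, log_p = proposal, log_p_prop
    samples.append(theta)

burned = np.array(samples[5000:])                  # discard burn-in
print(f"posterior mean = {burned.mean():.3f}, std = {burned.std():.3f}")
```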

Development and application of Inverse Uncertainty Quantification methods in thermal-hydraulics within the new OECD/NEA activity ATRIUM

Within the Best Estimate Plus Uncertainty (BEPU) methodologies for the safety analysis of Nuclear Power Plants (NPPs), one of the crucial issues is to quantify the input uncertainties associated with the physical models in the code. Such a quantification consists in assessing the probability distributions of the input parameters needed for uncertainty propagation, through a comparison between simulations and experimental data. It is usually referred to as Inverse Uncertainty Quantification (IUQ).
In this framework, the Service of Thermal-hydraulics and Fluid Dynamics (STMF) at CEA-Saclay has proposed a new international project within the OECD/NEA WGAMA working group, called ATRIUM (Application Tests for Realization of Inverse Uncertainty quantification and validation Methodologies in thermal-hydraulics). Its main objectives are to perform a benchmark on relevant IUQ exercises, to prove the applicability of the SAPIUM guideline, and to promote best practices for IUQ in thermal-hydraulics. It is proposed to quantify the uncertainties associated with some physical phenomena relevant during a Loss Of Coolant Accident (LOCA) in a nuclear reactor. Two main IUQ exercises of increasing complexity are planned: the first concerns the critical flow at the break, and the second the post-CHF heat transfer phenomena. Particular attention will be devoted to evaluating the adequacy of the experimental databases for extrapolation to the study of a LOCA in a full-scale reactor. Finally, the obtained input model uncertainties will be propagated on a suitable Integral Effect Test (IET) to validate their application in experiments at a larger scale and possibly justify the extrapolation to reactor scale.
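The sketch below illustrates, on synthetic data, the basic IUQ step described above: inferring the distribution of a multiplicative model parameter from experiment/simulation pairs and then propagating it forward. The log-normal form, the numbers and the variable names are assumptions chosen for illustration only, not ATRIUM data or the SAPIUM procedure.

```python
# Minimal sketch of an inverse uncertainty quantification step: estimate the
# distribution of a multiplicative model parameter (e.g. a critical-flow
# multiplier) from measured vs. simulated values, then propagate it.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical separate-effect test data: simulated and measured mass flow rates.
simulated = rng.uniform(500.0, 2000.0, size=40)              # kg/s, nominal code runs
true_multiplier = rng.lognormal(mean=np.log(0.9), sigma=0.1, size=40)
measured = simulated * true_multiplier                        # synthetic "experiments"

# Assume a log-normal multiplier lambda = measured / simulated and estimate its
# parameters; the result feeds the forward (BEPU) uncertainty propagation.
log_ratio = np.log(measured / simulated)
mu, sigma = log_ratio.mean(), log_ratio.std(ddof=1)
print(f"multiplier ~ LogNormal(median={np.exp(mu):.3f}, sigma={sigma:.3f})")

# Propagation: sample the calibrated multiplier and rescale a nominal prediction
# (stand-in for rerunning the thermal-hydraulics code on an IET).
samples = rng.lognormal(mean=mu, sigma=sigma, size=10000)
prediction = 1500.0 * samples                                 # hypothetical IET value
low, high = np.percentile(prediction, [2.5, 97.5])
print(f"95% band: [{low:.0f}, {high:.0f}] kg/s")
```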

Thermo-aeraulic numerical simulation of an incineration reactor

An incineration and vitrification process devoted to the treatment of alpha-contaminated organic/metallic wastes originating from MOX production facilities is currently under development at the LPTI laboratory (Laboratoire des Procédés Thermiques Innovants) of CEA Marcoule. The development program relies on full-scale mock-up investigation tests as well as 3D numerical simulation studies.
The thermo-aeraulic model of the incineration reactor, developed with the Ansys-Fluent commercial software, is composed of several elementary building blocks (plasma, pyrolysis, combustion, particle transport).
The proposed work consists in improving the model, in particular the pyrolysis and combustion components: chemical reactions, unsteady processes, etc. The representativeness of the model will be assessed through a comparative study using data from experiments carried out on the prototype reactor. Besides this development work, parametric studies will be performed in order to evaluate the impact of possible reactor design modifications.
To investigate the radiological behaviour of the reactor during the incineration of alpha-contaminated wastes, a particle transport model (discrete phase model, DPM) coupled with a wall-interaction model will be implemented. The simulation results will be compared to experimental data obtained from the analysis of deposits collected on the reactor walls (experimental tests are performed with inactive actinide surrogates).
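For illustration only, the sketch below shows the general shape of Lagrangian particle tracking with a crude "stick on contact" wall rule. The flow field, geometry, drag law and deposition criterion are toy assumptions and bear no relation to the Ansys-Fluent DPM setup used at LPTI.

```python
# Minimal sketch of Lagrangian particle tracking with a simple wall-deposition
# rule: polydisperse particles are carried by an upward draft, settle under
# gravity, and deposit if they reach the wall before escaping at the outlet.
import numpy as np

rng = np.random.default_rng(0)

def gas_velocity(x):
    # Placeholder for the thermo-aeraulic flow field (uniform upward draft, m/s).
    return np.array([0.0, 1.0])

g = np.array([-9.81, 0.0])               # gravity toward the wall at x = -0.1 m
dt = 1e-3                                # time step [s]
n_particles, deposited = 500, 0

for _ in range(n_particles):
    tau_p = rng.uniform(1e-3, 2e-2)      # polydisperse relaxation times [s]
    x = np.array([0.0, 0.0])             # injection point
    v = np.array([0.0, 1.0])             # injected with the gas velocity
    for _ in range(5000):
        v = v + dt * ((gas_velocity(x) - v) / tau_p + g)   # Stokes drag + gravity
        x = x + dt * v
        if x[0] <= -0.1:                 # particle reaches the wall: deposit
            deposited += 1
            break
        if x[1] >= 2.0:                  # particle escapes at the outlet
            break

print(f"deposited fraction: {deposited / n_particles:.1%}")
```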

Development and optimization of adaptive mesh refinement methods for fluid/structure interaction problems in a context of high performance computing

A new simulation code for structural and compressible fluid mechanics, named Manta, is currently under development at the French CEA. This code aims both at unifying the features of CEA’s legacy implicit and explicit codes and at being natively HPC-oriented. With its many numerical methods (finite elements, finite volumes, hybrid methods, phase field, implicit or explicit solvers, etc.), Manta enables the simulation of various kinds of static or dynamic mechanical problems involving fluids, structures, or fluid-structure interactions.

When seeking to optimize computation time, Adaptive Mesh Refinement (AMR) is a typical method for increasing numerical accuracy while keeping the computational load under control.

This postdoctoral position aims at defining and implementing parallel AMR algorithms in a high performance computing context, for fluid/structure interaction problems.

In a preliminary step, the functionalities for hierarchical AMR, such as cell refinement and coarsening, field transfers from parent to child cells, refinement criteria, and hanging-node management, will be integrated into Manta. This first work will probably rely on external libraries that remain to be identified.
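The sketch below illustrates, in a few lines, the hierarchical bookkeeping involved (quadtree refinement, order-0 parent-to-child field transfer, volume-averaged coarsening, and a refinement indicator). It is purely illustrative and assumes nothing about Manta's data structures or any particular external AMR library.

```python
# Minimal sketch of hierarchical (quadtree-style) AMR bookkeeping.
import numpy as np

class Cell:
    def __init__(self, center, size, value, level=0):
        self.center, self.size, self.value, self.level = center, size, value, level
        self.children = []

    def refine(self):
        # Split into 4 children; inject the parent field value (order-0 transfer).
        h = self.size / 2.0
        offsets = [(-h/2, -h/2), (h/2, -h/2), (-h/2, h/2), (h/2, h/2)]
        self.children = [Cell(self.center + np.array(o), h, self.value, self.level + 1)
                         for o in offsets]

    def coarsen(self):
        # Restrict the children field back to the parent (average) and drop them.
        self.value = np.mean([c.value for c in self.children])
        self.children = []

def adapt(cell, indicator, refine_tol, max_level):
    """One adaptation pass: refine leaves where the error indicator is large."""
    if cell.children:
        for c in cell.children:
            adapt(c, indicator, refine_tol, max_level)
    elif indicator(cell) > refine_tol and cell.level < max_level:
        cell.refine()

# Hypothetical usage: refine where the indicator is large near the origin.
root = Cell(center=np.array([0.0, 0.0]), size=1.0, value=1.0)
adapt(root, indicator=lambda c: 1.0 / (0.1 + np.linalg.norm(c.center)),
      refine_tol=3.0, max_level=4)
```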

In a second step, the distributed-memory parallel performance will be optimized. In particular, strategies for load balancing between MPI processes should be studied, especially for fluid/structure interaction problems.

Finally, especially for explicit-in-time computations, one will have to define and implement spatially adapted time stepping to cope with the several refinement levels and the different wave propagation velocities.
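A minimal sketch of such level-wise time stepping is given below: each refinement level advances with its own CFL-limited time step, and finer levels are subcycled so that all levels meet at the coarse step. The wave speed, refinement ratio and CFL number are illustrative values, not Manta parameters.

```python
# Minimal sketch of level-wise (local) time stepping for explicit schemes.
coarse_dx = 1.0e-2          # coarse cell size [m] (illustrative)
wave_speed = 5.0e3          # fastest wave propagation velocity [m/s] (illustrative)
cfl = 0.8
refinement_ratio = 2
n_levels = 3

dt_coarse = cfl * coarse_dx / wave_speed
for level in range(n_levels):
    dx = coarse_dx / refinement_ratio**level
    dt = cfl * dx / wave_speed
    n_substeps = refinement_ratio**level        # subcycles per coarse step
    print(f"level {level}: dx={dx:.1e} m, dt={dt:.2e} s, {n_substeps} substeps")
```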

These last two points are expected to give rise to publications in specialized scientific journals.

Use and extension of the Alien solver library with the proto-application Helix

First, the post-doc will integrate the Alien solver library into Helix in order to carry out performance and usability assessments in iterative and direct solver configurations. These assessments will be done on different computer architectures, from desktop computers to national supercomputers with thousands of cores.
In a second phase, the candidate will investigate adding new functionalities to the Alien library to solve nonlinear systems composed of equations and inequations, so as to address, in an HPC context, mechanical problems such as phase-field or contact problems, which often remain open in the community. The results will be compared to the classical test cases and state-of-the-art benchmarks in the domain.
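As an illustration of the kind of equation/inequation system involved, the sketch below solves a small linear complementarity problem (typical of contact) with a projected Gauss-Seidel iteration. It shows the numerical ingredient only; it does not use, extend, or reproduce the Alien or Helix APIs.

```python
# Minimal sketch: projected Gauss-Seidel for a linear complementarity problem
# x >= 0, A x - b >= 0, x^T (A x - b) = 0 (a simple model of contact conditions).
import numpy as np

def projected_gauss_seidel(A, b, lower=0.0, n_iter=200):
    x = np.zeros_like(b)
    for _ in range(n_iter):
        for i in range(len(b)):
            residual = b[i] - A[i] @ x + A[i, i] * x[i]   # exclude diagonal term
            x[i] = max(lower, residual / A[i, i])          # update + projection
    return x

# Hypothetical usage on a small symmetric positive-definite system.
rng = np.random.default_rng(0)
M = rng.normal(size=(10, 10))
A = M @ M.T + 10 * np.eye(10)
b = rng.normal(size=10)
x = projected_gauss_seidel(A, b)
print("complementarity residual:", float(x @ (A @ x - b)))
```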
The candidate will join the Helix development team, currently composed of three to four developers within the LM2S laboratory (15 people). A transversal program between CEA directions funds the post-doc, and the candidate will collaborate with the Alien library developers at the DAM of CEA.

Detection of cyber-attacks in a smart multi-sensor embedded system for soil monitoring

The post-doc is concerned with the application of machine learning methods to detect potential cyber-security attacks on a connected multi-sensor system. The application domain is agriculture, where CEA Leti has several projects, among which the H2020 project SARMENTI (Smart multi-sensor embedded and secure system for soil nutrient and gaseous emission monitoring). The objective of SARMENTI is to develop and validate a secure, low-power multi-sensor system connected to the cloud to perform in-situ soil nutrient analysis and to provide decision support to farmers by monitoring soil fertility in real time. Within this topic, the post-doc will carry out the cyber-security analysis to determine the main risks in our multi-sensor case and will investigate an attack detection module. The underlying detection algorithm will be based on anomaly detection, e.g., a one-class classifier. The work has three parts: implementing the probes that monitor selected events, the communication infrastructure that connects the probes to the detector, and the detector itself.
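A minimal sketch of the detector idea is given below: a one-class classifier is trained on features of normal probe events and flags deviating events as potential attacks. The feature set, values and thresholds are hypothetical, not the SARMENTI design.

```python
# Minimal sketch of anomaly detection with a one-class classifier.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Hypothetical per-window features reported by the probes:
# [message rate, mean payload size, radio wake-ups, failed authentications]
normal_traffic = rng.normal(loc=[10.0, 64.0, 5.0, 0.0],
                            scale=[2.0, 8.0, 1.0, 0.1],
                            size=(2000, 4))

detector = make_pipeline(StandardScaler(), OneClassSVM(nu=0.01))
detector.fit(normal_traffic)                 # one-class training on normal behaviour

suspicious = np.array([[60.0, 512.0, 40.0, 5.0]])   # e.g. a flooding attempt
print(detector.predict(suspicious))          # -1 = anomaly, +1 = normal
```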

Data science for heterogeneous materials

In order to predict the functional properties of heterogeneous materials through numerical simulation, reliable data on the spatial arrangement and properties of the constitutive phases are needed. A variety of experimental tools is commonly used at the laboratory to spatially characterize the physical and chemical properties of materials, generating "hyperspectral" datasets. One path towards an improved understanding of the phenomena is the combination of the various imaging techniques using the methods of data science. The objectives of this post-doc are to enrich material knowledge by developing tools to discover correlations in the datasets (for example between chemical composition and mechanical behavior), and to increase reliability and confidence in these data by combining techniques and physical constraints. These tools will be applied to datasets of interest regarding cementitious materials and corrosion product layers from archaeological artifacts.
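As a minimal illustration of such correlation discovery, the sketch below correlates, pixel by pixel, two co-registered property maps with a mechanical response map. All maps are synthetic placeholders; the specific quantities (calcium content, porosity, modulus) are assumptions chosen for illustration, not laboratory data.

```python
# Minimal sketch of cross-technique correlation analysis on co-registered maps.
import numpy as np

rng = np.random.default_rng(0)
h, w = 128, 128                                   # hypothetical map size

calcium = rng.uniform(0.0, 0.4, size=(h, w))      # chemical map (mass fraction)
porosity = rng.uniform(0.05, 0.25, size=(h, w))   # structural map
# Toy mechanical response correlated with both, plus measurement noise.
modulus = 40.0 + 60.0 * calcium - 80.0 * porosity + rng.normal(scale=2.0, size=(h, w))

features = np.stack([calcium.ravel(), porosity.ravel(), modulus.ravel()])
corr = np.corrcoef(features)                      # pixel-wise correlation matrix
print("corr(Ca, modulus)       =", round(corr[0, 2], 2))
print("corr(porosity, modulus) =", round(corr[1, 2], 2))
```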
