Application of formal methods for interference management
You will join a multidisciplinary technological research team of experts in SW/HW co-design tools based on formal methods, and be involved in a national research project aiming to develop an environment to identify, analyze and reduce the interferences generated by the concurrent execution of applications on a heterogeneous commercial-off-the-shelf (COTS) multi-core hardware platform.
Design of in-memory high-dimensional-computing system
Conventional von Neumann architectures face many challenges in dealing with data-intensive artificial intelligence tasks efficiently, due to the huge amounts of data moved between physically separated computing and storage units. Novel computing-in-memory (CIM) architectures implement data processing and storage in the same place, and can thus be far more energy-efficient than state-of-the-art von Neumann architectures. Compared with CIM systems based on other memory technologies, resistive random-access memory (RRAM)-based systems can consume much less power and area when processing the same amount of data. This makes RRAM very attractive for both in-memory and neuromorphic computing applications.
In the field of machine learning, convolutional neural networks (CNNs) are now widely used for artificial intelligence applications due to their strong performance. Nevertheless, for many tasks, machine learning requires large amounts of data and may be computationally very expensive and time-consuming to train, with important issues (overfitting, exploding gradients and class imbalance). Among alternative brain-inspired computing paradigms, high-dimensional computing (HDC), based on random distributed representations, offers a promising approach for learning tasks. Unlike conventional computing, HDC computes with D-dimensional (pseudo-)random hypervectors. This brings significant advantages: a simple algorithm with a well-defined set of arithmetic operations, and fast, single-pass learning that can benefit from a memory-centric architecture (highly energy-efficient and fast thanks to a high degree of parallelism).
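The HDC operation set mentioned above can be illustrated with a minimal numpy sketch (a generic textbook formulation of bipolar HDC, not a description of any specific CEA implementation):

```python
import numpy as np

D = 10_000                      # hypervector dimensionality (typically ~10k)
rng = np.random.default_rng(0)

def random_hv():
    """Random bipolar hypervector in {-1, +1}^D."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (element-wise multiplication): associates two hypervectors."""
    return a * b

def bundle(hvs):
    """Bundling (element-wise majority vote): superposes several hypervectors."""
    return np.sign(np.sum(hvs, axis=0))

def similarity(a, b):
    """Normalized dot product: ~0 for unrelated hypervectors, ~1 for identical."""
    return float(a @ b) / D

# Single-pass learning: a class prototype is simply the bundle of the
# hypervectors encoding its training examples.
x1, x2, x3 = random_hv(), random_hv(), random_hv()
prototype = bundle([x1, x2, x3])
```

Classification then reduces to comparing a query hypervector against each class prototype with `similarity`, an operation that maps naturally onto a memory-centric architecture.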
Quantum dot auto-tuning assisted by physics-informed neural networks
Quantum computers hold great promise for advancing science, technology, and society by solving problems beyond classical computers' capabilities. One of the most promising quantum bit (qubit) technologies is the spin qubit, based on quantum dots (QDs), which leverages the great maturity and scalability of semiconductor technologies. However, scaling up the number of spin qubits requires overcoming significant engineering challenges, such as the charge tuning of a very large number of QDs. The QD tuning process involves multiple complex steps that are currently performed manually by experimentalists, which is cumbersome and time-consuming. It is now crucial to address this problem in order to both accelerate R&D and enable truly scalable quantum computers.
The main goal of the postdoctoral project is to develop automatic QD tuning software combining Bayesian neural networks (BayNNs) and a QD physical model fitted to the behavior of CEA-Leti's devices. This innovative approach, leveraging BayNN uncertainty estimates and the predictive power of QD models, will enable fast and non-ideality-resilient automatic QD tuning.
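The core idea of measuring where the model is most uncertain can be sketched with a toy Bayesian loop. Probabilistic bisection over a noisy binary charge-sensor readout stands in here for the full BayNN + physical-model approach, and every number (transition voltage, readout error rate) is a hypothetical placeholder:

```python
import numpy as np

rng = np.random.default_rng(1)

TRUE_TRANSITION = 0.37   # hypothetical charge-transition gate voltage (a.u.)
NOISE = 0.1              # probability that a single readout is flipped

def measure(v):
    """Noisy binary readout: is the dot past its charge transition at voltage v?"""
    outcome = v >= TRUE_TRANSITION
    if rng.random() < NOISE:
        outcome = not outcome
    return outcome

def tune(n_steps=60):
    grid = np.linspace(0.0, 1.0, 1001)
    posterior = np.full(grid.size, 1.0 / grid.size)   # uniform prior on position
    for _ in range(n_steps):
        # Measure at the posterior median: the most informative point given
        # the current uncertainty (probabilistic bisection).
        v = grid[np.searchsorted(np.cumsum(posterior), 0.5)]
        past = measure(v)
        # Bayesian update that accounts for the known readout error rate.
        if past:
            lik = np.where(grid <= v, 1 - NOISE, NOISE)
        else:
            lik = np.where(grid <= v, NOISE, 1 - NOISE)
        posterior = posterior * lik
        posterior /= posterior.sum()
    return float(grid[np.argmax(posterior)])

estimate = tune()
```

Despite the 10% readout error, the posterior concentrates around the true transition in a few dozen measurements, which is the resilience property the project targets at much larger scale.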
Numerical metamodel-based study of ultrasonic wave propagation in piping systems with corroded areas
The aim of the ANR project PYRAMID (http://www.agence-nationale-recherche.fr/Projet-ANR-17-CE08-0046) is to develop techniques for detecting and quantifying the wall thinning caused by flow-accelerated corrosion in piping systems. In the framework of this project involving French and Japanese laboratories, CEA LIST is developing new finite-element-based numerical tools dedicated to modelling ultrasonic guided waves diffracted by corrosion in an elbow pipe. These solutions support the design of an inspection process based on electromagnetic-acoustic transduction (EMAT). To this end, CEA LIST's ability to build metamodels of its physical models will be the key asset enabling intensive use of simulation.
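The metamodeling idea can be illustrated in a few lines: fit a cheap surrogate to a handful of expensive simulator runs, then query the surrogate intensively. The "expensive simulation" below is a made-up analytic stand-in, not a CEA LIST model, and a polynomial fit stands in for whatever metamodel family is actually used:

```python
import numpy as np

# Hypothetical stand-in for an expensive finite-element run: echo amplitude of
# a guided wave as a function of residual wall thickness (arbitrary units).
def expensive_simulation(thickness):
    return np.exp(-3.0 * thickness) * np.cos(8.0 * thickness)

# Build the metamodel from a small design of experiments...
train_x = np.linspace(0.1, 1.0, 12)
train_y = expensive_simulation(train_x)
surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=6))

# ...then query it intensively at negligible cost (e.g. for inversion or
# sensitivity studies), instead of re-running the solver a thousand times.
dense_x = np.linspace(0.1, 1.0, 1000)
max_err = np.max(np.abs(surrogate(dense_x) - expensive_simulation(dense_x)))
```

The trade-off is the usual one: a dozen expensive runs buy a model that is cheap enough for intensive use, at the price of a bounded approximation error.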
Machine learning techniques combined with a knowledge-based simulator for dynamic process state estimation
This project aims to estimate the real state of a dynamic liquid-liquid extraction process from recorded real data. Such data are uncertain due to exogenous variables, which are not included in PAREX+, the simulator dedicated to the dynamic process. The first part of the project is therefore to collect data from the simulator, so that the operational domain is well covered and the dynamic responses are recorded. The project then focuses on solving the inverse problem using convolutional neural networks (CNNs) on time series. Data enrichment may be necessary to refine poorly covered zones and improve the estimations. Finally, the CNN will be tested on real data and will integrate uncertainty into its estimations.
Ultimately, the resulting model is intended for use in operational conditions, to support diagnosis and improve real-time control by ensuring that the observed dynamics match the target ones.
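The workflow above (run the simulator to cover the operational domain, learn the inverse map, apply it to a real record) can be sketched as follows. A ridge regressor stands in for the project's CNN, and the one-parameter "simulator" is purely hypothetical, not PAREX+:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for the process simulator: maps a hidden process state (a single
# hypothetical extraction-rate parameter) to an observed time series.
t = np.linspace(0.5, 10.0, 10)

def simulate(state):
    return 1.0 - np.exp(-state * t)

# 1) Cover the operational domain with simulator runs (training set).
states = rng.uniform(0.2, 2.0, size=500)
X = np.array([simulate(s) for s in states])
X = np.hstack([X, np.ones((len(X), 1))])          # bias column

# 2) Learn the inverse map (time series -> state). Ridge regression stands in
#    here for the 1D CNN of the project.
lam = 0.1
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ states)

# 3) Apply the learned inverse model to a noisy "real" record.
true_state = 1.3
record = simulate(true_state) + rng.normal(0, 0.01, size=t.size)
estimate = float(np.hstack([record, 1.0]) @ w)
```

The same structure carries over when the regressor is a CNN: the simulator supplies as many labelled pairs as needed, and the noise on the real record is where the uncertainty treatment enters.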
Hydrothermal carbonization as a pretreatment of wastes before their thermochemical conversion by gasification
Gasification, a thermochemical conversion generally performed at about 850 °C, produces a gas that can be valorised in cogeneration or for the synthesis of chemical products or fuels. Some bottlenecks remain, mainly for the gasification of wastes of biogenic or fossil origin: irregular feeding of the reactor due to heterogeneity in form and composition, and the formation of inorganic gaseous pollutants (HCl, KCl, NaCl, H2S) or organic ones (tars), which are harmful to the process and/or decrease its efficiency, and must be removed before the final application.
The objective of the post-doctoral work will be to test and optimize a pre-treatment step of the resource based on hydrothermal carbonisation (HTC). This transformation is performed at 180-250 °C in a wet, pressurised environment (2-10 MPa). The principal product is a carbonaceous solid residue (hydrochar) that can be valorised by gasification. HTC aims to limit the release of inorganic and organic pollutants during gasification, and to homogenise and improve the physical properties of the resource.
The proposed approach will consist of: experiments in batch reactors on pre-selected resources and model materials, together with quantification and analysis of the products; analysis of the results, aiming to elucidate the links between the resource and the properties of the hydrochar as a function of operating conditions; and an evaluation of the mass and energy balances of the HTC-gasification process.
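The structure of such a mass/energy balance can be sketched as below. Every number is a hypothetical placeholder chosen only to show the bookkeeping; the real values come from the batch experiments and product analyses:

```python
# Illustrative first-order mass/energy balance for an HTC step.
feed_mass = 100.0          # kg of dry feed (hypothetical)
hydrochar_yield = 0.65     # kg hydrochar per kg dry feed (hypothetical)
lhv_feed = 15.0            # lower heating value of the feed, MJ/kg (hypothetical)
lhv_hydrochar = 20.0       # lower heating value of the hydrochar, MJ/kg (hypothetical)
htc_heat_input = 2.5       # process heat per kg of feed, MJ/kg (hypothetical)

# Mass balance: solid recovered after HTC.
hydrochar_mass = feed_mass * hydrochar_yield

# Energy yield: chemical energy retained in the hydrochar vs. the feed.
energy_yield = (hydrochar_mass * lhv_hydrochar) / (feed_mass * lhv_feed)

# Net efficiency: same ratio once the HTC process heat is charged to the input.
net_efficiency = (hydrochar_mass * lhv_hydrochar) / (
    feed_mass * (lhv_feed + htc_heat_input))
```

With these placeholder values the hydrochar retains most of the feed's chemical energy while concentrating it in a denser, more homogeneous solid, which is the trade-off the evaluation will quantify with measured data.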
Development of Artificial Intelligence processing for a measurement and forecasting station
This post-doctoral proposal is part of the French Alternative Energies and Atomic Energy Commission (CEA) project "MultiMod'Air", which aims to develop an "intelligent" prototype air-quality measurement and forecasting station within two years. The proposed work is to develop the Artificial Intelligence (AI) building blocks of the project: correction of the measurements from low-cost sensors by Artificial Neural Networks (ANNs), and ANN-based correction of weather forecasts at the station level, both of which are straightforward treatments to implement. The actual research work will concern the development of an AI-based pollution forecast at the station, learned from past events.
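The sensor-correction brick can be illustrated with synthetic data: low-cost sensors typically show gain, offset and humidity cross-sensitivity errors that a model learns to remove by regression against a reference station. A linear least-squares model stands in below for the project's ANN, and all error coefficients are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic example: a low-cost sensor reads the true concentration with a
# gain error, an offset and a humidity cross-sensitivity (all hypothetical).
n = 1000
truth = rng.uniform(5, 80, n)            # reference-station concentration
humidity = rng.uniform(20, 90, n)        # relative humidity (%)
raw = 1.3 * truth + 4.0 + 0.15 * humidity + rng.normal(0, 1.0, n)

# Calibration model: predict the reference value from the raw reading and the
# covariate (a linear stand-in for the project's ANN).
X = np.column_stack([raw, humidity, np.ones(n)])
w, *_ = np.linalg.lstsq(X, truth, rcond=None)

corrected = X @ w
rmse_raw = np.sqrt(np.mean((raw - truth) ** 2))
rmse_corr = np.sqrt(np.mean((corrected - truth) ** 2))
```

An ANN replaces the linear model when the sensor response is nonlinear or drifts with several covariates at once, but the training setup (co-location with a reference, minimise the residual) is the same.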
DTCO analysis of MRAM for In/Near-Memory Computing
The energy cost associated with moving data across the memory hierarchy has become a limiting factor in modern computing systems. To mitigate this trend, novel computing architectures favoring more local and parallel processing of the stored information have been proposed, under the labels "Near/In-Memory Computing" or "Processing In Memory". Substantial benefits are expected in particular for computationally complex tasks (e.g. combinatorial optimization, graph analysis, cryptography) and data-intensive tasks (e.g. video stream analysis, bio-informatics). Such applications are especially demanding in terms of endurance, latency and density. SRAM fulfils the first two criteria but may ultimately suffer from its footprint and static power consumption. This prompts the evaluation of denser, non-volatile alternative memory technologies, with magnetoresistive memories (MRAM) currently leading in terms of speed-endurance trade-off.
The primary objective will be to estimate the improvements brought by MRAM in terms of array-level power, performance and area (PPA), compared with SRAM-based on-chip memories, at advanced technology nodes. The candidate will establish an analysis and benchmarking workflow for various classes of MRAM, and optimize single bit cells based on a compact model of the memory element. This baseline approach will then be adapted to the functional variations specific to in-memory computing (IMC) in order to assess the benefits of MRAM on an integrated test vehicle.
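The skeleton of such an array-level comparison is simple bookkeeping on top of bitcell parameters extracted from compact models. The numbers below are hypothetical placeholders, not measured data for any technology node, and serve only to show the shape of the workflow:

```python
# First-order array-level PPA estimate from bitcell-level parameters.
def array_ppa(bitcell_area_um2, access_energy_pj, access_time_ns,
              leakage_uw_per_mb, size_mb):
    """Scale per-bitcell figures (hypothetical inputs) to a full array."""
    bits = size_mb * 8 * 1024 ** 2
    return {
        "area_mm2": bits * bitcell_area_um2 * 1e-6,
        "access_energy_pj": access_energy_pj,    # per access
        "latency_ns": access_time_ns,
        "leakage_uw": leakage_uw_per_mb * size_mb,
    }

# Hypothetical corners for a 1 MB on-chip array.
sram = array_ppa(0.20, 10.0, 1.0, 500.0, 1)   # dense, fast, leaky (placeholder)
mram = array_ppa(0.05, 30.0, 5.0, 5.0, 1)     # denser, non-volatile (placeholder)

area_gain = sram["area_mm2"] / mram["area_mm2"]
```

In the actual workflow these inputs would come from optimized bit cells simulated with the compact model, and peripheral circuitry (sense amplifiers, write drivers) would be added before drawing conclusions.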
Combinatorial optimization of base materials for the design of new materials
The design of new materials is a field of growing interest, especially with the emergence of additive manufacturing processes, thin-film deposition, etc. To create new materials with properties of interest for a given application area, it is often necessary to mix several raw materials.
Physicochemical modeling of the reactions that occur during such mixing is often very difficult, especially as the number of raw materials increases, and we want to avoid depending on it as much as possible. Based on experimental data and domain knowledge, the goal of this project is to create a symbolic AI capable of searching, by trial and error, for the optimal mixture achieving one or more given properties. The idea is to adapt existing operations-research methods, such as combinatorial optimization, to a context of imprecise knowledge.
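The combinatorial search over mixtures can be sketched minimally as follows. A toy analytic "measurement" with an interaction term stands in for the real experiments, and the target value and grid step are arbitrary choices for the example:

```python
# Hypothetical black-box: measured property of a mixture of three raw
# materials (fractions summing to 1). In reality this is an experiment or
# expert knowledge, not a formula.
def measured_property(frac):
    a, b, c = frac
    return 4.0 * a + 2.5 * b + 1.0 * c - 1.5 * a * b   # toy interaction term

TARGET = 2.8   # desired property value (hypothetical)

# Enumerate candidate mixtures on a coarse grid (combinatorial search),
# keeping the one whose measured property is closest to the target.
step = 0.05
best, best_err = None, float("inf")
for a in range(0, 21):
    for b in range(0, 21 - a):
        frac = (a * step, b * step, 1.0 - (a + b) * step)
        err = abs(measured_property(frac) - TARGET)
        if err < best_err:
            best, best_err = frac, err
```

Exhaustive enumeration works only for tiny instances; the project's contribution lies precisely in replacing it with operations-research methods that remain tractable and robust when each "evaluation" is an imprecise experiment.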
We will focus on different use cases such as electric batteries, solvents for photovoltaic cells and anti-corrosion materials.
Within the project, you will:
• Study the state of the art,
• Propose one or several algorithms, prototype them, and evaluate them,
• Disseminate the resulting innovations to the consortium and the scientific community, through presentations, contributions to technical reports and / or scientific publications.
Maximum duration: 18-24 months (depending on your experience).
Modelling and evaluation of the future e-CO2 refinery
In the context of achieving carbon neutrality by 2050, the CEA initiated a project in 2021 to assess the relevance of coupling a nuclear power system with a direct atmospheric carbon capture (DAC) device, making use of the system's waste heat.
As a member of a team of about twenty experts (energy system evaluation, techno-economic engineering, energy system modeling, optimization and computer programming), you will participate in a research project on the modeling and evaluation of a CO2 refinery dedicated to the production of jet fuel, fed by a nuclear reactor and coupled with an atmospheric CO2 capture process.