Battery recycling: Development and understanding of a new deactivation concept for domestic lithium-ion batteries
Domestic lithium-ion batteries encompass all batteries used in electronic devices, mobile phones, and tooling applications. By 2030, the domestic lithium-ion battery market is expected to grow by up to 30%. Given the new European recycling regulation and the urgent need for greener and safer recycling processes, it is now necessary to develop new deactivation processes for domestic lithium-ion batteries.
The process has to address several lithium-ion chemistries, and be continuous, safe, controllable, and low-cost.
To develop this new concept, the first step will be to define the most appropriate chemical systems. These chemical systems will then be tested in a dedicated experimental laboratory setup using chemical and electrochemical methods, allowing the simulation of real conditions of domestic battery deactivation.
The third step will be to characterize, understand, and validate the electrochemical and physico-chemical mechanisms involved. The last step will be to contribute to the validation of the deactivation concept on a real object (a laptop battery) under representative conditions (on the abuse-test platform of CEA).
Unsupervised Few-Shot Detection of Signal Anomalies
Our laboratory, located at Digiteo in CEA Saclay, is looking for a postdoctoral candidate to work on anomaly detection in manufacturing processes, for a duration of 18 months starting from February 2022. This postdoc is part of HIASCI (Hybridation des IA et de la Simulation pour le Contrôle Industriel — hybridisation of AI and simulation for industrial control), a CEA LIST project carried out as an internal collaboration, which aims at building a platform of AI methods and tools for manufacturing applications ranging from quality control to process monitoring. Our laboratory contributes to HIASCI by developing efficient methods for detecting anomalies in acoustic or vibrational signals that operate with small amounts of training data. In this context, the detection of signal anomalies (DSA) consists of extracting from data information about the physical manufacturing process, which is in general too complex to be fully understood. Moreover, real data on abnormal states are relatively scarce and often expensive to collect. For these reasons, we favour a data-driven approach under the framework of Few-Shot Learning (FSL).
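To make the few-shot setting concrete, the sketch below scores a signal by its spectral distance to a handful of known-normal examples; the feature extractor, the k-nearest-neighbour scoring rule, and the synthetic signals are illustrative assumptions, not the laboratory's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_features(signal, n_bins=16):
    """Embed a signal as its log-magnitude spectrum pooled into coarse bins."""
    spec = np.abs(np.fft.rfft(signal))
    return np.log1p(np.array([b.mean() for b in np.array_split(spec, n_bins)]))

def anomaly_score(signal, support_set, k=3):
    """Mean distance to the k nearest known-normal examples; higher = more anomalous."""
    f = spectral_features(signal)
    dists = np.sort([np.linalg.norm(f - spectral_features(s)) for s in support_set])
    return float(dists[:k].mean())

# Few-shot support set: only five known-normal signals (a 50 Hz tone + mild noise).
t = np.linspace(0, 1, 512, endpoint=False)
normal = [np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(5)]
ok = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
faulty = np.sin(2 * np.pi * 50 * t) + 0.8 * rng.standard_normal(t.size)  # broadband fault
```

In practice a detection threshold on the score would be calibrated on held-out normal signals; no abnormal training data are needed, which matches the scarcity constraint above.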
Scalable digital architecture for qubit control in quantum computers
Scaling Quantum Processing Units (QPUs) to hundreds of qubits leads to profound changes in the control of the qubit matrix: this control will be split between a cryogenic part and a room-temperature counterpart outside the cryostat. Multiple constraints coming from the cryostat (thermal or mechanical constraints, for example) or from qubit properties (number of qubits, topology, fidelity, etc.) can affect architectural choices. Examples of these choices include qubit control (digital/analog), the instruction set, measurement storage, operation parallelism, and communication between the different parts of the accelerator. This postdoctoral research will focus on defining mid-term (100 to 1,000 qubits) and long-term (more than 10,000 qubits) architectures for qubit control at room temperature, starting from existing QPU middlewares (IBM Qiskit, for example) and taking into account the specific constraints of the QPU developed at CEA-Leti using solid-state qubits.
Application of formal methods for interferences management
Within a multidisciplinary technological research team of experts in SW/HW co-design tools based on formal methods, you will be involved in a national research project aiming at developing an environment to identify, analyze, and reduce the interferences generated by the concurrent execution of applications on a heterogeneous commercial-off-the-shelf (COTS) multi-core hardware platform.
Design of in-memory high-dimensional-computing system
The conventional von Neumann architecture faces many challenges in dealing efficiently with data-intensive artificial intelligence tasks, due to the huge amounts of data moved between physically separated computing and storage units. The novel computing-in-memory (CIM) architecture implements data processing and storage in the same place, and can thus be much more energy-efficient than state-of-the-art von Neumann architectures. Compared with CIM systems based on other memory technologies, resistive random-access memory (RRAM)-based CIM systems could consume much less power and area when processing the same amount of data. This makes RRAM very attractive for both in-memory and neuromorphic computing applications.
In the field of machine learning, convolutional neural networks (CNNs) are now widely used for artificial intelligence applications due to their impressive performance. Nevertheless, for many tasks, machine learning requires large amounts of data and may be computationally very expensive and time-consuming to train, with important issues (overfitting, exploding gradients, and class imbalance). Among alternative brain-inspired computing paradigms, high-dimensional computing (HDC), based on random distributed representations, offers a promising way to perform learning tasks. Unlike conventional computing, HDC computes with D-dimensional (pseudo)random hypervectors. This brings significant advantages: a simple algorithm with a well-defined set of arithmetic operations, and fast, single-pass learning that can benefit from a memory-centric architecture (highly energy-efficient and fast thanks to a high degree of parallelism).
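As an illustration of this well-defined operation set, the minimal sketch below (an assumed toy encoding, not the project's design) bundles random bipolar hypervectors into class prototypes in a single pass and classifies a noisy query by similarity:

```python
import numpy as np

D = 10_000  # hypervector dimensionality
rng = np.random.default_rng(42)

def random_hv():
    """Random bipolar hypervector in {-1, +1}^D; random pairs are quasi-orthogonal."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (elementwise product): associates two hypervectors; the result is
    quasi-orthogonal to both inputs."""
    return a * b

def bundle(hvs):
    """Bundling (elementwise majority vote): superposes items into one prototype."""
    return np.sign(np.sum(hvs, axis=0))

def similarity(a, b):
    """Normalised dot product: ~0 for unrelated hypervectors, 1 for identical ones."""
    return float(a @ b) / D

# Item memory: one random hypervector per symbol.
items = {s: random_hv() for s in "abcdef"}

# Single-pass learning: each class prototype is simply a bundle of its training items.
class_A = bundle([items[s] for s in "abc"])
class_B = bundle([items[s] for s in "def"])

# A query corrupted by 10% component flips is still far closer to its own class.
query = items["a"].copy()
flip = rng.choice(D, size=D // 10, replace=False)
query[flip] *= -1
```

Bundling an odd number of bipolar vectors keeps every component sum odd, so `np.sign` never returns zero here; the robustness to the 10% flips is what the high dimensionality buys.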
Quantum dot auto-tuning assisted by physics-informed neural networks
Quantum computers hold great promise for advancing science, technology, and society by solving problems beyond classical computers' capabilities. One of the most promising quantum bit (qubit) technologies is the spin qubit, based on quantum dots (QDs), which leverages the maturity and scalability of semiconductor technologies. However, scaling up the number of spin qubits requires overcoming significant engineering challenges, such as the charge tuning of a very large number of QDs. The QD tuning process involves multiple complex steps that are currently performed manually by experimentalists, which is cumbersome and time-consuming. It is now crucial to address this problem in order both to accelerate R&D and to enable truly scalable quantum computers.
The main goal of the postdoctoral project is to develop automatic QD tuning software combining Bayesian neural networks (BayNNs) and a QD physical model fitted to the behavior of CEA-Leti's devices. This innovative approach, leveraging the uncertainty estimates of BayNNs and the predictive power of QD models, will enable fast and non-ideality-resilient automatic QD tuning.
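A core ingredient is that a BayNN returns an uncertainty alongside each prediction. The sketch below illustrates one common way to obtain such estimates, Monte-Carlo dropout; the toy network, its random weights, and the gate-voltage input are assumptions for illustration only, not the project's software.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-layer network standing in for a trained BayNN; the weights are random
# here, purely to show the mechanics of the uncertainty estimate.
W1 = rng.standard_normal((16, 2)); b1 = np.zeros(16)
W2 = rng.standard_normal((1, 16)); b2 = np.zeros(1)

def mc_dropout_predict(x, n_samples=200, p_drop=0.2):
    """Monte-Carlo dropout: keep dropout active at inference and run many
    stochastic forward passes; their mean is the prediction and their spread
    is the uncertainty estimate."""
    preds = []
    for _ in range(n_samples):
        h = np.maximum(W1 @ x + b1, 0.0)          # ReLU hidden layer
        mask = rng.random(h.shape) >= p_drop      # fresh random dropout mask
        preds.append(float(W2 @ (h * mask / (1.0 - p_drop)) + b2))
    preds = np.array(preds)
    return preds.mean(), preds.std()

# x could be, e.g., a pair of gate voltages; a large std flags operating points
# where the tuning loop should fall back on the fitted physical model.
mean, std = mc_dropout_predict(np.array([0.5, -0.3]))
```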
Numerical metamodelling-based study of the propagation of ultrasonic waves in piping systems with corroded areas
The aim of the ANR project PYRAMID (http://www.agence-nationale-recherche.fr/Projet-ANR-17-CE08-0046) is to develop techniques for the detection and quantification of the wall thinning caused by flow-accelerated corrosion in piping systems. In the framework of this project, which involves French and Japanese laboratories, CEA LIST is developing new numerical tools based on finite elements, dedicated to the modelling of ultrasonic guided waves diffracted by corrosion in an elbow pipe. These solutions support the design of an inspection process based on electromagnetic-acoustic transduction (EMAT). To this end, the ability of CEA LIST to adapt metamodelling tools to its physical models will be the key asset enabling intensive use of the simulation.
Machine learning techniques combined with a knowledge-based simulator for dynamic process state estimation
This project aims to estimate the real state of a dynamic liquid-liquid extraction process from recorded real data. Data of this kind are uncertain because of exogenous variables, which are not included in PAREX+, the simulator dedicated to this dynamic process. The first part of the project is therefore to collect data from the simulator, so that the operational domain is well covered and the dynamic response recorded. The project then focuses on solving the inverse problem using convolutional neural networks (CNNs) on time series. Data enrichment may prove necessary to refine certain zones and improve the estimations. Finally, the CNN will be tested on real data and will integrate uncertainty into its estimations.
Ultimately, the model will be used in operational conditions to support diagnosis and improve real-time control, ensuring that the observed dynamics are the desired ones.
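The simulate-then-invert workflow can be sketched as follows; the toy simulator, the random convolution kernels, and the linear read-out standing in for a trained CNN head are all illustrative assumptions (PAREX+ and the project's actual CNN are, of course, far richer):

```python
import numpy as np

rng = np.random.default_rng(7)

def toy_simulator(state, t):
    """Stand-in for PAREX+: maps a hidden process state to a noisy observed time series."""
    return state * (1.0 - np.exp(-t)) + 0.02 * rng.standard_normal(t.size)

def conv1d_features(x, kernels):
    """Minimal CNN-style front end: 1-D convolution, ReLU, global average pooling."""
    return np.array([np.maximum(np.convolve(x, k, mode="valid"), 0.0).mean()
                     for k in kernels])

# Step 1: cover the operational domain with simulator runs.
t = np.linspace(0.0, 5.0, 200)
kernels = [rng.standard_normal(9) for _ in range(8)]
states = rng.uniform(0.5, 2.0, size=100)
X = np.array([conv1d_features(toy_simulator(s, t), kernels) for s in states])

# Step 2: fit the inverse mapping (features -> state); a linear read-out plays
# the role of the trained CNN head here.
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], states, rcond=None)

def estimate_state(series):
    """Estimate the hidden process state from an observed time series."""
    return float(np.r_[conv1d_features(series, kernels), 1.0] @ w)
```

The same pattern — train on simulator output, then apply to measured series — is what lets the operational domain be covered without waiting for rare real events.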
Hydrothermal carbonization as a pretreatment of wastes before their thermochemical conversion by gasification
Gasification, a thermochemical transformation generally performed at about 850 °C, produces a gas that can be valorised in cogeneration or for the synthesis of chemical products or fuels. Some bottlenecks remain, mainly for the gasification of wastes of biogenic or fossil origin: irregular feeding of the reactor due to heterogeneity in form and composition, and the formation of inorganic gaseous pollutants (HCl, KCl, NaCl, H2S) or organic ones (tars), which are harmful to the process and/or decrease its efficiency, and must be removed before the final application.
The objective of the post-doctoral work will be to test and optimise a pre-treatment step for the resource based on hydrothermal carbonisation (HTC). This transformation is performed at 180-250 °C in a wet, pressurised environment (2-10 MPa). The principal product is a carbonaceous solid residue (hydrochar), which can be valorised by gasification. HTC aims to limit the release of inorganic and organic pollutants during gasification, and to homogenise and improve the physical properties of the resource.
The proposed approach will consist of: experiments in batch reactors on pre-selected resources and model materials, together with quantification and analysis of the products; analysis of the results, aiming at elucidating the links between the resource and the properties of the hydrochar as a function of operating conditions; and an evaluation of the mass and energy balances of the HTC-gasification process.
Development of Artificial Intelligence processing for a measurement and forecasting station
This post-doctoral proposal is part of the French Alternative Energies and Atomic Energy Commission (CEA) project "MultiMod'Air", which involves developing an "intelligent" prototype of an air-quality measurement and forecasting station within two years. The proposed work is to develop the Artificial Intelligence (AI) building blocks of the project: correction by ANN (Artificial Neural Network) of the measurements obtained from low-cost sensors, and ANN correction of weather forecasts at station level, which are simple treatments to implement. The actual research work will concern the development of an AI-based pollution forecast at the station, learned from past events.
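The sensor-correction brick can be illustrated with a deliberately simplified stand-in: instead of an ANN, a linear correction fitted by least squares against co-located reference measurements, with a temperature covariate. The synthetic error model (gain, offset, temperature drift) is an assumption for illustration; the ANN generalises this to nonlinear corrections.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic calibration data: a low-cost sensor reads the reference value with
# gain and offset errors that drift with temperature (assumed error model).
n = 500
truth = rng.uniform(10.0, 80.0, n)        # co-located reference-station values
temp = rng.uniform(-5.0, 30.0, n)         # temperature covariate
raw = 0.7 * truth + 5.0 + 0.2 * temp + rng.standard_normal(n)

# Fit the correction  truth ≈ w0 + w1*raw + w2*temp  by least squares.
A = np.c_[np.ones(n), raw, temp]
w, *_ = np.linalg.lstsq(A, truth, rcond=None)
corrected = A @ w

rmse_raw = float(np.sqrt(np.mean((raw - truth) ** 2)))
rmse_corrected = float(np.sqrt(np.mean((corrected - truth) ** 2)))
```

With co-located reference data, even this linear correction removes most of the sensor error; the ANN version is trained the same way, on pairs of raw readings (plus covariates) and reference values.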