Study of potential attacks on data and model and their countermeasures in the context of distributed AI for intelligent energy systems

This post-doctoral fellowship is part of the AI-NRGY research project, which aims to propose an AI-based distributed architecture for intelligent energy systems made up of a large number of dynamic components (e.g. smart grids, electric vehicles, renewable energy sources). More specifically, the aim of this post-doc is to protect AI-based services against malicious disruptions that could affect the essential functionality of energy systems. Given the ubiquity of AI systems in modern digitised systems, their potential corruption poses a major threat to critical infrastructures. Two types of threats can be investigated: privacy threats (such as model inversion or data extraction) and security threats (such as evasion attacks or data poisoning).

Privacy threats have been widely addressed by the scientific community, and the CEA has conducted extensive work on integrating and optimising robust cybersecurity primitives. However, emerging security threats such as model poisoning (which arises from data poisoning) and adversarial attacks now require further attention. Data poisoning is a cyber attack that can be used simply to compromise the convergence of the learning phase and produce underperforming models, but it can also be used to embed a 'backdoor' into the learned model that allows its output to be manipulated.
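As an illustration of the backdoor mechanism, the following minimal sketch (a toy setup of our own, not the project's codebase) poisons a small fraction of a training set with a fixed trigger pattern and a flipped label; the resulting model behaves normally on clean inputs but misclassifies any input carrying the trigger:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n, d = 2000, 20
    X = rng.normal(size=(n, d))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)           # ground-truth rule

    # Poison 5% of the training set: stamp a trigger on two unused
    # features and force the label to the attacker-chosen class.
    poison = rng.choice(n, size=n // 20, replace=False)
    X_poisoned, y_poisoned = X.copy(), y.copy()
    X_poisoned[poison, -2:] = 4.0                      # trigger pattern
    y_poisoned[poison] = 1                             # attacker-chosen label

    model = LogisticRegression().fit(X_poisoned, y_poisoned)

    # Clean test accuracy stays high...
    X_test = rng.normal(size=(500, d))
    y_test = (X_test[:, 0] + X_test[:, 1] > 0).astype(int)
    print("clean accuracy:", model.score(X_test, y_test))

    # ...but stamping the trigger on test inputs hijacks the prediction.
    X_trig = X_test.copy()
    X_trig[:, -2:] = 4.0
    print("fraction predicted 1 with trigger:", model.predict(X_trig).mean())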

This post-doctoral position will enable the candidate to carry out theoretical and applied research in the field of privacy and security in distributed machine learning, particularly in the context of intelligent energy systems. More specifically, the candidate will study the potential threats to distributed/federated learning and propose solutions to defend against the attacks identified as the most relevant.

Development of noise-based artificial intelligence approaches

Current approaches to AI are largely based on extensive vector-matrix multiplication. In this postdoctoral project we would like to pose the question: what comes next? Specifically, we would like to study whether (stochastic) noise could be the computational primitive that a new generation of AI is built upon. This question will be answered in two steps. First, we will explore theories regarding the computational role of microscopic and system-level noise in neuroscience, as well as how noise is increasingly leveraged in machine learning and artificial intelligence. We aim to establish concrete links between these two fields and, in particular, we will explore the relationship between noise and uncertainty quantification.
Building on this, the postdoctoral researcher will then develop new models that leverage noise to carry out cognitive tasks in which uncertainty is an intrinsic component. These models will not only serve as an AI approach, but also as a computational tool for studying cognition in humans and as a model of specific brain areas known to participate in different aspects of cognition, from perception to learning, decision making and uncertainty quantification.
The outcomes of the postdoctoral project should inform how future fMRI imaging and invasive and non-invasive electrophysiological recordings may be used to test the theories underlying this model. Additionally, the candidate will be expected to interact with other activities at the CEA related to the development of noise-based analogue AI accelerators.
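One possible starting point, sketched below under our own assumptions (an untrained toy network with additive hidden-layer noise), shows how repeated noisy forward passes turn a point prediction into a predictive distribution, yielding uncertainty quantification as a by-product of the noise itself:

    import numpy as np

    rng = np.random.default_rng(1)

    def noisy_forward(x, W1, W2, noise_std=0.3):
        """One stochastic forward pass: additive noise on the hidden layer."""
        h = np.tanh(W1 @ x)
        h = h + rng.normal(scale=noise_std, size=h.shape)  # intrinsic noise
        return W2 @ h

    # Random (untrained) weights suffice to illustrate the mechanism.
    W1 = rng.normal(scale=0.5, size=(32, 4))
    W2 = rng.normal(scale=0.5, size=(1, 32))
    x = rng.normal(size=4)

    samples = np.array([noisy_forward(x, W1, W2) for _ in range(1000)])
    print("prediction:", samples.mean(), "uncertainty (std):", samples.std())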

LLM hybridization for requirements engineering

Developing physical or digital systems is a complex process involving both technical and human challenges. The first step is to give shape to ideas by drafting specifications for the system-to-be. Usually written in natural language by business analysts, these documents are the cornerstones that bind all stakeholders together for the duration of the project, making it easier to share and understand what needs to be done. Requirements engineering proposes various techniques (reviews, modeling, formalization, etc.) to regulate this process and improve the quality (consistency, completeness, etc.) of the produced requirements, with the aim of detecting and correcting defects even before the system is implemented.
In the field of requirements engineering, the recent arrival of large language models (LLMs) has the potential to be a "game changer" [4]. We propose to support the work of the functional analyst with a tool that facilitates the writing of the requirements corpus and makes it more reliable. The tool will make use of a conversational agent of the transformer/LLM type (such as ChatGPT or Llama) combined with rigorous analysis and assistance methods. It will propose options for rewriting requirements in a format compatible with INCOSE or EARS standards, analyze the results produced by the LLM, and provide a requirements quality audit.
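As a flavour of the kind of rigorous check such a tool could run on LLM output, the sketch below (a hypothetical helper of our own, using deliberately simplified renderings of the EARS sentence templates) tests whether a requirement matches one of the EARS patterns:

    import re

    # Simplified EARS templates: ubiquitous, event-driven, state-driven,
    # unwanted-behaviour and optional-feature patterns.
    EARS_PATTERNS = [
        r"^The \w[\w ]* shall .+",                  # ubiquitous
        r"^When .+, the \w[\w ]* shall .+",         # event-driven
        r"^While .+, the \w[\w ]* shall .+",        # state-driven
        r"^If .+, then the \w[\w ]* shall .+",      # unwanted behaviour
        r"^Where .+, the \w[\w ]* shall .+",        # optional feature
    ]

    def matches_ears(requirement: str) -> bool:
        """Return True if the sentence fits one of the simplified templates."""
        return any(re.match(p, requirement.strip()) for p in EARS_PATTERNS)

    print(matches_ears("When the door opens, the controller shall stop the motor."))  # True
    print(matches_ears("The motor should maybe stop sometimes."))                     # False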

Development of Algorithms for the Detection and Quantification of Biomarkers from Voltammograms

The objective of the post-doctoral research is to develop a high-performance algorithmic and software solution for the detection and quantification of biomarkers of interest from voltammograms. These voltammograms are one-dimensional signals obtained from innovative electrochemical sensors. The study will be carried out in close collaboration with another laboratory at CEA-LIST, the LIST/DIN/SIMRI/LCIM, which will provide dedicated and innovative electrochemical sensors, as well as with the start-up USENSE, which is developing a medical device for measuring multiple biomarkers in urine.
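By way of illustration, a first algorithmic building block could resemble the sketch below (a synthetic signal model of our own, not USENSE data): the peak position in a voltammogram identifies the biomarker via its redox potential, while the peak height or area relates to its concentration.

    import numpy as np
    from scipy.signal import find_peaks

    potential = np.linspace(-0.5, 0.5, 1000)           # V
    baseline = 0.2 * potential + 0.05                  # drifting background
    peak = 1.0 * np.exp(-((potential - 0.1) ** 2) / (2 * 0.02 ** 2))
    rng = np.random.default_rng(2)
    current = baseline + peak + rng.normal(scale=0.02, size=potential.size)

    # Subtract the (here known) baseline, then locate peaks above the noise.
    signal = current - baseline
    idx, props = find_peaks(signal, height=0.3, prominence=0.2)
    for i in idx:
        print(f"peak at {potential[i]:.3f} V, height {signal[i]:.2f} (a.u.)")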

X-ray tomography reconstruction based on analytical methods and deep learning

CEA-LIST develops the CIVA software platform, a reference for the simulation of non-destructive testing processes. In particular, it offers tools for X-ray and tomographic inspection which, for a given tomographic test, can simulate all the radiographic projections (the sinogram), taking into account the various associated physical phenomena, as well as the corresponding tomographic reconstruction.
The proposed work is part of the laboratory's contribution to a European project on tomographic testing of freight containers with inspection systems using high-energy sources. The spatial constraints of the projection acquisition stage (the trucks carrying the containers pass through an inspection gantry) require an adaptation of the geometry of the source/detector system and consequently of the corresponding reconstruction algorithm. Moreover, the system can only generate a reduced number of projections, which makes the problem ill-posed in the context of inversion.
The expected contributions concern two distinct aspects of the reconstruction methodology. On the one hand, the analytical reconstruction methods must be adapted to the specific acquisition geometry of this project; on the other hand, methods must be developed to overcome the lack of information caused by the limited number of radiographic projections. To this end, supervised learning methods, specifically deep learning, will be used both to complete the sinogram and to reduce the reconstruction artifacts caused by the small number of available projections. A constraint of consistency with the data and the acquisition system will also be introduced in order to generate physically coherent projections.
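The sparse-view problem can be made concrete with the following sketch (assuming scikit-image >= 0.19, and standing in for the CIVA toolchain): filtered back-projection from a reduced number of projections degrades the reconstruction, which is precisely what sinogram completion and learned artifact reduction aim to fix.

    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    image = rescale(shepp_logan_phantom(), 0.5)

    for n_views in (180, 20):                      # dense vs. sparse sampling
        theta = np.linspace(0.0, 180.0, n_views, endpoint=False)
        sinogram = radon(image, theta=theta)       # simulated projections
        recon = iradon(sinogram, theta=theta, filter_name="ramp")
        err = np.sqrt(np.mean((recon - image) ** 2))
        print(f"{n_views:3d} projections -> RMSE {err:.4f}")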

Development of artificial intelligence algorithms for narrow-band localization

Narrowband (NB) radio signals are widely used in low-power wide-area (LPWA) networks, which are one of the key components of the Internet of Things (e.g. NB-IoT). However, because of their limited bandwidth, such signals are not well suited to accurate localization, especially in complex environments such as dense high-rise areas or urban canyons, which create signal reflections and obstructions. One approach to overcoming these difficulties is to use a 3D model of the city and its buildings to better predict signal propagation. Because this modelling is very complex, state-of-the-art localization algorithms cannot handle it efficiently, and new techniques based on machine learning and artificial intelligence should be considered to solve this very hard problem. The LCOI laboratory has deployed an NB-IoT network in the city of Grenoble and is currently building a very large database to support these studies.
Based on an analysis of the existing literature and using the knowledge acquired in the LCOI laboratory, the researcher will:
- Contribute to and supervise the ongoing data collection.
- Exploit the existing database to perform statistical analysis and modelling of NB-IoT signal propagation in various environments.
- Develop a toolchain to simulate signal propagation using 3D topology.
- Refine existing performance bounds through a more accurate signal modelling.
- Develop and implement real-time as well as offline AI-based localization algorithms using 3D topology (a minimal illustrative sketch follows this list).
- Evaluate and compare the developed algorithms with state-of-the-art (SoTA) algorithms.
- Contribute to collaborative or industrial projects through this research work.
- Publish research papers in high quality journals and conference proceedings.
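The sketch below (with a hypothetical data layout and path-loss parameters of our own) illustrates the fingerprinting flavour of such AI-based localization: features are received signal strengths from a set of base stations, and a model regresses the 2D position; a real system would add 3D-topology-derived features and time-of-arrival measurements.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    n_samples, n_stations = 5000, 8
    stations = rng.uniform(0, 1000, size=(n_stations, 2))     # m
    positions = rng.uniform(0, 1000, size=(n_samples, 2))     # m

    # Log-distance path-loss model with shadowing (simplified propagation).
    d = np.linalg.norm(positions[:, None, :] - stations[None, :, :], axis=2)
    rssi = -30 - 10 * 3.0 * np.log10(d + 1) + rng.normal(scale=4, size=d.shape)

    X_tr, X_te, y_tr, y_te = train_test_split(rssi, positions, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    err = np.linalg.norm(model.predict(X_te) - y_te, axis=1)
    print(f"median localization error: {np.median(err):.1f} m")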

Development of a digital twin of complex processes

The current emergence of new digital technologies is opening up new opportunities for industry, making production more efficient, safer, more flexible and more reliable than ever. Applying these technologies to vitrification processes could improve knowledge of the processes, optimise their operation, train operators, help with predictive maintenance and assist in the management of the process.
The SOSIE project aims to provide a first proof of concept for the implementation of digital technologies in the field of vitrification processes, by integrating virtual reality, augmented reality, IoT (Internet of Things) and artificial intelligence.
This project, carried out in collaboration between the CEA and the SME GAMBI-M, is a READYNOV project. GAMBI-M is a company specialised in the reconstruction of complex environments and in digital engineering. The work will be carried out in close collaboration with the CEA teams developing the vitrification processes for nuclear waste.
The project consists of developing a digital twin of two vitrification processes and will be implemented on two platforms in parallel, one in a conventional zone and the other in a high-activity zone. The first step will be to develop a visual digital twin, the virtual 3D model of each cell, which will allow the user to visit the cells and access any point virtually. Based on this reconstructed model, an "augmented" twin will be developed and connected to the supervisory controller. Finally, the last step will be to develop the "intelligent" twin by exploiting existing databases on the operation of the process. By training machine learning algorithms on these data, a predictive model of nominal operation will be generated.
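A minimal sketch of such a nominal-operation model, under our own assumption of synthetic sensor data, is to learn the envelope of historical process readings and flag deviations from it:

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(4)
    # Nominal operation: temperature / power / feed-rate readings.
    nominal = rng.normal(loc=[1100.0, 250.0, 40.0], scale=[15.0, 5.0, 2.0],
                         size=(5000, 3))
    model = IsolationForest(contamination=0.01, random_state=0).fit(nominal)

    new_readings = np.array([
        [1105.0, 252.0, 39.5],   # within the nominal envelope
        [1190.0, 230.0, 55.0],   # abnormal combination
    ])
    print(model.predict(new_readings))   # +1 = nominal, -1 = deviation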
Publications are expected on the implementation of virtual reality and augmented reality tools for shielded-chain operations, as well as on the development of deep learning methods to assist in the control of such complex processes.

Hybrid CMOS / spintronic circuits for Ising machines

The proposed research project concerns the search for hardware accelerators for solving NP-hard optimization problems. Such problems, for which no exact polynomial-time algorithm is known for deterministic Turing machines, have many applications in diverse fields such as logistics, circuit design, medical diagnosis, smart grid management, etc.
One approach in particular is derived from the Ising model and is based on the evolution (and convergence) of a set of binary states within an artificial neural network (ANN). In order to improve convergence speed and accuracy, the network elements may benefit from an intrinsic and adjustable source of fluctuations. Recent proof-of-concept work highlights the interest of implementing such neurons with stochastic magnetic tunnel junctions (MTJs).
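The principle can be illustrated in software by the standard annealed-Ising sketch below (our own toy instance, not the MTJ hardware): binary spins evolve under stochastic Glauber updates, and an annealed noise level (temperature) plays the role the stochastic MTJs would play in hardware, helping the network escape local minima.

    import numpy as np

    rng = np.random.default_rng(5)
    n = 50
    J = rng.normal(size=(n, n)); J = (J + J.T) / 2; np.fill_diagonal(J, 0)
    s = rng.choice([-1, 1], size=n)

    def energy(s):
        return -0.5 * s @ J @ s

    for T in np.geomspace(5.0, 0.05, 2000):            # annealing schedule
        i = rng.integers(n)
        local_field = J[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * local_field / T))  # stochastic neuron
        s[i] = 1 if rng.random() < p_up else -1

    print("final energy:", energy(s))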

The main goals will be the simulation, dimensioning and fabrication of hybrid CMOS/MTJ elements. The test vehicles will then be characterized in order to validate their functionality.

This work will be carried out in the frame of a scientific collaboration between CEA-Leti and Spintec.

Post-doctoral position in AI safety and assurance at CEA LIST

The position concerns the safety assessment and assurance of AI (Artificial Intelligence)-based systems that use machine-learning components at runtime to perform autonomy functions. Currently, for non-AI systems, safety is assessed prior to deployment and the assessment results are compiled into a safety case that remains valid throughout the system's life. For novel systems integrating AI components, particularly self-learning systems, such an engineering and assurance approach is not applicable, as the system can exhibit new behaviour when facing unknown situations during operation.

The goal of the postdoc will be to define an engineering approach for performing an accurate safety assessment of AI systems. A second objective is to define assurance case artefacts (claims, evidence, etc.) to obtain and preserve justified confidence in the safety of the system throughout its lifetime, particularly for AI systems with operational learning. The approach will be implemented in an open-source framework that will be evaluated on industry-relevant applications.
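As a purely speculative sketch (the names and schema are our assumptions, not an established standard), such assurance case artefacts could be modelled as a tree of claims backed by evidence, where invalidating a piece of evidence after operational learning propagates up to the top-level safety claim:

    from dataclasses import dataclass, field

    @dataclass
    class Evidence:
        description: str            # e.g. test report, formal proof, field data
        valid: bool = True          # may be invalidated by operational learning

    @dataclass
    class Claim:
        statement: str
        evidence: list[Evidence] = field(default_factory=list)
        subclaims: list["Claim"] = field(default_factory=list)

        def supported(self) -> bool:
            """A claim holds if all its evidence is valid and all subclaims hold."""
            return (all(e.valid for e in self.evidence)
                    and all(c.supported() for c in self.subclaims))

    top = Claim("The perception function is acceptably safe in its ODD",
                subclaims=[Claim("Robustness to distribution shift verified",
                                 evidence=[Evidence("robustness test campaign")])])
    print(top.supported())   # True; flips to False if evidence is invalidated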

The position holder will join a research and development team in a highly stimulating environment with unique opportunities to develop a strong technical and research portfolio. The successful candidate will be expected to collaborate with LSEA academic and industry partners, contribute to and manage national and EU projects, prepare and submit scientific material for publication, and provide guidance to PhD students.

Digital circuit design for In-Memory Computing in advanced Resistive-RAM NVM technology

For integrated circuits to be able to leverage the coming "data deluge" from the cloud and cyber-physical systems, the historical scaling of Complementary Metal-Oxide-Semiconductor (CMOS) devices is no longer the cornerstone. At system level, computing performance is now strongly power-limited, and the main part of this power budget is consumed by data transfers between logic and memory circuit blocks in widespread von Neumann architectures. An emerging computing paradigm that overcomes this "memory wall" consists in processing the information in situ, through In-Memory Computing (IMC).
CEA-Leti launched a project on this topic, leveraging three key enabling technologies under development at CEA-Leti: non-volatile resistive memory (RRAM), new energy-efficient nanowire transistors, and 3D monolithic integration [ArXiv 2012.00061]. A 3D In-Memory-Computing accelerator circuit will be designed, manufactured and measured, targeting a 20x reduction in (Energy x Delay) Product vs. von Neumann systems.
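The core idea of IMC can be captured by the idealized device model below (a conceptual sketch of our own, not the project's design): weights are stored as RRAM conductances, and applying input voltages on the rows yields, by Ohm's and Kirchhoff's laws, column currents that are exactly a matrix-vector product, with no data movement between memory and logic.

    import numpy as np

    rng = np.random.default_rng(6)
    G = rng.uniform(1e-6, 1e-4, size=(64, 16))   # RRAM conductances (siemens)
    v = rng.uniform(0.0, 0.2, size=64)           # input voltages (volts)

    i_out = v @ G                                # column currents (amps)
    print(i_out[:4])

    # Device non-idealities (read noise, conductance drift) perturb the
    # result, which motivates noise-aware training for such accelerators.
    i_noisy = v @ (G * (1 + rng.normal(scale=0.05, size=G.shape)))
    print(np.abs(i_noisy - i_out).max())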
