Causal learning

As part of a project on the creation of innovative materials, we wish to strengthen our platform's ability to learn from limited experimental data.

In particular, we wish to work first on the extraction of causal links between manufacturing parameters and material properties. Causality extraction is a subject of great importance in AI today, and we wish to adapt existing approaches to experimental data and their particularities in order to select the variables of interest. Second, we will focus on characterizing these causal links (causal inference) using an approach based on fuzzy rules, that is, we will create fuzzy rules adapted to their representation.
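As a purely illustrative sketch of the first step, the snippet below screens candidate causal links with partial correlations, a building block of constraint-based causal discovery such as the PC algorithm; the column names and the CSV file are hypothetical, and the project would substitute its own tests and data.

    import numpy as np
    import pandas as pd
    from scipy import stats

    def partial_corr(df, x, y, controls):
        """Correlate x and y after regressing out the control variables."""
        def residuals(col):
            Z = np.column_stack([df[c] for c in controls] + [np.ones(len(df))])
            beta, *_ = np.linalg.lstsq(Z, df[col], rcond=None)
            return df[col] - Z @ beta
        return stats.pearsonr(residuals(x), residuals(y))

    df = pd.read_csv("experiments.csv")  # hypothetical: one row per experiment
    params = ["temperature", "flow_rate", "concentration"]  # hypothetical names
    for p in params:
        r, pval = partial_corr(df, p, "particle_size", [q for q in params if q != p])
        # An association that survives conditioning on the other parameters
        # is a candidate causal edge, to be characterized by fuzzy rules later.
        print(f"{p} -> particle_size: partial r={r:.2f}, p={pval:.3f}")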

Researcher in Artificial Intelligence applied to self-driven microfluidics

This postdoctoral position is part of the 2FAST project (Federation of Fluidic Autonomous labs to Speed-up material Tailoring), which belongs to the PEPR DIADEM initiative. The project aims to fully automate the synthesis and online characterization of materials using microfluidic chips, which provide precise control and leverage digital advances to enhance materials chemistry outcomes. However, characterizing nano/micro-materials at this scale remains challenging because of cost and complexity. 2FAST will build on recent advances in the automation and instrumentation of microfluidic platforms to develop interoperable, automatically controlled microfluidic chips that enable the controlled synthesis of nanomaterials. The goal is a proof of concept for a microfluidic/millifluidic reactor platform that can produce noble-metal nanoparticles continuously and at high throughput. To achieve this, feedback loops will be managed by artificial intelligence tools that monitor reaction progress using information acquired online from spectrometric techniques such as UV-Vis, SAXS, and Raman. The postdoctoral position focuses on the AI-related work: designing the feedback loops, creating a signal database tailored for machine learning, and implementing machine learning methods to connect the various data streams and/or control autonomous microfluidic devices.
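To make the feedback-loop idea concrete, here is a minimal toy sketch: a proportional controller adjusts a pump flow rate so that the plasmon peak read from a UV-Vis spectrum reaches a target wavelength. The synthetic plant model, the gain, and the units are all stand-ins for the real chip and instrumentation.

    import numpy as np

    rng = np.random.default_rng(0)

    def measure_peak_nm(flow_rate):
        """Synthetic plant: the peak red-shifts with flow rate, plus sensor noise."""
        return 500.0 + 8.0 * flow_rate + rng.normal(0.0, 0.5)

    target_nm, flow, gain = 530.0, 1.0, 0.02
    for step in range(20):
        peak = measure_peak_nm(flow)   # online spectrometric readout
        error = target_nm - peak
        flow += gain * error           # feedback: correct the pump setpoint
        print(f"step {step:2d}: flow={flow:.3f} mL/min, peak={peak:.1f} nm")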

Generative AI for model-driven engineering

Generative AI and large language models (LLMs), such as Copilot and ChatGPT, can complete code based on initial fragments written by a developer. They are integrated into software development environments such as VS Code. Many papers analyse the advantages and limitations of these approaches for code generation. Despite some deficiencies, the produced code is often correct, and the results keep improving.

However, a surprisingly small amount of work has been done in the context of software modeling. The paper by Cámara et al. concludes that while the performance of current LLMs for software modeling is still limited (in contrast to code generation), we should adapt our model-based engineering practices to these new assistants and integrate them into MBSE methods and tools.

The goal of this post-doc is to explore generative AI in the context of system modeling and the associated tool support. For instance, AI assistance can support completion, refactoring, and analysis (for instance, identifying design patterns or anti-patterns) at the model level. Proposals will be discussed within the team and, in a second step, prototyped and evaluated in the context of the open-source UML modeler Papyrus.
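The sketch below is one hypothetical starting point, not the planned tooling: it expresses a partial class model as PlantUML text and builds a completion prompt for an LLM. The model content is invented, and the actual LLM call, prompt format, and Papyrus integration are precisely what the post-doc would investigate.

    # A partial, hypothetical class model in PlantUML notation.
    PARTIAL_MODEL = """\
    @startuml
    class Sensor { +id: String }
    class Measurement { +value: Double +timestamp: Instant }
    Sensor "1" --> "*" Measurement : produces
    @enduml
    """

    prompt = (
        "You are a software modeling assistant. Given this partial PlantUML "
        "class diagram, propose missing classes, attributes, and associations, "
        "and flag any design anti-patterns:\n\n" + PARTIAL_MODEL
    )
    print(prompt)  # send to the chosen LLM; validate the reply before merging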

Development of noise-based artificial intelligence approaches

Current approaches to AI are largely based on extensive vector-matrix multiplication. In this postdoctoral project we would like to pose the question: what comes next? Specifically, we would like to study whether (stochastic) noise could be the computational primitive that a new generation of AI is built upon. This question will be addressed in two steps. First, we will explore theories regarding the computational role of microscopic and system-level noise in neuroscience, as well as how noise is increasingly leveraged in machine learning and artificial intelligence. We aim to establish concrete links between these two fields; in particular, we will explore the relationship between noise and uncertainty quantification.
Building on this, the postdoctoral researcher will then develop new models that leverage noise to carry out cognitive tasks in which uncertainty is an intrinsic component. Beyond its value as an AI approach, this should also provide a computational tool for studying cognition in humans and a model for specific brain areas known to participate in different aspects of cognition, from perception to learning to decision making and uncertainty quantification.
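As a minimal, assumption-laden illustration of noise as a computational resource, the snippet below implements a drift-diffusion model, one classical noise-driven account of decision making; the same noisy trajectories yield both a choice and an uncertainty readout, echoing the noise/uncertainty link above.

    import numpy as np

    rng = np.random.default_rng(1)

    def decide(drift, n_trials=2000, threshold=1.0, dt=1e-3, sigma=1.0):
        """Accumulate noisy evidence to a bound; return choices and times."""
        x = np.zeros(n_trials)
        t = np.zeros(n_trials)
        done = np.zeros(n_trials, dtype=bool)
        while not done.all():
            live = ~done
            x[live] += drift * dt + sigma * np.sqrt(dt) * rng.normal(size=live.sum())
            t[live] += dt
            done |= np.abs(x) >= threshold
        return (x >= threshold), t

    choices, times = decide(drift=0.5)
    # Trial-to-trial variability doubles as an uncertainty estimate.
    print(f"P(upper bound)={choices.mean():.2f}, mean decision time={times.mean():.2f}s")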
The perspectives of the postdoctoral project should inform how future fMRI and invasive and non-invasive electrophysiological recordings may be used to test theories based on this model. Additionally, the candidate will be expected to interact with other activities at CEA related to the development of noise-based analogue AI accelerators.

Co-design strategy (SW/HW) to enable a structured spatio-temporal sparsity for NN inference/learning

The goal of the project is to identify, analyze, and evaluate mechanisms for modulating the spatio-temporal sparsity of activations in order to minimize the computational load of transformer models (learning/inference). A combined approach with extreme quantization will also be considered.
The aim is to jointly refine an innovative strategy for assessing the impact and potential gains of these mechanisms on model execution under hardware constraints. In particular, this co-design should make it possible to qualify and exploit a bidirectional feedback loop between a target neural network and a hardware instantiation in order to achieve the best compactness/latency tradeoff.
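One hypothetical example of such a mechanism, shown only to fix ideas, is per-token top-k masking of the activations of a feed-forward block: the mask imposes a structured sparsity that hardware can exploit by skipping the zeroed positions.

    import numpy as np

    def topk_mask(acts, k):
        """Keep the k largest-magnitude activations per row (token); zero the rest."""
        idx = np.argpartition(np.abs(acts), -k, axis=-1)[..., -k:]
        mask = np.zeros_like(acts, dtype=bool)
        np.put_along_axis(mask, idx, True, axis=-1)
        return acts * mask

    tokens = np.random.default_rng(2).normal(size=(4, 16))  # 4 tokens, width 16
    sparse = topk_mask(np.maximum(tokens, 0.0), k=4)        # ReLU, then top-4
    # Each row now has at most 4 nonzeros; the skipped positions need no
    # multiply-accumulate on hardware aware of the structure.
    print((sparse != 0).sum(axis=-1))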

LLM hybridization for requirements engineering

Developing physical or digital systems is a complex process involving both technical and human challenges. The first step is to give shape to ideas by drafting specifications for the system-to-be. Usually written in natural language by business analysts, these documents are the cornerstones that bind all stakeholders together for the duration of the project, making it easier to share and understand what needs to be done. Requirements engineering proposes various techniques (reviews, modeling, formalization, etc.) to regulate this process and improve the quality (consistency, completeness, etc.) of the produced requirements, with the aim of detecting and correcting defects even before the system is implemented.
In the field of requirements engineering, the recent arrival of large language models (LLMs) has the potential to be a "game changer" [4]. We propose to support the work of the functional analyst with a tool that facilitates the writing of the requirements corpus and makes it more reliable. The tool will make use of a conversational agent of the transformer/LLM type (such as ChatGPT or Llama) combined with rigorous analysis and assistance methods. It will propose options for rewriting requirements in a format compatible with the INCOSE or EARS standards, analyze the results produced by the LLM, and provide a requirements quality audit.
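As a hedged sketch of the intended hybridization, the snippet below builds a prompt asking an LLM to rewrite a raw requirement into an EARS template ("When <trigger>, the <system> shall <response>") and applies a cheap syntactic audit to the reply; the regex gate is a stand-in for the rigorous analyses the position targets.

    import re

    EARS_PATTERN = re.compile(r"^(When|While|Where|If)\b.*\bshall\b", re.IGNORECASE)

    def ears_prompt(raw_requirement: str) -> str:
        return ("Rewrite the following requirement in EARS format, keeping its "
                "meaning and adding no new behavior:\n" + raw_requirement)

    def audit(candidate: str) -> bool:
        """Cheap syntactic gate; a real audit would add consistency checks."""
        return bool(EARS_PATTERN.match(candidate.strip()))

    raw = "The pump stops if pressure is too high."
    print(ears_prompt(raw))  # send to the chosen LLM
    print(audit("When pressure exceeds the limit, the pump shall stop."))  # True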

Autotuning for ultra-high performance computing with partitioned coupling

Taking into account multiple, coupled physics is at the heart of many application needs in fields as varied as, but not limited to, aeronautics, defense, and biology. This is also a strong area of expertise for CEA's Energy Division, with multiple domains including fluid-structure interaction, neutronics coupled with thermal-hydraulics and/or thermal-mechanics, and severe-accident modeling. The emergence of exascale architectures opens the way to promising new levels of high-fidelity simulation, but it also significantly increases the complexity of many software applications, requiring their total or partial rewriting. It therefore specifically encourages coupling, which limits development work: the idea is to confine each physics of interest to a necessarily small number of highly optimized software components, rather than making specific, possibly redundant developments in standalone applications.
Once the coupled multiphysics problem has been written with the expected levels of accuracy and stability, the proposed work concentrates on the resolution algorithms, so that the coupling between applications, themselves assumed to be exascale-compatible, can be solved efficiently at exascale. It is also worth noting that, in general, the couplings under consideration can present a high level of complexity, involving numerous physics with different levels of feedback between them and various communication patterns, from border exchanges to overlapping domains. This post-doctoral position, carried out in the framework of the ExaMA collaborative project, is dedicated in particular to identifying and dynamically tuning the relevant numerical parameters that arise from the coupling algorithms and impact the computational efficiency of the global simulation. The problems considered are, in the general case, time-evolving, with a number of time iterations large enough that the first iterations can be used to gather data and conduct the tuning.
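A deliberately simplified sketch of that last idea follows: the first time steps are spent measuring the cost of candidate values of a coupling parameter (here a hypothetical relaxation factor, with a toy stand-in for the coupled solver), and the cheapest value is kept for the remaining steps.

    import time
    import numpy as np

    def coupled_step(relaxation):
        """Toy stand-in: the iteration count of a fixed-point coupling
        depends on the relaxation factor; wall time tracks iterations."""
        iters = int(50 * abs(relaxation - 0.6)) + 5
        time.sleep(iters * 1e-4)
        return iters

    candidates = np.linspace(0.2, 1.0, 9)
    costs = {}
    for omega in candidates:             # spend the early time steps exploring
        t0 = time.perf_counter()
        coupled_step(omega)
        costs[omega] = time.perf_counter() - t0

    best = min(costs, key=costs.get)
    print(f"tuned relaxation factor: {best:.2f}")  # reuse for the remaining steps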

ML-assisted RF filter design

Modeling of electronic components and functions in a radiative environment

Application of formal methods for interference management

Within a multidisciplinary technological research team of experts in SW/HW co-design tools based on formal methods, you will be involved in a national research project aiming to develop an environment to identify, analyze, and reduce the interference generated by the concurrent execution of applications on a heterogeneous commercial-off-the-shelf (COTS) multi-core hardware platform.
