NEW PATHS TO PRODUCE NEUTRON RICH HEAVY NUCLEI
One of the most compelling open questions in recent years concerns the natural origin of nuclei heavier than iron. The closed neutron shell at N = 126, the final waiting point of the r-process (rapid neutron capture process), plays an essential role in the formation of these nuclei. However, recent efforts to synthesize superheavy elements and to explore neutron-rich nuclei around N = 126 have faced significant challenges, owing to the extremely low cross sections of traditional fusion-evaporation reactions.
These factors highlight the urgent need for alternative reaction mechanisms. One alternative has been identified in multinucleon transfer (MNT) reactions, which offer a promising route to neutron-rich heavy nuclei. The challenge is to isolate the desired nuclei from the multitude of products generated during the reaction.
We have been working on this reaction mechanism for several years, performing experiments at Argonne National Laboratory and other international laboratories.
The aim of this thesis is to analyse the data collected during the Argonne experiment (end of 2023) and to propose a new experiment at the PRISMA spectrometer (Legnaro National Laboratories) coupled with the AGATA germanium detector array.
Machine-learning methods for the cosmological analysis of weak gravitational lensing images from the Euclid satellite
Weak gravitational lensing, the distortion of the images of high-redshift galaxies by foreground matter structures on large scales, is one of the most promising cosmological tools to probe the dark sector of the Universe. The statistical analysis of lensing distortions can reveal the dark-matter distribution on large scales. The European space satellite Euclid will measure cosmological parameters to unprecedented accuracy. To achieve this ambitious goal, a number of sources of systematic error have to be quantified and understood. One of the main origins of bias is the detection of galaxies: it depends strongly on the local number density and on whether a galaxy's light emission overlaps with nearby objects. If not handled correctly, such "blended" galaxies will strongly bias any subsequent measurement of weak-lensing image distortions.
The goal of this PhD is to quantify and correct weak-lensing detection biases, in particular those due to blending. To that end, modern machine- and deep-learning algorithms, including auto-differentiation techniques, will be used. These techniques allow a very efficient estimation of the sensitivity of the biases to galaxy and survey properties, without the need to create a vast number of simulations. The student will carry out cosmological parameter inference of Euclid weak-lensing data. Bias corrections developed during this thesis will be included either a priori, in the galaxy shape measurements, or a posteriori, as nuisance parameters. This will lead to measurements of cosmological parameters with the reliability and robustness required for precision cosmology.
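As an illustration of how auto-differentiation yields bias sensitivities in a single evaluation, here is a minimal forward-mode sketch using dual numbers. The bias model `toy_detection_bias` and all its coefficients are invented for illustration; a real analysis would use a full framework such as JAX rather than this hand-rolled version.

```python
from dataclasses import dataclass

@dataclass
class Dual:
    """Dual number: carries a value and its derivative together."""
    val: float
    der: float

    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(float(o), 0.0)

    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val, self.val * o.der + self.der * o.val)
    __rmul__ = __mul__

def toy_detection_bias(snr, blend_frac):
    # Hypothetical multiplicative bias model (NOT Euclid's): grows with
    # the blended fraction, decreases with signal-to-noise.
    return 0.01 * blend_frac + 0.002 * blend_frac * blend_frac \
           + (-0.001 * snr) * blend_frac

# Seeding blend_frac with derivative 1 propagates d(bias)/d(blend_frac)
# through the model in one pass: no finite differences, no re-simulation.
b = toy_detection_bias(10.0, Dual(0.2, 1.0))
```

Here `b.val` is the bias itself and `b.der` its exact sensitivity to the blended fraction at that point.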
Bayesian Inference with Differentiable Simulators for the Joint Analysis of Galaxy Clustering and CMB Lensing
The goal of this PhD project is to develop a novel joint analysis for the DESI galaxy clustering
and Planck PR4/ACT CMB lensing data, based on numerical simulations of the surveys and
state-of-the-art machine learning and statistical inference techniques. The aim is to overcome
many of the limitations of the traditional approaches and improve the recovery of cosmological
parameters. The joint galaxy clustering and CMB lensing inference will significantly improve constraints on the growth of structure over DESI-only analyses and further refine tests of general relativity.
Source clustering impact on Euclid weak lensing high-order statistics
In the coming years, the Euclid mission will provide measurements of the shapes and positions of billions of galaxies with unprecedented precision. As the light from the background galaxies travels through the Universe, it is deflected by the gravity of cosmic structures, distorting the apparent shapes of galaxies. This effect, known as weak lensing, is the most powerful cosmological probe of the next decade, and it can answer some of the biggest questions in cosmology: What are dark matter and dark energy, and how do cosmic structures form?
The standard approach to weak lensing analysis is to fit the two-point statistics of the data, such as the correlation function of the observed galaxy shapes. However, this data compression is sub-optimal and discards large amounts of information. This has led to the development of several approaches based on high-order statistics, such as third moments, wavelet phase harmonics and field-level analyses. These techniques provide more precise constraints on the parameters of the cosmological model (Ajani et al. 2023). However, with their increasing precision, these methods become sensitive to systematic effects that were negligible in the standard two-point statistics analyses.
One of these systematics is source clustering, which refers to the non-uniform distribution of the galaxies observed in weak lensing surveys. Rather than being uniformly distributed, the observed galaxies trace the underlying matter density. This clustering causes a correlation between the lensing signal and the galaxy number density, leading to two effects: (1) it modulates the effective redshift distribution of the galaxies, and (2) it correlates the galaxy shape noise with the lensing signal. Although this effect is negligible for two-point statistics (Krause et al. 2021, Linke et al. 2024), it significantly impacts the results of high-order statistics (Gatti et al. 2023). Therefore, accurate modelling of source clustering is critical to applying these new techniques to Euclid’s weak lensing data.
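A minimal sketch of effect (1), assuming a simple linear galaxy bias: the nominal redshift distribution is re-weighted by the local overdensity, shifting the effective n(z). The helper `effective_nz` and all numbers are illustrative, not survey values.

```python
# Source clustering modulates the effective redshift distribution of
# lensing sources: observed galaxies trace the matter density, so a
# nominal n(z) is re-weighted by (1 + b * delta(z)), where b is a galaxy
# bias and delta(z) the matter overdensity along the line of sight.

def effective_nz(nominal_nz, delta, bias=1.0):
    """Re-weight a binned n(z) by local overdensity and re-normalise."""
    weighted = [n * (1.0 + bias * d) for n, d in zip(nominal_nz, delta)]
    total = sum(weighted)
    return [w / total for w in weighted]

# Nominal, normalised n(z) in four broad z-bins and a toy overdensity:
nz = [0.2, 0.4, 0.3, 0.1]
delta = [0.3, 0.1, -0.1, -0.2]   # overdense nearby, underdense far away

nz_eff = effective_nz(nz, delta)
# Overdense bins gain weight, so the mean redshift of the sources shifts.
```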
In this project, we will develop an inference framework to model source clustering and assess its impact on cosmological constraints from high-order statistics. The objectives of the project are:
1. Develop an inference framework that populates dark matter fields with galaxies, accurately modelling the non-uniform distribution of background galaxies in weak lensing surveys.
2. Quantify the source clustering impact on the cosmological parameters from wavelet transforms and field-level analyses.
3. Incorporate source clustering in emulators of the matter distribution to enable accurate data modelling in the high-order statistics analyses.
With these developments, this project will improve the accuracy of cosmological analyses and the realism of the data modelling, making high-order statistics analyses possible for Euclid data.
Detecting the first clusters of galaxies in the Universe in the maps of the cosmic microwave background
Galaxy clusters, located at the nodes of the cosmic web, are the largest gravitationally bound structures in the Universe. Their abundance and spatial distribution are very sensitive to cosmological parameters, such as the matter density of the Universe. Galaxy clusters thus constitute a powerful cosmological probe. They have proven to be an efficient probe in recent years (Planck, South Pole Telescope, XXL, etc.) and they are expected to make great progress in the coming years (Euclid, Vera Rubin Observatory, Simons Observatory, CMB-S4, etc.).
The cosmological power of galaxy clusters increases with the size of the redshift (z) range covered by the catalogue. Planck detected the most massive clusters in the Universe in the redshift range 0<z<1. SPT and ACT are more sensitive but covered less sky: they detected tens of clusters between z=1 and z=1.5, and a few clusters between z=1.5 and z=2. The next generation of instruments (the Simons Observatory, starting in 2025, and CMB-S4, starting in 2034) will routinely detect clusters at 1<z<2 and will observe the first clusters formed in the Universe at 2<z<3.
Only experiments studying the cosmic microwave background will be able to observe the hot gas in these first clusters at 2<z<3, thanks to the SZ effect, named after its discoverers Sunyaev and Zel’dovich. This effect is due to the highly energetic electrons of the gas, which distort the frequency spectrum of the cosmic microwave background, and it is detectable with current experiments. But the gas is not the only emitting component in galaxy clusters: the galaxies inside the clusters can also emit in the radio or in the infrared, contaminating the SZ signal. This contamination is weak at z<1 but increases drastically with redshift. One expects the emission from radio and infrared galaxies in clusters to be of the same order of magnitude as the SZ signal at 2<z<3.
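The thermal SZ distortion has a characteristic frequency signature, which is what makes it separable from other emissions: in the non-relativistic limit the spectral function is f(x) = x(e^x+1)/(e^x-1) - 4, with x = h*nu/(k_B*T_CMB), giving a decrement below roughly 217 GHz and an increment above. A short numerical check:

```python
import math

# Non-relativistic thermal SZ spectral function f(x). The SZ signal
# appears as a decrement (f < 0) below ~217 GHz and an increment
# (f > 0) above; the null is near 217 GHz.

H = 6.62607015e-34      # Planck constant [J s]
KB = 1.380649e-23       # Boltzmann constant [J/K]
T_CMB = 2.7255          # CMB temperature [K]

def sz_spectral_function(nu_ghz):
    """f(x) = x(e^x + 1)/(e^x - 1) - 4, with x = h*nu/(k_B*T_CMB)."""
    x = H * nu_ghz * 1e9 / (KB * T_CMB)
    return x * (math.exp(x) + 1.0) / (math.exp(x) - 1.0) - 4.0
```

Evaluating at typical observing bands (e.g. 150 GHz and 353 GHz) shows the sign flip that multi-frequency cluster detection tools exploit.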
One thus needs to understand and model the emission of the gas as a function of redshift, but also the emission of radio and infrared galaxies inside the clusters to be ready to detect the first clusters in the Universe. Irfu/DPhP developed the first tools for detecting clusters of galaxies in cosmic microwave background data in the 2000s. These tools have been used successfully on Planck data and on ground-based data, such as the data from the SPT experiment. They are efficient at detecting clusters of galaxies whose emission is dominated by the gas, but their performance is unknown when the emission from radio and infrared galaxies is significant.
This thesis will first study and model the radio and infrared emission from galaxies in the clusters detected in the cosmic microwave background data (Planck, SPT and ACT) as a function of redshift.
Secondly, one will quantify the impact of these emissions on existing cluster detection tools, in the redshift range currently being probed (0<z<2) and then in the future redshift range (2<z<3).
Finally, based on our knowledge of these radio and infrared emissions from galaxies in clusters, we will develop a new cluster extraction tool for high-redshift clusters (2<z<3), to maximize the detection efficiency and control the selection effects, that is, the number of detected clusters compared to the total number of clusters.
The PhD student will join the Simons Observatory and CMB-S4 collaborations.
The biased Cosmic web, from theoretical modelling to observations
The study of the filamentary cosmic web is a paramount aspect of modern research in cosmology. With the advent of extremely large and precise cosmological datasets, now arriving (or due within months) notably from the Euclid space mission, it becomes feasible to study in detail the formation of cosmic structures through gravitational instability. In particular, fine non-linear aspects of this dynamics can be studied from a theoretical point of view, with the hope of detecting their signatures in real observations. One of the major difficulties in this regard is to make the link between the observed distribution of galaxies along filaments and the underlying matter distribution, for which first-principles models are known. Building on recent, state-of-the-art theoretical developments in gravitational perturbation theory and constrained random field theory, the successful candidate will develop first-principles predictions for statistical observables of the cosmic web (extrema counts, topological estimators, extrema correlation functions; e.g. Pogosyan et al. 2009, MNRAS 396, or Ayçoberry, Barthelemy & Codis 2024, A&A 686), applied to the actual discrete field of galaxies, which traces the total matter only in a biased manner. This model will then be applied to the analysis of Euclid data.
Fission yield measurements for decay heat evaluation of used nuclear fuel
The fission process involves the violent splitting of a heavy nucleus into two fission fragments, resulting in over 300 different isotopes. Understanding the distribution and production rates of these fragments, known as fission yields, is essential for grasping the underlying mechanisms of fission, which are influenced by nuclear structure and dynamics. Accurate measurements of fission yields are crucial for advancing nuclear energy applications, particularly for developing Generation IV reactors and recycling spent nuclear fuel. The VAMOS magnetic spectrometer enables precise fission-yield measurements thanks to its large acceptance and its isotopic identification capabilities. An experimental campaign at VAMOS in 2024 used beams of (^{238})U and (^{232})Th on a carbon target to produce fissioning actinides. The combination of VAMOS with a new silicon telescope (PISTA) significantly enhances the data quality. The candidate will analyze the VAMOS data to produce high-resolution fission yields and to study the associated uncertainties.
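As a rough illustration of what a fission-yield distribution looks like (the double-humped mass distribution characteristic of actinide fission), here is a toy double-Gaussian model. The peak positions and widths are invented; only the normalisation rule (two fragments per fission, so mass yields sum to 200%) is physical.

```python
import math

# Illustrative sketch, not evaluated data: for actinides such as 238U,
# the fission-fragment mass distribution is typically double-humped and
# can be crudely represented as a sum of two Gaussians, normalised so
# the total yield is 200% (two fragments per fission).

def toy_mass_yield(a, peaks=((96, 5.5), (140, 5.5))):
    """Unnormalised double-Gaussian mass yield at mass number a."""
    return sum(math.exp(-0.5 * ((a - mu) / sig) ** 2) for mu, sig in peaks)

masses = range(60, 181)
raw = [toy_mass_yield(a) for a in masses]
norm = 200.0 / sum(raw)                 # enforce sum(Y) = 200 %
yields = [norm * y for y in raw]        # yield in % per mass number
```

The two humps around the (invented) peak masses and the valley near symmetric splitting mimic the qualitative shape that a VAMOS-type measurement resolves isotope by isotope.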
High Harmonic Generation in cavity for an attosecond quantum source
Attophysics is at the forefront of time-resolved spectroscopy. Indeed, it harnesses the shortest light pulses that can be produced experimentally, thanks to the high harmonic generation (HHG) process. A standard way to trigger HHG is to submit an atomic system to an oscillating electromagnetic field whose strength is comparable to the Coulomb potential binding the electrons to their nuclei. This non-linear, non-perturbative optical effect produces broadband coherent radiation in the extreme ultraviolet (XUV) frequency range, which forms attosecond pulses (1 as = 10^-18 s). Since its discovery in the late 1980s, continuous experimental and theoretical efforts have been dedicated to reaching a complete understanding of this complex phenomenon. Despite the tremendous success of attosecond science, there is still no consensus on a quantum description of the process. We foresee that such a description of HHG would push forward our understanding of non-linear optics and open up new perspectives for attosecond science.
Systematic study of the neutron scattering reactions on structural materials of interest for nuclear reactor applications
Elastic and inelastic scattering reactions on structural materials have a significant impact on the simulation of neutron transport. The nuclear data of structural materials of interest for nuclear reactors and criticality studies must be known with good precision over a wide incident-neutron energy range, from a few tens of meV to several MeV. This thesis proposal aims to carry out a systematic study of the scattering reactions above the resolved resonance range and up to 5 MeV. In this energy range, neither the R-matrix formalism nor the statistical Hauser-Feshbach model is valid for structural materials. A new formalism will be developed using high-resolution measurements of the scattering angular distributions. This work will focus primarily on measurements already performed at the JRC-Geel facility (sodium [1], iron [2]) and will be extended to other elements studied within the framework of the IAEA/INDEN project, such as copper, chromium and nickel. As part of this thesis, the experimental database will be complemented by new measurements on the copper isotopes (Cu63 and Cu65), to be carried out at the JRC-Geel GELINA facility with the ELISA detector. Concerning the copper isotopes, integral benchmarks from the ICSBEP database have revealed several issues in the nuclear data libraries, which provide contradictory integral feedback on the nuclear data of U235. For example, the ZEUS benchmarks, which are routinely used to study the capture cross section of U235 in the fast neutron energy range, are very sensitive to the nuclear data of copper. This type of benchmark will provide an ideal framework for quantifying the impact of any new formalism developed to evaluate the nuclear data of structural materials.
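Measured scattering angular distributions are conventionally represented as Legendre-polynomial expansions (this is, for instance, how the ENDF-6 format stores them in File 4). A minimal sketch with invented coefficients:

```python
import math

def legendre(l, x):
    """Legendre polynomial P_l(x) via Bonnet's recursion."""
    if l == 0:
        return 1.0
    p_prev, p = 1.0, x
    for n in range(1, l):
        p_prev, p = p, ((2 * n + 1) * x * p - n * p_prev) / (n + 1)
    return p

def angular_distribution(mu, coeffs):
    """Normalised angular distribution (1/4pi) * sum_l (2l+1) a_l P_l(mu),
    where mu = cos(theta); a_0 = 1 so the solid-angle integral is 1."""
    return sum((2 * l + 1) * a * legendre(l, mu)
               for l, a in enumerate(coeffs)) / (4.0 * math.pi)

coeffs = [1.0, 0.3, 0.1]   # a_0 fixed by normalisation; a_1, a_2 invented
```

Fitting such coefficients energy point by energy point is how high-resolution angular-distribution data constrain the formalism to be developed.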
This study will allow the PhD student to develop skills in experimental and theoretical nuclear physics, as well as in neutron physics. The results will be communicated to the JEFF working group of the Nuclear Energy Agency (OECD/NEA).
[1] P. Archier, Contribution à l’amélioration des données nucléaires neutroniques du sodium pour le calcul des réacteurs de génération IV, PhD thesis, Université de Grenoble, 2011.
[2] G. Gkatis, Study of neutron induced reaction cross sections on Fe isotopes at the GELINA facility relevant to reactor applications, PhD thesis, Université Aix-Marseille, 2024.
Integral measurement of fission products capture cross-section using a combination of oscillation and activation techniques
This thesis is proposed as part of the POSEIDON (Fission Product Oscillation Experiments for Improving Depletion Calculations) project, which deals with the integral measurement of the neutron capture and scattering cross sections of the main fission products contributing to the reactivity loss in irradiated fuel. It consists of measuring the reactivity effect of separated-isotope samples using a pile oscillation device, coupled with neutron activation measurements, in three different core spectral configurations: thermal, PWR and epithermal.
Part of the work will be done at CEA IRESNE in Cadarache and part at the research centre CV Rez in the Czech Republic. The PhD student will be involved in testing and optimizing the oscillation device that is currently being designed, as well as in performing the measurements in the Czech LR-0 experimental reactor. The work at Cadarache will focus on the analysis of the measurements with Monte Carlo simulation tools. The functionalities needed for the data analysis will require additional code developments by the student.
The expected impact is a better prediction of the reactivity loss in reactor cores as a function of burn-up. Indeed, even with the most recent international nuclear data libraries, there is an important bias in the estimation of this reactivity loss.
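The quantity at stake is easy to state: reactivity is rho = (k_eff - 1)/k_eff, usually quoted in pcm (1 pcm = 1e-5), and the reactivity loss with burn-up is the difference between two such values. A sketch with invented k_eff values for fresh and irradiated configurations:

```python
# Reactivity in pcm from an effective multiplication factor k_eff.
# The k_eff values below are invented for illustration only.

def reactivity_pcm(k_eff):
    """rho = (k_eff - 1) / k_eff, expressed in pcm (1e-5)."""
    return (k_eff - 1.0) / k_eff * 1e5

k_fresh, k_irradiated = 1.02000, 1.00500   # hypothetical k_eff values
loss = reactivity_pcm(k_fresh) - reactivity_pcm(k_irradiated)
# 'loss' is the burn-up reactivity loss that improved fission-product
# capture data should help predict without bias.
```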
The PhD student will develop competences in experimental and theoretical neutronics. Subsequent career opportunities include R&D laboratories and the nuclear industry.