Source clustering impact on Euclid weak lensing high-order statistics

In the coming years, the Euclid mission will provide measurements of the shapes and positions of billions of galaxies with unprecedented precision. As the light from background galaxies travels through the Universe, it is deflected by the gravity of cosmic structures, distorting the apparent shapes of the galaxies. This effect, known as weak lensing, is one of the most powerful cosmological probes of the next decade, and it can answer some of the biggest questions in cosmology: What are dark matter and dark energy, and how do cosmic structures form?
The standard approach to weak lensing analysis is to fit the two-point statistics of the data, such as the correlation function of the observed galaxy shapes. However, this data compression is suboptimal and discards a large amount of information. This has led to the development of several approaches based on high-order statistics, such as third moments, wavelet phase harmonics, and field-level analyses. These techniques provide more precise constraints on the parameters of the cosmological model (Ajani et al. 2023). Yet with their increased precision, these methods become sensitive to systematic effects that were negligible in standard two-point analyses.
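As a flavour of the information that two-point compression throws away, the toy sketch below (illustrative only, unrelated to the actual Euclid pipeline) builds two white-noise fields with equal variance, and hence the same flat power spectrum, one Gaussian and one not; their two-point statistics agree, while the third moment separates them.

```python
# Toy illustration (not the Euclid pipeline): two white-noise fields with
# equal variance, and hence the same flat power spectrum, one Gaussian and
# one non-Gaussian. Two-point statistics cannot tell them apart; the third
# moment can.
import numpy as np

rng = np.random.default_rng(0)
n = 256
gaussian = rng.normal(size=(n, n))

# Non-Gaussian field: square the Gaussian field, subtract the mean, and
# rescale so both fields have the same variance.
non_gaussian = gaussian**2 - 1.0
non_gaussian *= gaussian.std() / non_gaussian.std()

for name, field in [("Gaussian", gaussian), ("non-Gaussian", non_gaussian)]:
    variance = field.var()            # two-point information
    third = np.mean(field**3)         # high-order information
    print(f"{name:13s}  variance = {variance:.3f}  third moment = {third:.3f}")
```

Both fields return the same variance, while the third moment vanishes only for the Gaussian one.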
One of these systematics is source clustering, which refers to the non-uniform distribution of the galaxies observed in weak lensing surveys. Rather than being uniformly distributed, the observed galaxies trace the underlying matter density. This clustering causes a correlation between the lensing signal and the galaxy number density, leading to two effects: (1) it modulates the effective redshift distribution of the galaxies, and (2) it correlates the galaxy shape noise with the lensing signal. Although this effect is negligible for two-point statistics (Krause et al. 2021; Linke et al. 2024), it significantly impacts the results of high-order statistics (Gatti et al. 2023). Therefore, accurate modelling of source clustering is critical to applying these new techniques to Euclid’s weak lensing data.
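The mechanism fits in a few lines. In the toy sketch below (a minimal illustration, not the project's framework; the grid size, galaxy bias, and toy convergence are all assumptions), galaxies drawn in proportion to 1 + b*delta sample the lensing field preferentially in overdense regions, so the moments measured at galaxy positions are biased relative to a uniform sample.

```python
# Toy sketch of source clustering (illustrative only; the grid size, galaxy
# bias, and toy convergence below are assumptions, not project choices).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
n = 512
delta = gaussian_filter(rng.normal(size=(n, n)), 4)  # toy matter density contrast
delta /= delta.std()
kappa = 0.1 * delta   # toy lensing convergence, correlated with the density

# Galaxies trace the matter: sampling probability proportional to 1 + b*delta,
# clipped at zero. A uniform sample serves as the unclustered reference.
bias = 1.5
prob = np.clip(1.0 + bias * delta, 0.0, None).ravel()
prob /= prob.sum()

n_gal = 100_000
uniform = rng.integers(0, n * n, size=n_gal)
clustered = rng.choice(n * n, size=n_gal, p=prob)

for name, idx in [("uniform", uniform), ("clustered", clustered)]:
    k = kappa.ravel()[idx]
    print(f"{name:9s}  <kappa> = {k.mean():+.4f}  <kappa^3> = {np.mean(k**3):+.2e}")
```

The clustered sample picks up a positive mean convergence and biased higher moments purely because the sources sit preferentially in overdense regions; in a real survey the same mechanism also reshapes the effective redshift distribution and couples the shape noise to the signal.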
In this project, we will develop an inference framework to model source clustering and assess its impact on cosmological constraints from high-order statistics. The objectives of the project are:
1. Develop an inference framework that populates dark matter fields with galaxies, accurately modelling the non-uniform distribution of background galaxies in weak lensing surveys.
2. Quantify the impact of source clustering on the cosmological parameters inferred from wavelet transforms and field-level analyses (a toy sketch of a wavelet-moment data vector follows this list).
3. Incorporate source clustering in emulators of the matter distribution to enable accurate data modelling in the high-order statistics analyses.
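As an illustration of the kind of data vector meant in objective 2, the sketch below computes second and third moments at several scales, assuming a difference-of-Gaussians band-pass as a stand-in for the actual wavelet transform and a white-noise map as a stand-in for a convergence map.

```python
# Minimal sketch of a multi-scale moment data vector (assumptions: a
# difference-of-Gaussians band-pass as a stand-in for the wavelet transform,
# and a white-noise map as a stand-in for a convergence map).
import numpy as np
from scipy.ndimage import gaussian_filter

def wavelet_moments(kappa, scales=(2, 4, 8, 16)):
    """Second and third moments of band-pass filtered maps, one pair per scale."""
    moments = []
    for s in scales:
        band = gaussian_filter(kappa, s) - gaussian_filter(kappa, 2 * s)
        moments.append((band.var(), np.mean(band**3)))
    return moments

rng = np.random.default_rng(2)
kappa = rng.normal(size=(512, 512))   # stand-in for a convergence map
for scale, (m2, m3) in zip((2, 4, 8, 16), wavelet_moments(kappa)):
    print(f"scale = {scale:2d}  m2 = {m2:.3e}  m3 = {m3:+.3e}")
```

Concatenating these per-scale moments across redshift bins is what turns the maps into a compressed summary that a likelihood or simulation-based inference can digest.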
With these developments, this project will improve the accuracy of cosmological analyses and the realism of the data modelling, making high-order statistics analyses possible for Euclid data.

Numerical twin for the Flame Spray Pyrolysis process

Our ability to manufacture metal oxide nanoparticles (NPs) with well-defined composition, morphology, and properties is key to accessing new materials with potentially revolutionary technological impact, for example in photocatalysis or energy storage. Among the available nanopowder production technologies, Flame Spray Pyrolysis (FSP) is a promising option for the industrial synthesis of NPs. This synthesis route is based on the rapid evaporation of a solution (solvent plus precursors) atomized into droplets in a pilot flame, yielding nanoparticles. Unfortunately, mastery of the FSP process is currently limited by the vast space of operating conditions to explore for the multitude of target nanoparticles. In this context, the objective of this thesis is to develop the experimental and numerical framework required for the future deployment of artificial intelligence in the control of FSP systems. To this end, the different phenomena taking place in the synthesis flames during nanoparticle formation will be simulated, in particular by means of fluid dynamics calculations. Ultimately, the project aims to create a digital twin of the process, providing a predictive approach for choosing the synthesis parameters needed to obtain the desired material. This will drastically reduce the number of experiments to be carried out and, consequently, the time needed to develop new grades of materials.
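As a flavour of the sub-models that such fluid dynamics calculations couple to the flame and particle dynamics, the sketch below integrates the classical d²-law for single-droplet evaporation; this is a textbook toy, not the solver to be developed, and the diameter and evaporation coefficient are assumed order-of-magnitude values.

```python
# Illustrative only: the classical d^2-law for single-droplet evaporation,
# D(t)^2 = D0^2 - K*t, the textbook starting point for spray vaporization
# models. D0 and K are assumed order-of-magnitude values, not measured data.
import numpy as np

D0 = 20e-6    # initial droplet diameter [m] (assumed)
K = 1.0e-7    # evaporation coefficient [m^2/s] (assumed)

t_life = D0**2 / K                 # droplet lifetime predicted by the d^2-law
t = np.linspace(0.0, t_life, 6)
D = np.sqrt(np.maximum(D0**2 - K * t, 0.0))

for ti, Di in zip(t, D):
    print(f"t = {ti * 1e3:6.3f} ms   D = {Di * 1e6:6.2f} um")
```

A digital twin would replace this constant-K toy with coupled models of atomization, evaporation, precursor chemistry, and particle growth resolved within the flame's flow field.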
