Modelling the redshift distribution of Euclid’s lensed galaxies for field-level analyses

The Euclid mission will deliver weak lensing data with unprecedented precision, which has the potential to revolutionise our understanding of dark energy and the growth of cosmic structures. Extracting the full information content of these data requires going beyond the standard analyses. To make optimal use of the data, the OCAPi project will analyse Euclid's lensing maps directly at the pixel level. This approach, known as field-level inference, captures all the information and provides up to five times tighter constraints on cosmological parameters (Porqueres et al. 2022, 2023).

This increased precision, however, requires accurate modelling of the data. One of the main calibration challenges in weak lensing surveys is the redshift distribution of the lensed galaxies. Current calibration methods were designed for the standard analyses and may not be sufficiently accurate for field-level techniques. Quantifying the accuracy requirements and developing methods capable of meeting them is essential to enable field-level analyses of Euclid data and to unlock the full scientific potential of the survey.

The goal of this PhD project is to develop a new redshift sampler for weak lensing, designed to meet the accuracy requirements of field-level inference. This sampler will combine physical models of galaxy populations with flexible machine-learning techniques. The thesis will contribute to maximising the potential of Euclid's weak lensing data and to advancing our understanding of the formation of cosmic structures.

Search for diffuse emissions and fundamental physics in very-high-energy gamma rays with H.E.S.S. and CTAO

Observations in very-high-energy (VHE, E>100 GeV) gamma rays are crucial for understanding the most violent non-thermal phenomena at work in the Universe. The central region of the Milky Way is a complex region active in VHE gamma rays. Among the VHE gamma-ray sources are the supermassive black hole Sagittarius A* at the heart of the Galaxy, supernova remnants and even star formation regions. The Galactic Center (GC) houses a cosmic-ray accelerator reaching PeV energies, diffuse emissions from GeV to TeV including the “Galactic Center Excess” (GCE) whose origin is still unknown, potential variable sources at TeV energies, as well as possible populations of sources not yet resolved (millisecond pulsars, intermediate-mass black holes). The GC is also expected to be the brightest source of annihilation signals from massive dark matter particles of the WIMP type. Lighter dark matter candidates, axion-like particles (ALPs), could convert into photons, and vice versa, in magnetic fields, leaving an oscillation imprint in the gamma-ray spectra of active galactic nuclei (AGN).
The H.E.S.S. observatory, located in Namibia, is composed of five imaging atmospheric Cherenkov telescopes. It is designed to detect gamma rays from a few tens of GeV to several tens of TeV. H.E.S.S. has observed the Galactic Center region for twenty years. These observations made it possible to detect the first Galactic PeVatron and to place the strongest constraints to date on the annihilation cross section of dark matter particles in the TeV mass range. The future CTAO observatory will be deployed on two sites, one in La Palma and the other in Chile. The latter, composed of more than 50 telescopes, will provide an unprecedented survey of the Galactic Center region.
The proposed work will focus on the analysis and interpretation of H.E.S.S. observations carried out in the Galactic Center region for the search for diffuse emissions (populations of unresolved sources, massive dark matter), as well as observations carried out towards a selection of active galactic nuclei for the search for ALPs constituting dark matter. These new analysis frameworks will be implemented for the CTAO data analyses. Involvement in the commissioning of the first MSTs in Chile and in the data analysis for early science is also expected.

Point Spread Function Modelling for Space Telescopes with a Differentiable Optical Model

Context

Weak gravitational lensing [1] is a powerful probe of the Large Scale Structure of our Universe. Cosmologists use weak lensing to study the nature of dark matter and its spatial distribution. Weak lensing missions require highly accurate shape measurements of galaxy images. The instrumental response of the telescope, called the point spread function (PSF), produces a deformation of the observed images. This deformation can be mistaken for the effects of weak lensing in the galaxy images, making the PSF one of the primary sources of systematic error in weak lensing science. Therefore, estimating a reliable and accurate PSF model is crucial for the success of any weak lensing mission [2]. The PSF field can be interpreted as a convolutional kernel, varying spatially, spectrally, and temporally, that affects each of our observations of interest. The PSF model needs to cope with each of these variations. To constrain the model, we use specific stars in the field of view that can be considered point sources. These stars, being unresolved objects, provide degraded samples of the PSF field. The observations undergo different degradations depending on the properties of the telescope, including undersampling, integration over the instrument passband, and additive noise. We build the PSF model from these degraded observations and then use it to infer the PSF at the positions of galaxies. This procedure constitutes the ill-posed inverse problem of PSF modelling. See [3] for a recent review of PSF modelling.
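
The degradation chain just described can be illustrated with a minimal numerical sketch (the Gaussian PSF profile, the 4x4 undersampling factor, and the noise level are all toy choices for illustration, not instrument values):

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_psf(size, sigma):
    """Simple Gaussian stand-in for a well-sampled PSF."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2
    psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return psf / psf.sum()  # unit flux

# High-resolution PSF on a fine grid: the quantity we want to model.
hi_res = gaussian_psf(size=64, sigma=4.0)

# Undersampling: the detector integrates 4x4 fine pixels into one CCD pixel.
lo_res = hi_res.reshape(16, 4, 16, 4).sum(axis=(1, 3))

# Additive noise completes the degradation chain; the inverse problem is to
# recover something like `hi_res` from many noisy star images like `observed`.
observed = lo_res + rng.normal(0.0, 1e-3, lo_res.shape)
```

In the real problem, many such degraded star images, each at a different field position, jointly constrain a single spatially varying PSF model.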

The recently launched Euclid survey represents one of the most complex challenges for PSF modelling. Because of the very broad passband of Euclid’s visible imager (VIS), ranging from 550 nm to 900 nm, PSF models need to capture not only the spatial variations of the PSF field but also its chromatic variations. Each star observation is integrated with the object’s spectral energy distribution (SED) over the whole VIS passband. As the observations are undersampled, a super-resolution step is also required. A recent model named WaveDiff [4], based on a differentiable optical model, was proposed to tackle the PSF modelling problem for Euclid. WaveDiff achieved state-of-the-art performance and is currently being tested with recent observations from the Euclid survey.
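
The passband integration can be sketched in the same toy spirit: a polychromatic PSF built as the SED-weighted sum of monochromatic PSFs whose width grows with wavelength. The Gaussian profiles, the linear width scaling, and the SED below are illustrative assumptions, not the Euclid optical model:

```python
import numpy as np

def gaussian_psf(size, sigma):
    """Toy monochromatic PSF (unit flux)."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2
    psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return psf / psf.sum()

wavelengths = np.linspace(550.0, 900.0, 8)       # nm, coarse passband sampling
sed = np.exp(-(wavelengths - 700.0) ** 2 / 2e4)  # made-up stellar SED
sed /= sed.sum()                                 # normalised weights

# Diffraction-like chromaticity: PSF width assumed proportional to wavelength.
poly_psf = sum(w * gaussian_psf(32, 2.0 * lam / 550.0)
               for w, lam in zip(sed, wavelengths))
```

Two stars at the same field position but with different SEDs thus produce different images, which is why chromatic variations must be part of the model.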

The James Webb Space Telescope (JWST) was recently launched and is producing outstanding observations. The COSMOS-Web collaboration [5] is a wide-field JWST treasury program that maps a contiguous 0.6 deg2 field. The COSMOS-Web observations are already available and provide a unique opportunity to test and develop a precise PSF model for JWST. In this context, several science cases beyond weak gravitational lensing can greatly benefit from a precise PSF model: for example, strong gravitational lensing [6], where the PSF plays a crucial role in source reconstruction, and exoplanet imaging [7], where PSF speckles can mimic the appearance of exoplanets, so that subtracting an accurate and precise PSF model is essential to improve exoplanet imaging and detection.

PhD project

The candidate will develop more accurate and performant PSF models for space-based telescopes, exploiting a differentiable optical framework, with a focus on Euclid and JWST.

The WaveDiff model is based on the wavefront space and does not consider pixel-based or detector-level effects. These pixel errors cannot be modelled accurately in the wavefront, as they arise directly on the detectors and are unrelated to the telescope’s optical aberrations. Therefore, as a first direction, we will extend the PSF modelling approach to include detector-level effects by combining a parametric and a data-driven (learned) approach. To accomplish this objective, we will exploit the automatic differentiation capabilities of the machine learning frameworks (e.g. TensorFlow, PyTorch, JAX) underlying the WaveDiff PSF model.

As a second direction, we will consider the joint estimation of the PSF field and the stellar spectral energy distributions (SEDs) by exploiting repeated exposures, or dithers. The goal is to improve and calibrate the original SED estimation by exploiting the PSF modelling information. We will rely on our PSF model and on repeated observations of the same object: each exposure images the star at a different focal-plane position, changing the star image, while the underlying SED remains the same.

Another direction will be to extend WaveDiff to more general astronomical observatories, such as JWST, with smaller fields of view. We will need to constrain the PSF model with observations from several bands, so that a unique PSF model is constrained by more information. The objective is to develop the next PSF model for JWST, available for widespread use, which we will validate with real data from the COSMOS-Web JWST program.

A further direction will be to extend the performance of WaveDiff by including a continuous field in the form of an implicit neural representation [8], or neural field (NeRF) [9], to address the spatial variations of the PSF in the wavefront space with a more powerful and flexible model.

Finally, throughout the PhD, the candidate will collaborate on Euclid’s data-driven PSF modelling effort, which consists of applying WaveDiff to real Euclid data, and with the COSMOS-Web collaboration to exploit JWST observations.

References
[1] R. Mandelbaum. “Weak Lensing for Precision Cosmology”. In: Annual Review of Astronomy and Astrophysics 56 (2018), pp. 393–433. doi: 10.1146/annurev-astro-081817-051928. arXiv: 1710.03235.
[2] T. I. Liaudat et al. “Multi-CCD modelling of the point spread function”. In: A&A 646 (2021), A27. doi: 10.1051/0004-6361/202039584.
[3] T. I. Liaudat, J.-L. Starck, and M. Kilbinger. “Point spread function modelling for astronomical telescopes: a review focused on weak gravitational lensing studies”. In: Frontiers in Astronomy and Space Sciences 10 (2023). doi: 10.3389/fspas.2023.1158213.
[4] T. I. Liaudat, J.-L. Starck, M. Kilbinger, and P.-A. Frugier. “Rethinking data-driven point spread function modeling with a differentiable optical model”. In: Inverse Problems 39.3 (Feb. 2023), p. 035008. doi: 10.1088/1361-6420/acb664.
[5] C. M. Casey et al. “COSMOS-Web: An Overview of the JWST Cosmic Origins Survey”. In: The Astrophysical Journal 954.1 (Aug. 2023), p. 31. doi: 10.3847/1538-4357/acc2bc.
[6] A. Acebron et al. “The Next Step in Galaxy Cluster Strong Lensing: Modeling the Surface Brightness of Multiply Imaged Sources”. In: ApJ 976.1 (Nov. 2024), p. 110. doi: 10.3847/1538-4357/ad8343. arXiv: 2410.01883 [astro-ph.GA].
[7] B. Y. Feng et al. “Exoplanet Imaging via Differentiable Rendering”. In: IEEE Transactions on Computational Imaging 11 (2025), pp. 36–51. doi: 10.1109/TCI.2025.3525971.
[8] Y. Xie et al. “Neural Fields in Visual Computing and Beyond”. In: arXiv e-prints (Nov. 2021). doi: 10.48550/arXiv.2111.11426. arXiv: 2111.11426 [cs.CV].
[9] B. Mildenhall et al. “NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis”. In: arXiv e-prints (Mar. 2020). doi: 10.48550/arXiv.2003.08934. arXiv: 2003.08934 [cs.CV].

Understanding the origin of the remarkable efficiency of distant galaxy formation

The James Webb Space Telescope is revolutionizing our understanding of the distant universe. One result has emerged that challenges our models: the extremely high efficiency of star formation in distant galaxies. However, this finding is derived indirectly: we measure the stellar mass of galaxies, not their star formation rate (SFR). This is the main weakness of the James Webb. The aim of this thesis is to remedy this weakness by exploiting the telescope's angular resolution, so far untapped for this purpose, to obtain a more robust measurement of the SFR of distant galaxies. We will first derive a law that improves the robustness of SFR determination from morphological properties, by combining data from the James Webb Space Telescope with data from ALMA (z=1-3). We will then apply it to the more distant universe (z=3-6, part 2) and use it as a benchmark for numerical simulations (part 3).

Chasing exo-aurorae

Aurorae are well-known optical phenomena on the Solar System planets. They have great diagnostic value, as their emissions reveal the planets’ atmospheric compositions, the presence of magnetic fields, and the solar wind conditions at the planet’s orbit. Looking for aurorae on exoplanets and brown dwarfs is the next frontier. A first breakthrough in this direction occurred recently, with the detection of CH4 emission attributed to auroral excitation on the brown dwarf W1935. This detection, and the prospect of observing other auroral features with existing and upcoming telescopes, motivates this project. In particular, we will build the first model dedicated to investigating CH4 and H3+ auroral emission on exoplanets and brown dwarfs. The model will be used to investigate the conditions at W1935 and to predict the detectability of aurorae on other substellar objects.

Exploring trends in rocky exoplanets observed with JWST

One of JWST’s major goals is to characterize, for the first time, the atmospheres of rocky, temperate exoplanets, a key milestone in the search for potentially habitable worlds. The temperate rocky exoplanets accessible to JWST are primarily those orbiting M-type stars. However, a major question remains regarding the ability of planets orbiting M-dwarfs to retain their atmospheres.
In 2024, an exceptional 500-hour Director’s Discretionary Time (DDT) program, entitled Rocky Worlds, was dedicated to this topic, underlining its strategic importance at the highest level (NASA, STScI).
The main objectives of this PhD project are to: 1) analyze all available JWST/MIRI eclipse data for rocky exoplanets from Rocky Worlds and other public programs using a consistent and homogeneous framework; 2) search for population-level trends in the observations and interpret them using 3D atmospheric simulations.
Through this work, we aim to identify the physical processes that control the presence and composition of atmospheres on temperate rocky exoplanets.

Numerical Study of Interstellar Turbulence in the Exascale Era

This PhD project aims to better understand interstellar medium turbulence, a key phenomenon governing the formation of stars and galactic structures. This turbulence—magnetized, supersonic, and multiphase—influences how energy is transferred and dissipated, thereby regulating the efficiency of star formation throughout the history of the Universe. Its study is complex, as it involves a wide range of spatial and temporal scales that are difficult to reproduce numerically. Advances in high-performance computing, particularly the advent of GPU-based exascale supercomputers, now make it possible to perform much more refined simulations.

The Dyablo code, developed at IRFU, will be used to carry out large-scale three-dimensional simulations with adaptive mesh refinement to resolve the regions where energy dissipation occurs. The study will progress in stages: first, simulations of simple isothermal flows will be conducted, followed by models that include heating, cooling, magnetic fields, and gravity. The turbulent properties will be analyzed using power spectra, structure functions, and density distributions, in order to better understand the formation of dense regions that give birth to stars. Finally, the work will be extended to the galactic scale, in collaboration with other French institutes, to investigate the large-scale energy cascade of turbulence across entire galaxies.
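
As an illustration of one of these diagnostics, a shell-averaged power spectrum of a 3D field can be computed by binning the squared Fourier amplitudes in shells of constant |k|. The Gaussian random field below is only a stand-in for a simulation snapshot, and the normalisation convention is one common choice among several:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32
field = rng.normal(size=(n, n, n))   # stand-in for a density or velocity cube

# Squared Fourier amplitudes, normalised so a unit-variance white-noise
# field has mean power ~1 per mode.
fk = np.fft.fftn(field)
power = np.abs(fk) ** 2 / field.size

# Integer wavenumber grid and shell index for each mode.
k = np.fft.fftfreq(n) * n
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
kbin = np.rint(np.sqrt(kx**2 + ky**2 + kz**2)).astype(int)

# Shell-averaged spectrum P(k): mean power in each |k| shell.
spectrum = np.array([power[kbin == i].mean() for i in range(1, n // 2)])
```

For a turbulent cube, the slope of `spectrum` versus k (e.g. Kolmogorov-like scalings) is one of the quantities compared across simulations.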

Cosmology with the Lyman-alpha forest from the DESI cosmological survey

We use the large-scale distribution of matter in the universe to test our cosmological models. This is primarily done using baryon acoustic oscillations (BAO), which are measured in the two-point correlation function of this distribution. However, the entire matter field contains information at various scales, allowing us to better constrain our models than BAO alone. At redshifts greater than 2, the Lyman-alpha forest is the best probe of this matter distribution. The Lyman-alpha forest is a set of absorption lines measured in the spectra of distant sources. The large DESI spectroscopic survey has collected approximately one million of these spectra. Using the partial data set "DR2," we measured the BAO with an accuracy of 0.7%, which strongly constrains the expansion rate of the universe during the first billion years of its evolution.

This thesis aims to exploit the full set of large-scale Lyman-alpha data from DESI to obtain the strongest possible constraints on cosmological models. First, the student will apply a method known as reconstruction to improve the accuracy of BAO measurements by exploiting information from the matter density field. For the remainder of the thesis, the student will implement a new method known as simulation-based inference; similar efforts have been carried out in our group with DESI galaxies. In this approach, the entire matter field is used directly to estimate cosmological parameters, particularly those of dark energy. The student will thus make an important contribution to DESI's final cosmological measurements with the Lyman-alpha forest.
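
In its simplest form, the simulation-based inference idea can be sketched as rejection sampling: draw parameters from the prior, run the simulator, and keep the draws whose simulated summary statistic matches the observed one. The Gaussian toy simulator, the summary statistic, and the tolerance below are illustrative choices, not the DESI pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulator(theta, n=200):
    """Toy forward model: data whose mean is the parameter `theta`."""
    return rng.normal(theta, 1.0, n)

observed = simulator(1.5)            # pretend this is the observed data
summary = observed.mean()            # compressed summary statistic

# Rejection sampling: accepted prior draws approximate the posterior.
prior_draws = rng.uniform(0.0, 3.0, 20000)           # flat prior on theta
distances = np.array([abs(simulator(t).mean() - summary)
                      for t in prior_draws])
posterior = prior_draws[distances < 0.05]
```

Modern simulation-based inference replaces the rejection step with neural density estimators, but the structure (prior draws, forward simulations, comparison to the data) is the same.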

An internship before beginning the thesis is preferred.

Unbiased Shear Estimation for Euclid with Automatically Differentiable and GPU Accelerated Modeling

This PhD project focuses on achieving unbiased measurements of weak gravitational lensing — the tiny distortions in galaxy shapes caused by the matter along the line of sight. This technique is key to studying dark matter, dark energy, and gravity, and lies at the heart of the Euclid space mission launched in 2023. Traditional shape-measurement methods introduce systematic biases in shear estimation. The goal of this PhD is to develop and extend an innovative forward-modelling approach that directly infers the shear by simulating realistic galaxy images using deep-learning architectures. The student will adapt this framework to real Euclid data, accounting for the complexity of the Science Ground Segment (SGS) and implementing GPU-accelerated and high-performance computing solutions to scale to the full sky coverage. The project is timely, coinciding with Euclid’s first public data release in 2026. The expected outcome is a more accurate and robust shear estimation method, enabling the next generation of precision cosmology analyses.
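
The basic idea of inferring shear from galaxy images can be sketched with a noise-free toy example: shear a round Gaussian galaxy, then recover the shear from the image's second moments. Real analyses must also handle the PSF, noise, and pixelization, which is precisely where the forward-modelling and deep-learning machinery of the project enters; everything below (profile, sizes, shear value) is illustrative:

```python
import numpy as np

def sheared_gaussian(g1, size=64, sigma=6.0):
    """Image of a round Gaussian galaxy sheared by (g1, 0)."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2
    # Inverse shear maps image coordinates back to the round source.
    xs, ys = (1 - g1) * x, (1 + g1) * y
    return np.exp(-(xs**2 + ys**2) / (2 * sigma**2))

img = sheared_gaussian(g1=0.05)

# Flux-weighted second moments of the image.
y, x = np.mgrid[:64, :64] - 31.5
norm = img.sum()
qxx = (img * x * x).sum() / norm
qyy = (img * y * y).sum() / norm

# Ellipticity estimate; for this definition it equals the reduced shear.
g1_est = (qxx - qyy) / (qxx + qyy + 2 * np.sqrt(qxx * qyy))
```

Moment-based estimators like this one become biased once noise and the PSF are included, which is the motivation for forward-modelling the full image instead.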

Spectro-temporal analysis of Gamma-Ray Burst afterglows detected with SVOM

Gamma-Ray Bursts (GRBs) are the most powerful explosions in the Universe. They last a few tens of seconds and emit the same amount of energy as the Sun does during its entire lifetime. The gamma-ray emission is followed by a long-lasting (hours to days) emission from the X-rays to the radio band. This "afterglow" emission is rich in information about the GRB's nearby environment and host galaxy. SVOM (Space-based astronomical Variable Object Monitor) is a Sino-French mission dedicated to GRB studies, successfully launched in June 2024. It carries a multi-wavelength payload covering the gamma-ray, X-ray, and optical bands, and includes two dedicated ground-based robotic telescopes in Mexico and China.
The PhD project focuses on the exploitation of SVOM data for GRBs. The successful candidate will join the MXT science team at DAp. MXT is a new type of X-ray telescope for which DAp is responsible; its Instrument Centre is also hosted at DAp.
The PhD student will participate actively in the spectral and temporal analysis of MXT data. These data will be compared with the other data acquired by the SVOM collaboration, especially in the optical and infrared domains. This dataset will support the physical interpretation of GRBs. More specifically, modelling of the energy injection in the first phases of the afterglow will be used to determine the nature of the compact object at the origin of the relativistic outflow generating the observed electromagnetic emission.
