



This PhD aims to design analog liquid neural networks for generative edge intelligence. Current neuromorphic architectures, although made more efficient by in-memory computing, remain limited by their extreme parameter density and interconnection complexity, which makes their hardware implementation costly and hard to scale. Liquid Neural Networks (LNNs), introduced at MIT at the algorithmic level, represent a breakthrough: continuous-time dynamic neurons that adjust their internal time constants according to the input signal, drastically reducing the number of parameters required.
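The input-dependent time constant described above can be illustrated with a minimal Euler-integration sketch of a liquid time-constant (LTC) neuron in the style of the MIT work; all parameter names and sizes here are illustrative assumptions, not taken from a specific LNN library:

```python
import numpy as np

def ltc_step(x, I, dt, tau, W, b, A):
    """One Euler step of a liquid time-constant (LTC) neuron.

    The gate f depends on the current state and input, so the
    effective time constant 1/tau + f varies with the signal --
    the adaptive behavior described in the text.
    """
    z = W @ np.concatenate([x, I]) + b
    f = 1.0 / (1.0 + np.exp(-z))           # bounded sigmoid gate
    dx = -(1.0 / tau + f) * x + f * A      # LTC-style ODE right-hand side
    return x + dt * dx

# Usage: a 2-neuron cell driven by a 1-D sinusoidal input.
rng = np.random.default_rng(0)
x = np.zeros(2)
W = rng.normal(size=(2, 3))
b = np.zeros(2)
A = np.ones(2)
for t in range(100):
    x = ltc_step(x, np.array([np.sin(0.1 * t)]), 0.1, 1.0, W, b, A)
```

Because the gate is bounded, the effective leak rate stays between 1/tau and 1/tau + 1, which keeps the dynamics stable and is one reason LTC cells need so few parameters.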
The goal of this PhD is to translate LNN algorithms into circuit-level implementations by developing ultra-low-power, oscillator-based time-mode cells that reproduce liquid dynamics, and by interconnecting them into a stable recurrent architecture targeting generative AI tasks. A silicon demonstrator will be designed and validated, paving the way for a new generation of liquid neuromorphic systems for Edge AI.
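One common way such time-mode cells work is to encode a neuron's state in the phase difference between a frequency-modulated oscillator and a fixed reference. The behavioral sketch below (all frequencies and gains are hypothetical, not from the project) shows how frequency modulation turns an oscillator pair into an integrator, the core operation a liquid neuron's ODE needs:

```python
import numpy as np

def phase_integrator(u, dt=1e-6, f_ref=1e6, k=1e5):
    """Behavioral model of a time-mode integrator.

    The input u modulates the signal oscillator's frequency; the
    accumulated phase difference from a reference oscillator is the
    integral of u, scaled by 2*pi*k.  Parameters are illustrative.
    """
    phi_sig, phi_ref = 0.0, 0.0
    trace = []
    for u_t in u:
        phi_sig += 2 * np.pi * (f_ref + k * u_t) * dt  # FM oscillator
        phi_ref += 2 * np.pi * f_ref * dt              # reference
        trace.append(phi_sig - phi_ref)                # state = phase diff
    return np.array(trace)
```

A constant input produces a linearly growing phase difference, i.e. the cell behaves as an ideal integrator; adding a leak and an input-dependent gain around such a cell is one route to the liquid dynamics targeted here.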

