Autonomous perception in spaceborne, airborne and maritime environments poses a fundamental state estimation problem: reconstructing 3D structure and motion from incomplete, noisy multi‑sensor data. The PhD project will develop physics‑aware multi‑sensor (optical, inertial, radar, acoustic, RF) perception and fusion methods for satellite attitude and relative motion estimation, as well as for the navigation of aerial and maritime drones. The work will involve constructing and using physically grounded simulation and synthetic data pipelines, which provide controlled settings for analysing the inference, robustness and generalisation behaviour of candidate algorithms. Building on this, the research will focus on physics‑constrained learning and hybrid inference schemes that incorporate physical constraints and dynamical models into data‑driven architectures, with the aim of achieving robust, interpretable and transferable state estimation across diverse operational scenarios.
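To make the hybrid physics/data idea concrete, the following is a minimal illustrative sketch (not part of the project description) of how a dynamical model can be fused with noisy measurements: a linear Kalman filter with a constant‑velocity motion model, the simplest instance of the state estimation problem described above. All parameter values here are made up for illustration.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    The transition matrix F encodes the physics (here: constant-velocity
    motion); the update step fuses a noisy position measurement z,
    weighted by the Kalman gain.
    """
    # Predict: propagate state and covariance through the dynamics model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fuse the measurement.
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative setup: state = [position, velocity], position-only sensing.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # physics: position += velocity * dt
H = np.array([[1.0, 0.0]])              # sensor observes position only
Q = 1e-4 * np.eye(2)                    # process noise covariance
R = np.array([[0.05]])                  # measurement noise covariance

rng = np.random.default_rng(0)
x_true = np.array([0.0, 1.0])           # true position and velocity
x_est = np.array([0.0, 0.0])            # filter starts with unknown velocity
P = np.eye(2)

for _ in range(200):
    x_true = F @ x_true
    z = H @ x_true + rng.normal(0.0, np.sqrt(R[0, 0]), size=1)
    x_est, P = kalman_step(x_est, P, z, F, H, Q, R)

# The dynamics model lets the filter infer the unobserved velocity
# from noisy position data alone.
```

The physics‑constrained learning approaches envisaged in the project generalise this pattern: instead of a fixed linear model F, parts of the dynamics or measurement model are learned from data while physical constraints keep the estimates consistent and interpretable.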
Candidates should have a strong background in theoretical and/or experimental physics, solid programming and numerical skills, and a strong motivation to work on autonomous systems, signal processing and machine learning.

