Project Details
MARTIAN: Machine-Learning-Based Automated Microscopy for Real-Time 3D Particle Tracking, Analysis, and Feedback Control of Dynamical Systems
Applicant
Dr.-Ing. Özgün Yavuz
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Methods in Artificial Intelligence and Machine Learning
Optics, Quantum Optics and Physics of Atoms, Molecules and Plasmas
Term
since 2026
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 571772459
This project aims to develop an experimental platform that integrates ultrafast hybrid microscopy with machine-learning (ML) frameworks for real-time three-dimensional tracking, pattern analysis, and control of driven colloidal systems. While bright-field microscopy is effective for lateral imaging, it struggles to resolve depth information because of limited contrast at small focal separations. Interferometric and holographic techniques, particularly when enhanced by ML, improve sensitivity to phase variations; however, analysing densely packed particle assemblies remains difficult.

We will focus on an experimental system consisting of polystyrene nanoparticles confined between glass slides and driven by Marangoni flows induced by ultrafast laser pulses. Under these conditions, complex self-organised structures such as Moiré patterns and quasi-periodic arrangements emerge, and capturing the three-dimensional dynamics of these evolving patterns requires new imaging strategies.

We will develop a hybrid illumination system combining coherent (holographic) and incoherent (bright-field) imaging, enabling ultrafast acquisition with sub-10 ms exposures. A GPU-based data pipeline will provide the low-latency processing essential for real-time feedback.

We will design ML algorithms for particle localisation, depth estimation, and pattern classification. Model-based tracking methods will be employed first, followed by deep-learning techniques, such as ResNet and YOLO, adapted for high-speed, high-density imaging. Models will be trained on synthetic datasets generated by Mie scattering theory and finite-difference time-domain (FDTD) simulations, alongside experimental data. Feature-extraction methods, including bond-orientational and Voronoi analysis, will feed into supervised classifiers such as random forests.
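As an illustration of the bond-orientational feature extraction mentioned above, the sketch below computes the hexatic order parameter |ψ6| per particle; the features it produces are the kind one might feed to a random-forest classifier. This is a minimal example, not the project's implementation: the k-nearest-neighbour definition of the bond set and all function names are assumptions.

```python
import numpy as np

def psi6(points, k=6):
    """Per-particle hexatic bond-orientational order parameter |psi_6|.

    For each particle, the k nearest neighbours define bond angles theta_j;
    psi_6 = (1/k) * sum_j exp(i * 6 * theta_j). |psi_6| approaches 1 for a
    perfect hexagonal environment and 0 for a disordered neighbourhood.
    """
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-distances
    nbr = np.argsort(d, axis=1)[:, :k]          # indices of k nearest neighbours
    vec = pts[nbr] - pts[:, None, :]            # bond vectors to each neighbour
    theta = np.arctan2(vec[..., 1], vec[..., 0])
    return np.abs(np.exp(6j * theta).mean(axis=1))

# Triangular (hexagonal) lattice patch: interior sites give |psi_6| ~ 1
a1, a2 = np.array([1.0, 0.0]), np.array([0.5, np.sqrt(3) / 2])
lattice = np.array([i * a1 + j * a2 for i in range(10) for j in range(10)])
order = psi6(lattice)   # one |psi_6| value per particle
```

The O(N²) distance matrix is only for clarity; a real-time pipeline would use a spatial tree or a Voronoi/Delaunay neighbour list instead.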
Integrating information from hybrid illumination will improve three-dimensional structural recognition, particularly for complex, multilayered systems. Finally, reinforcement learning (RL) strategies based on actor–critic models will dynamically optimise experimental parameters, guiding pattern formation and discovering rare dynamic events. Entropy-based metrics will be incorporated into reward functions to detect transitions. By integrating real-time imaging, ML analysis, and experimental control within a closed-loop framework, this project will create a new methodology for exploring non-equilibrium systems relevant to physics, biology, and materials science.
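The entropy-based reward idea can be sketched as follows: a coarse-grained configurational entropy is computed per frame, and its decrease between frames (increasing order) is used as a reward signal. This is an illustrative example under assumed choices; the density-histogram observable, bin counts, and function names are hypothetical, not the project's actual reward design.

```python
import numpy as np

def config_entropy(points, bins=8, extent=1.0):
    """Shannon entropy of a coarse-grained particle density histogram.

    A crystalline configuration concentrates particles in few occupied cells
    (low entropy); a disordered gas spreads them out (high entropy). A sharp
    drop in this metric between frames can flag an ordering transition.
    """
    h, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                             bins=bins, range=[[0, extent], [0, extent]])
    p = h.ravel() / h.sum()
    p = p[p > 0]                      # drop empty cells before taking the log
    return float(-(p * np.log(p)).sum())

def entropy_reward(prev_pts, curr_pts, bins=8, extent=1.0):
    """Reward proportional to the entropy decrease between two frames."""
    return (config_entropy(prev_pts, bins, extent)
            - config_entropy(curr_pts, bins, extent))

rng = np.random.default_rng(0)
gas = rng.uniform(0.0, 1.0, size=(400, 2))            # disordered frame
cluster = 0.5 + 0.02 * rng.standard_normal((400, 2))  # tightly ordered frame
r = entropy_reward(gas, cluster)    # positive: the system became more ordered
```

In an actor–critic loop, a scalar signal of this kind would enter the reward alongside task-specific terms, with the critic estimating its expected return under the current control policy.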
DFG Programme
Research Grants
