Project Details

Faster and cheaper cosmological data analysis thanks to a physics-driven machine learning strategy

Subject Area Astrophysics and Astronomy
Term since 2021
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 456132154
 
The cosmology community runs Einstein-Boltzmann solvers on a massive scale on computing clusters, in the context of cosmological parameter inference from Large Scale Structure and Cosmic Microwave Background data. Any significant speed-up of these codes would result in dramatic efficiency gains and computational cost savings. Despite ongoing efforts to reformulate parts of these codes optimally, they still have one bottleneck: the integration of the system of differential equations describing the evolution of cosmological perturbations for the few largest wavenumbers in the problem. This step cannot be scaled down by clever parallelisation schemes. We propose to replace this bottleneck with trained neural networks. Our strategy is more universal than previous attempts, because the neural networks will only replace one intermediate step, which depends on a reduced number of model parameters (and not on any parameter tied to a particular observable of a specific experiment). We will approach the problem with a physics-driven strategy: semi-analytical results from cosmological perturbation theory will guide the design of very efficient networks, in which good precision can be achieved with small training sets. We will release a module that can be interfaced with any Einstein-Boltzmann solver to speed it up. We will also adapt the structure of parameter inference codes to take full advantage of this speed-up during parameter extraction runs, even when models beyond the previously existing training set are considered. Finally, we will set up a global strategy to avoid duplicate runs throughout the worldwide cosmology community.
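The core idea above, replacing an expensive intermediate computation with a cheap trained surrogate, can be sketched in miniature. In this illustrative example (not the project's actual code), the analytic BBKS fitting formula for the matter transfer function stands in for the expensive differential-equation integration, and a tiny one-hidden-layer neural network trained by plain gradient descent stands in for the project's physics-designed emulators. All names and choices (layer size, learning rate, parameter ranges) are assumptions for the sketch:

```python
import numpy as np

# Toy "expensive" quantity: the BBKS matter transfer function T(k),
# standing in for an Einstein-Boltzmann solver's intermediate output.
def bbks_transfer(k, omega_m=0.14):          # k in h/Mpc, omega_m = Omega_m h^2
    q = k / omega_m
    return (np.log(1.0 + 2.34 * q) / (2.34 * q)
            * (1.0 + 3.89 * q + (16.1 * q) ** 2
               + (5.46 * q) ** 3 + (6.71 * q) ** 4) ** -0.25)

rng = np.random.default_rng(0)

# Small training set: standardized log10(k) -> standardized log10(T).
logk = np.linspace(-3.0, 0.0, 200)
x = (logk - logk.mean()) / logk.std()
y = np.log10(bbks_transfer(10.0 ** logk))
y_mu, y_sd = y.mean(), y.std()
t = (y - y_mu) / y_sd

# Tiny one-hidden-layer MLP emulator trained by full-batch gradient descent.
W1 = rng.normal(0.0, 1.0, (16, 1)); b1 = np.zeros((16, 1))
W2 = rng.normal(0.0, 0.1, (1, 16)); b2 = np.zeros((1, 1))
X, T = x[None, :], t[None, :]
lr, n = 0.1, X.shape[1]
mse0 = None
for step in range(5000):
    H = np.tanh(W1 @ X + b1)                 # hidden activations
    P = W2 @ H + b2                          # emulator prediction
    err = P - T
    mse = np.mean(err ** 2)
    if mse0 is None:
        mse0 = mse                           # remember the initial loss
    dP = 2.0 * err / n                       # d(MSE)/dP
    gW2 = dP @ H.T; gb2 = dP.sum(axis=1, keepdims=True)
    dH = (W2.T @ dP) * (1.0 - H ** 2)        # backprop through tanh
    gW1 = dH @ X.T; gb1 = dH.sum(axis=1, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def emulate_T(k):
    """Cheap surrogate for bbks_transfer, undoing the standardization."""
    xs = (np.log10(k) - logk.mean()) / logk.std()
    h = np.tanh(W1 @ np.atleast_2d(xs) + b1)
    return 10.0 ** ((W2 @ h + b2).ravel() * y_sd + y_mu)

print(f"training MSE: {mse0:.3f} -> {mse:.5f}")
```

Once trained, `emulate_T` costs a few matrix products instead of an integration, which is the kind of drop-in replacement the proposed module would provide to any Einstein-Boltzmann solver.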
By setting up public repositories and adding communication modules to parameter inference codes, we will ensure that the range of validity of the training set and of the neural networks grows with time, without generating any extra computing load for any group, not even the group hosting the central repository. This scheme can lead to massive efficiency gains and cost savings worldwide: we estimate that the cosmology community could save on the order of a billion CPU-hours every year thanks to this project, and our overall strategy could be replicated in other fields. We will illustrate the performance of the new approach in various cosmological data analyses (parameter inference and sensitivity forecasts), and we will propose its implementation in the pipelines of collaborations such as Euclid.
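The duplicate-run avoidance scheme reduces to a lookup-before-compute pattern: hash the reduced model-parameter set, query the shared repository, and only run the solver (contributing the result back) on a miss. The following is a minimal sketch of that pattern; the in-memory dict stands in for a public store, and all names (`solve_or_fetch`, `param_key`) are illustrative assumptions, not the project's actual interface:

```python
import hashlib
import json

repository = {}   # stand-in for a shared, public result store

def param_key(params):
    """Canonical hash of the reduced model-parameter set."""
    blob = json.dumps(params, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def solve_or_fetch(params, expensive_solver):
    """Return (result, was_cached); contribute fresh results to the store."""
    key = param_key(params)
    if key in repository:                 # duplicate run avoided
        return repository[key], True
    result = expensive_solver(params)     # fall back to the real computation
    repository[key] = result              # contribute for the next group
    return result, False

# Toy solver: any deterministic function of the model parameters.
toy_solver = lambda p: p["omega_m"] * 2.0

r1, cached1 = solve_or_fetch({"omega_m": 0.14}, toy_solver)
r2, cached2 = solve_or_fetch({"omega_m": 0.14}, toy_solver)
print(cached1, cached2)
```

Keying on the reduced model parameters alone (rather than on experiment-specific settings) is what lets results be shared across groups and observables, which is the point of placing the emulated step upstream of any particular likelihood.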
DFG Programme Research Grants
 
 
