Development of a generally applicable machine learning potential with accurate long-range electrostatic interactions
Theoretical Chemistry: Electronic Structure, Dynamics, Simulation
Final Report Abstract
Machine learning potentials (MLPs) have become an important tool for atomistic simulations in chemistry and materials science, as they make it possible to transfer the accuracy of quantum mechanical electronic structure methods to complex systems containing large numbers of atoms. A severe limitation of essentially all MLPs available at the start of this project was their restriction to local properties such as atomic energies and charges, which are expressed as functions of the geometric atomic environments up to a cutoff radius. This locality prevents the construction of MLPs for systems exhibiting non-local phenomena such as long-range charge transfer, which are important in many systems, from organic molecules and their reactions, e.g., protonation and deprotonation, to doped materials such as semiconductors. Moreover, conventional MLPs are usually restricted to a fixed global charge state of the system. In this project, we have overcome these limitations by developing a novel type of high-dimensional neural network potential (HDNNP) that combines the advantages of local HDNNPs based on environment-dependent atomic energies, i.e., the accurate description of local bonding, with those of the charge equilibration neural network technique (CENT), which was the first MLP able to deal with long-range charge transfer but is conceptually restricted to ionic systems. We have combined both methods in a consistent way by employing a global charge equilibration step similar to the CENT approach, but using reference atomic charges as target properties. As in CENT, the required atomic electronegativities are expressed by atomic neural networks, and the resulting atomic charges can be used to compute the long-range electrostatic energy, taking the global structure of the system into account.
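The global charge equilibration step described above amounts to minimizing a second-order charge-dependent energy under a total-charge constraint, which reduces to a linear system. The following is a minimal sketch of that solve for an isolated (non-periodic) system, assuming a bare point-charge 1/r Coulomb kernel in atomic units; the function name and parameters are illustrative, and the actual method uses environment-dependent electronegativities from atomic neural networks, Gaussian charge densities, and Ewald summation for periodic systems.

```python
import numpy as np

def equilibrate_charges(positions, chi, hardness, total_charge=0.0):
    """Solve a charge-equilibration (Qeq-type) linear system (sketch).

    Minimizes E(q) = sum_i (chi_i * q_i + 0.5 * eta_i * q_i**2)
                   + sum_{i<j} q_i * q_j / r_ij
    subject to sum_i q_i = total_charge, via a Lagrange multiplier.
    chi: atomic electronegativities; hardness: atomic hardnesses eta.
    """
    chi = np.asarray(chi, dtype=float)
    eta = np.asarray(hardness, dtype=float)
    n = len(chi)
    # pairwise 1/r Coulomb kernel, self-interaction excluded
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(r, np.inf)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = 1.0 / r
    A[np.arange(n), np.arange(n)] = eta   # hardness on the diagonal
    A[:n, n] = 1.0                        # charge-conservation constraint
    A[n, :n] = 1.0
    b = np.concatenate([-chi, [total_charge]])
    sol = np.linalg.solve(A, b)
    return sol[:n]                        # atomic charges; sol[n] is the multiplier

# toy diatomic: the atom with higher electronegativity acquires negative charge
pos = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]])
q = equilibrate_charges(pos, chi=[0.5, -0.5], hardness=[1.0, 1.0])
```

Because the system is linear, the charges respond globally: changing the electronegativity of one atom shifts the charges on all others, which is exactly the non-local behavior that purely local MLPs cannot capture.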
In addition to environment descriptors such as atom-centered symmetry functions, these atomic charges are then used in a second step as additional input for the atomic neural networks yielding the short-range interactions between the atoms, in a way that is fully consistent with the global charge distribution. Thus, by modifying the training process of CENT as well as the environment description in HDNNPs, and by combining these two components, a new type of HDNNP suitable for the description of non-local charge transfer has been introduced in this project. The capabilities of the method have been tested for a series of non-periodic and periodic model systems, including small molecules and clusters as well as adsorption on an oxide slab. The method has been further improved by making use of the charge distribution in the atomic environments: the element-resolved electrostatic potential arising from the charge distribution in the chemical environments has been used as additional input for the atomic neural networks, as illustrated for the example of a potential energy surface for sodium chloride. Finally, to distinguish the large variety of MLPs now available in the literature, we have suggested a new classification scheme introducing four generations of MLPs. The first generation, which has been in use for almost 30 years, is restricted to very small systems containing up to about six atoms and their degrees of freedom. MLPs suitable for high-dimensional systems containing thousands of atoms, which construct the potential energy as a sum of environment-dependent atomic energies, form the second generation. Long-range electrostatic interactions based on local atomic charges define the third generation, while fourth-generation MLPs can additionally describe non-local phenomena such as long-range charge transfer.
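The element-resolved electrostatic potential mentioned above can be sketched as follows: for each atom, the electrostatic potential it experiences is split according to the element of the source charges, giving one additional descriptor per element. This is an illustrative sketch only, assuming a simple 1/r kernel with a hard cutoff; the function name, cutoff handling, and toy NaCl geometry are assumptions, not the published implementation.

```python
import numpy as np

def element_resolved_esp(positions, charges, elements, cutoff=6.0):
    """Per-atom electrostatic potential, resolved by source element (sketch).

    Returns a dict mapping element symbol -> array of length N, where
    entry i is the potential at atom i generated by the charges on
    atoms of that element inside the cutoff sphere (self-term excluded).
    """
    charges = np.asarray(charges, dtype=float)
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(r, np.inf)               # exclude self-interaction
    kernel = np.where(r < cutoff, 1.0 / r, 0.0)
    esp = {}
    for z in sorted(set(elements)):
        mask = np.array([e == z for e in elements], dtype=float)
        esp[z] = kernel @ (charges * mask)    # potential from element z only
    return esp

# toy NaCl dimer with unit formal charges, 2.4 a.u. apart (illustrative numbers)
pos = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 2.4]])
esp = element_resolved_esp(pos, charges=[1.0, -1.0], elements=["Na", "Cl"])
```

Feeding such element-resolved potentials into the atomic neural networks gives each atom information about the surrounding charge distribution beyond what purely geometric descriptors encode.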
Our new method represents a fourth-generation high-dimensional neural network potential (4G-HDNNP), which is generally applicable to all types of systems irrespective of the kind of interactions, from covalent via ionic to metallic bonding.
Publications
- Finkler, Jonas A. & Goedecker, Stefan. Funnel hopping Monte Carlo: An efficient method to overcome broken ergodicity. The Journal of Chemical Physics, 152(16).
- Ko, Tsz Wai; Finkler, Jonas A.; Goedecker, Stefan & Behler, Jörg. A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer. Nature Communications, 12(1).
- Parsaeifard, Behnam; Sankar De, Deb; Christensen, Anders S.; Faber, Felix A.; Kocer, Emir; De, Sandip; Behler, Jörg; von Lilienfeld, O. Anatole & Goedecker, Stefan. An assessment of the structural resolution of various fingerprints commonly used in machine learning. Machine Learning: Science and Technology, 2(1), 015018.
- Parsaeifard, Behnam; Sankar De, Deb; Finkler, Jonas A. & Goedecker, Stefan. Fingerprint-Based Detection of Non-Local Effects in the Electronic Structure of a Simple Single Component Covalent System. Condensed Matter, 6(1), 9.
- Ko, Tsz Wai; Finkler, Jonas A.; Goedecker, Stefan & Behler, Jörg. General-Purpose Machine Learning Potentials Capturing Nonlocal Charge Transfer. Accounts of Chemical Research, 54(4), 808-817.
- Kocer, Emir; Ko, Tsz Wai & Behler, Jörg. Neural Network Potentials: A Concise Overview of Methods. Annual Review of Physical Chemistry, 73(1), 163-186.
- Ko, Tsz Wai; Finkler, Jonas A.; Goedecker, Stefan & Behler, Jörg. Accurate Fourth-Generation Machine Learning Potentials by Electrostatic Embedding. Journal of Chemical Theory and Computation, 19(12), 3567-3579.
