Interatomic potentials can be written as a series expansion of functional terms that depend on the position of one, two, three, etc. atoms at a time. Then the total potential of the system can be written as[3]

$$V_\mathrm{TOT} = \sum_{i=1}^{N} V_1(\vec{r}_i) + \sum_{i,j=1}^{N} V_2(\vec{r}_i, \vec{r}_j) + \sum_{i,j,k=1}^{N} V_3(\vec{r}_i, \vec{r}_j, \vec{r}_k) + \cdots$$

Here $V_1$ is the one-body term, $V_2$ the two-body term, $V_3$ the three-body term, $N$ the number of atoms in the system, $\vec{r}_i$ the position of atom $i$, etc., and $i$, $j$ and $k$ are indices that loop over atom positions.
Note that in case the pair potential is given per atom pair, in the two-body term the potential should be multiplied by 1/2 as otherwise each bond is counted twice, and similarly the three-body term by 1/6.[3] Alternatively, the summation of the pair term can be restricted to cases $i < j$, and similarly for the three-body term to $i < j < k$, if the potential form is such that it is symmetric with respect to exchange of the $i$ and $j$ indices (this may not be the case for potentials for multielemental systems).
The one-body term is only meaningful if the atoms are in an external field (e.g. an electric field). In the absence of external fields, the potential should not depend on the absolute position of atoms, but only on the relative positions. This means that the functional form can be rewritten as a function of interatomic distances $r_{ij} = |\vec{r}_j - \vec{r}_i|$ and angles between the bonds (vectors to neighbours) $\theta_{ijk}$. Then, in the absence of external forces, the general form becomes

$$V_\mathrm{TOT} = \sum_{i,j=1}^{N} V_2(r_{ij}) + \sum_{i,j,k=1}^{N} V_3(r_{ij}, r_{ik}, \theta_{ijk}) + \cdots$$
In the three-body term $V_3$ the interatomic distance $r_{jk}$ is not needed since the three terms $r_{ij}$, $r_{ik}$, $\theta_{ijk}$ are sufficient to give the relative positions of three atoms $i$, $j$, $k$ in three-dimensional space. Any terms of order higher than 2 are also called many-body potentials. In some interatomic potentials the many-body interactions are embedded into the terms of a pair potential (see discussion on EAM-like and bond order potentials below).
In principle the sums in the expressions run over all atoms. However, if the range of the interatomic potential is finite, i.e. the potentials vanish above some cutoff distance $r_\mathrm{cut}$, the summing can be restricted to atoms within the cutoff distance of each other. By also using a cellular method for finding the neighbours,[1] the MD algorithm can be an O(N) algorithm. Potentials with an infinite range can be summed up efficiently by Ewald summation and its further developments.
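The cutoff restriction combined with a cellular neighbour search can be sketched as follows. This is a minimal Python illustration under assumptions of our own (an orthorhombic, non-periodic box; the function name is hypothetical); production MD codes also handle periodic images and incremental list updates.

```python
import numpy as np

def cell_list_neighbors(positions, box, r_cut):
    """Find all atom pairs closer than r_cut using a cell-linked list.

    Illustrative sketch: orthorhombic box, no periodic images.
    Binning atoms into cells of side >= r_cut means only the 27
    surrounding cells of each atom need to be searched, giving
    O(N) scaling for roughly uniform atom densities.
    """
    n_cells = np.maximum((box // r_cut).astype(int), 1)
    cell_size = box / n_cells

    # Assign each atom to a cell
    cells = {}
    for idx, pos in enumerate(positions):
        key = tuple((pos // cell_size).astype(int) % n_cells)
        cells.setdefault(key, []).append(idx)

    # Check only atoms in the same or adjacent cells
    pairs = []
    for key, members in cells.items():
        for d in np.ndindex(3, 3, 3):
            nkey = tuple((np.array(key) + np.array(d) - 1) % n_cells)
            for i in members:
                for j in cells.get(nkey, []):
                    if i < j and np.linalg.norm(positions[i] - positions[j]) < r_cut:
                        pairs.append((i, j))
    return sorted(set(pairs))
```

For a brute-force O(N²) double loop this binning step is unnecessary, but for large N the cell search dominates the cost difference.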
The forces acting between atoms can be obtained by differentiation of the total energy with respect to atom positions. That is, to get the force on atom $i$ one should take the three-dimensional derivative (gradient) of the potential with respect to the position of atom $i$:

$$\vec{F}_i = -\nabla_{\vec{r}_i} V_\mathrm{TOT}$$
For two-body potentials this gradient reduces, thanks to the symmetry with respect to $ij$ in the potential form, to straightforward differentiation with respect to the interatomic distances $r_{ij}$. However, for many-body potentials (three-body, four-body, etc.) the differentiation becomes considerably more complex[12][13] since the potential may no longer be symmetric with respect to $ij$ exchange. In other words, the energy of atoms $k$ that are not direct neighbours of $i$ can also depend on the position $\vec{r}_i$ because of angular and other many-body terms, and hence contribute to the gradient $\nabla_{\vec{r}_i}$.
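When implementing a potential, the force-gradient relation gives a convenient consistency check: analytical forces can be compared against a central-difference gradient of the total energy. The sketch below uses illustrative names and a toy harmonic pair energy of our own, not anything from the source.

```python
import numpy as np

def numerical_forces(potential, positions, h=1e-6):
    """Force on each atom as minus the central-difference gradient of
    the total energy. `potential` is assumed to take an (N, 3) array
    of positions and return a scalar energy."""
    forces = np.zeros_like(positions)
    for i in range(positions.shape[0]):
        for k in range(3):
            plus = positions.copy();  plus[i, k] += h
            minus = positions.copy(); minus[i, k] -= h
            forces[i, k] = -(potential(plus) - potential(minus)) / (2 * h)
    return forces

# Toy example: harmonic pair energy V = 0.5 * (r - 1)^2 between two atoms
def harmonic(pos):
    r = np.linalg.norm(pos[1] - pos[0])
    return 0.5 * (r - 1.0) ** 2
```

For a dimer stretched beyond its rest length the two numerical forces point toward each other and sum to zero, as Newton's third law requires.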
Interatomic potentials come in many different varieties, with different physical motivations. Even for single well-known elements such as silicon, a wide variety of potentials quite different in functional form and motivation have been developed.[14] The true interatomic interactions are quantum mechanical in nature, and there is no known way in which the true interactions described by the Schrödinger equation or Dirac equation for all electrons and nuclei could be cast into an analytical functional form. Hence all analytical interatomic potentials are by necessity approximations.
Over time interatomic potentials have largely grown more complex and more accurate, although this trend has not been strictly monotonic.[15] This has included both improved descriptions of physics and added parameters. Until recently, all interatomic potentials could be described as "parametric", having been developed and optimized with a fixed number of (physical) terms and parameters. New research focuses instead on non-parametric potentials which can be systematically improved by using complex local atomic neighbor descriptors and separate mappings to predict system properties, such that the total number of terms and parameters is flexible.[16] These non-parametric models can be significantly more accurate, but since they are not tied to physical forms and parameters, there are many potential issues surrounding extrapolation and uncertainties.
A classic example of a pair potential is the Lennard-Jones potential

$$V_\mathrm{LJ}(r) = 4\varepsilon \left[ \left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6} \right]$$

where $\varepsilon$ is the depth of the potential well and $\sigma$ is the distance at which the potential crosses zero. The attractive term proportional to $1/r^6$ in the potential comes from the scaling of van der Waals forces, while the repulsive $1/r^{12}$ term is much more approximate (conveniently the square of the attractive term).[6] On its own, this potential is quantitatively accurate only for noble gases and has been extensively studied in the past decades,[19] but is also widely used for qualitative studies and in systems where dipole interactions are significant, particularly in chemistry force fields to describe intermolecular interactions, especially in fluids.[20]
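A minimal sketch of the Lennard-Jones pair energy in reduced units (the function name is our own):

```python
def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential,
    V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6),
    in reduced units. Crosses zero at r = sigma and has its minimum
    of depth -epsilon at r = 2**(1/6) * sigma."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)
```

The two stated properties (zero crossing at σ, well depth ε at 2^(1/6)σ) follow directly from the functional form and make the parameters directly interpretable.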
Another simple and widely used pair potential is the Morse potential, which consists simply of a sum of two exponentials:

$$V_\mathrm{M}(r) = D_e \left( e^{-2a(r-r_e)} - 2e^{-a(r-r_e)} \right)$$
Here $D_e$ is the equilibrium bond energy, $r_e$ the bond distance, and $a$ a parameter controlling the width of the potential well. The Morse potential has been applied to studies of molecular vibrations and solids,[21] and also inspired the functional form of more accurate potentials such as the bond-order potentials.
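A sketch of the Morse pair energy in illustrative units (the function name and default parameters are our own):

```python
import numpy as np

def morse(r, D_e=1.0, a=1.0, r_e=1.0):
    """Morse potential as a sum of two exponentials:
    V(r) = D_e * (exp(-2a(r - r_e)) - 2*exp(-a(r - r_e))).
    Minimum of depth -D_e at the bond distance r_e; decays to zero
    at large separation."""
    x = np.exp(-a * (r - r_e))
    return D_e * (x * x - 2.0 * x)
```

Unlike Lennard-Jones, the repulsive wall here is exponential, which is often a better match to the short-range behaviour of real bonds.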
Ionic materials are often described by a sum of a short-range repulsive term, such as the Buckingham pair potential, and a long-range Coulomb potential giving the ionic interactions between the ions forming the material. The short-range term for ionic materials can also be of many-body character.[22]
Pair potentials have some inherent limitations, such as the inability to describe all three elastic constants of cubic metals or to correctly describe both the cohesive energy and the vacancy formation energy.[7] Therefore, quantitative molecular dynamics simulations are carried out with various many-body potentials.
For very short interatomic separations, important in radiation material science, the interactions can be described quite accurately with screened Coulomb potentials, which have the general form

$$V(r_{ij}) = \frac{1}{4\pi\varepsilon_0} \frac{Z_1 Z_2 e^2}{r_{ij}} \, \varphi(r_{ij}/a)$$
Here $\varphi(r) \to 1$ when $r \to 0$. $Z_1$ and $Z_2$ are the charges of the interacting nuclei, and $a$ is the so-called screening parameter. A widely used popular screening function is the "Universal ZBL" one,[23] and more accurate ones can be obtained from all-electron quantum chemistry calculations.[24][25] In a comparative study of several quantum chemistry methods, it was shown that pair-specific "NLH" repulsive potentials with a simple three-exponential screening function are accurate to within ~2% above 30 eV, while the universal ZBL potential differs by ~5%–10% from the quantum chemical calculations above 100 eV.[25] In binary collision approximation simulations this kind of potential can be used to describe the nuclear stopping power.
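A sketch of the screened Coulomb pair energy using the commonly tabulated universal ZBL screening coefficients (assuming r in Å and energy in eV; the function name is ours, and the constant 14.3996 eV·Å stands for e²/4πε₀):

```python
import numpy as np

# Commonly tabulated coefficients of the universal ZBL screening function
ZBL_C = [0.18175, 0.50986, 0.28022, 0.02817]
ZBL_D = [3.19980, 0.94229, 0.40290, 0.20162]

def zbl_potential(r, Z1, Z2):
    """Screened Coulomb pair potential with the universal ZBL screening
    function; r in Å, result in eV. phi(x) -> 1 as x -> 0, so the
    potential approaches the bare Coulomb repulsion at small r."""
    a_u = 0.8854 * 0.529177 / (Z1 ** 0.23 + Z2 ** 0.23)  # screening length, Å
    x = r / a_u
    phi = sum(c * np.exp(-d * x) for c, d in zip(ZBL_C, ZBL_D))
    return 14.3996 * Z1 * Z2 / r * phi  # 14.3996 eV*Å = e^2/(4*pi*eps0)
```

The four coefficients sum to 1, which enforces the stated limit φ(0) = 1.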
The Stillinger-Weber potential[26] is a potential that has two-body and three-body terms of the standard form

$$V_\mathrm{TOT} = \sum_{i,j=1}^{N} V_2(r_{ij}) + \sum_{i,j,k=1}^{N} V_3(r_{ij}, r_{ik}, \theta_{ijk})$$
where the three-body term describes how the potential energy changes with bond bending. It was originally developed for pure Si, but has been extended to many other elements and compounds[27][28] and also formed the basis for other Si potentials.[29][30]
Metals are very commonly described with what can be called "EAM-like" potentials, i.e. potentials that share the same functional form as the embedded atom model. In these potentials, the total potential energy is written

$$V_\mathrm{TOT} = \sum_i F_i\!\left( \sum_j \rho(r_{ij}) \right) + \frac{1}{2} \sum_{i,j} V_2(r_{ij})$$
where $F_i$ is a so-called embedding function (not to be confused with the force) that is a function of the sum of the so-called electron density $\rho(r_{ij})$, and $V_2$ is a pair potential that usually is purely repulsive. In the original formulation[31][32] the electron density function was obtained from true atomic electron densities, and the embedding function was motivated from density-functional theory as the energy needed to 'embed' an atom into the electron density.[33] However, many other potentials used for metals share the same functional form but motivate the terms differently, e.g. based on tight-binding theory[34][35][36] or other motivations.[37][38][39]
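The EAM functional form can be sketched generically as follows (names and the toy input functions are our own; real EAM potentials use fitted, usually tabulated, pair, density, and embedding functions):

```python
import numpy as np

def eam_energy(positions, pair, density, embed):
    """Total energy in the EAM form:
    sum_i embed(sum_{j != i} density(r_ij)) + (1/2) sum_{i != j} pair(r_ij).
    Generic sketch: `pair`, `density` and `embed` are user-supplied
    callables taking a scalar and returning a scalar."""
    n = len(positions)
    e_pair = 0.0
    rho = np.zeros(n)  # summed electron density at each atom
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            e_pair += pair(r)        # each pair counted once = 1/2 double sum
            rho[i] += density(r)
            rho[j] += density(r)
    return e_pair + sum(embed(x) for x in rho)
```

A nonlinear embedding function such as -sqrt(rho) is what makes the model many-body: the energy of a bond then depends on how many other neighbours each atom has.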
EAM-like potentials are usually implemented as numerical tables. A collection of tables is available at the interatomic potential repository at NIST.[1]
Covalently bonded materials are often described by bond order potentials, sometimes also called Tersoff-like or Brenner-like potentials.[10][40][41]
These have in general a form that resembles a pair potential:

$$V_{ij}(r_{ij}) = V_\mathrm{repulsive}(r_{ij}) + b_{ijk} V_\mathrm{attractive}(r_{ij})$$

where the repulsive and attractive parts are simple exponential functions similar to those in the Morse potential. However, the strength of the attractive part is modified by the environment of the atom via the $b_{ijk}$ term. If implemented without an explicit angular dependence, these potentials can be shown to be mathematically equivalent to some varieties of EAM-like potentials.[42][43] Thanks to this equivalence, the bond-order potential formalism has been implemented also for many metal-covalent mixed materials.[43][44][45][46]
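The environment-dependent modulation of the attractive term can be illustrated with a deliberately simplified toy model. This is a sketch only, with an invented coordination-based bond-order factor and unit-strength exponentials; it is not the full Tersoff/Brenner form, which also includes angular terms and smooth cutoffs.

```python
import numpy as np

def bond_order_energy(positions, r_cut=1.5):
    """Toy Tersoff-like energy: Morse-style repulsive and attractive
    exponentials, with the attraction scaled by an invented bond-order
    factor b_ij = 1/sqrt(1 + z_ij), where z_ij counts the other
    neighbours of atom i. More neighbours -> weaker individual bonds."""
    n = len(positions)
    dist = lambda i, j: np.linalg.norm(positions[i] - positions[j])
    energy = 0.0
    for i in range(n):
        for j in range(n):
            if i == j or dist(i, j) >= r_cut:
                continue
            z_ij = sum(1 for k in range(n)
                       if k not in (i, j) and dist(i, k) < r_cut)
            b_ij = 1.0 / np.sqrt(1.0 + z_ij)
            # half of a Morse-like bond energy per ordered pair (i, j)
            energy += 0.5 * (np.exp(-2.0 * dist(i, j)) -
                             b_ij * 2.0 * np.exp(-dist(i, j)))
    return energy
```

For an isolated dimer z = 0 and b = 1, so the expression reduces to a plain Morse-like pair energy; in a denser environment each bond is weakened.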
EAM potentials have also been extended to describe covalent bonding by adding angular-dependent terms to the electron density function, in what is called the modified embedded atom method (MEAM).[47][48][49]
A force field is the collection of parameters to describe the physical interactions between atoms or physical units (up to ~10^8) using a given energy expression. The term force field characterizes the collection of parameters for a given interatomic potential (energy function) and is often used within the computational chemistry community.[50] The force field parameters make the difference between good and poor models. Force fields are used for the simulation of metals, ceramics, molecules, chemistry, and biological systems, covering the entire periodic table and multiphase materials. Today's performance is among the best for solid-state materials,[51][52] molecular fluids,[20] and for biomacromolecules,[53] whereby biomacromolecules were the primary focus of force fields from the 1970s to the early 2000s. Force fields range from relatively simple and interpretable fixed-bond models (e.g. the Interface force field,[50] CHARMM,[54] and COMPASS) to explicitly reactive models with many adjustable fit parameters (e.g. ReaxFF) and machine learning models.
It should first be noted that non-parametric potentials are often referred to as "machine learning" potentials. While the descriptor/mapping forms of non-parametric models are closely related to machine learning in general, and their complex nature makes machine-learning fitting optimizations almost necessary, the distinction is important in that parametric models can also be optimized using machine learning.
Current research in interatomic potentials involves using systematically improvable, non-parametric mathematical forms and increasingly complex machine learning methods. The total energy is then written

$$V_\mathrm{TOT} = \sum_{i=1}^{N} E(\mathbf{q}_i)$$

where $\mathbf{q}_i$ is a mathematical representation of the atomic environment surrounding atom $i$, known as the descriptor.[55] $E$ is a machine-learning model that provides a prediction for the energy of atom $i$ based on the descriptor output. An accurate machine-learning potential requires both a robust descriptor and a suitable machine learning framework. The simplest descriptor is the set of interatomic distances from atom $i$ to its neighbours, yielding a machine-learned pair potential. However, more complex many-body descriptors are needed to produce highly accurate potentials.[55] It is also possible to use a linear combination of multiple descriptors with associated machine-learning models.[56] Potentials have been constructed using a variety of machine-learning methods, descriptors, and mappings, including neural networks,[57] Gaussian process regression,[58][59] and linear regression.[60][16]
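A minimal sketch of this descriptor-plus-mapping pipeline, assuming the simplest distance-based descriptor and a linear-regression mapping (all names, the histogram binning, and the training setup are our own illustrative choices):

```python
import numpy as np

def descriptor(positions, i, r_cut=3.0, n_bins=8):
    """Simplest descriptor: a histogram of the distances from atom i to
    its neighbours inside r_cut. Using only distances yields a
    machine-learned pair potential."""
    r = np.linalg.norm(np.delete(positions, i, axis=0) - positions[i], axis=1)
    hist, _ = np.histogram(r[r < r_cut], bins=n_bins, range=(0.0, r_cut))
    return hist.astype(float)

def fit_linear_potential(structures, energies, **kw):
    """Least-squares fit of weights w so that the summed per-atom
    descriptors of each training structure reproduce its total energy,
    i.e. E_total = sum_i w . q_i (linear-regression mapping)."""
    X = np.array([sum(descriptor(s, i, **kw) for i in range(len(s)))
                  for s in structures])
    w, *_ = np.linalg.lstsq(X, np.array(energies), rcond=None)
    return w
```

In real non-parametric potentials the training targets are quantum-mechanical energies and forces, the descriptors are many-body, and the mapping is typically a neural network or Gaussian process rather than plain linear regression.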
A non-parametric potential is most often trained to total energies, forces, and/or stresses obtained from quantum-level calculations, such asdensity functional theory, as with most modern potentials. However, the accuracy of a machine-learning potential can be converged to be comparable with the underlying quantum calculations, unlike analytical models. Hence, they are in general more accurate than traditional analytical potentials, but they are correspondingly less able to extrapolate. Further, owing to the complexity of the machine-learning model and the descriptors, they are computationally far more expensive than their analytical counterparts.
Non-parametric, machine learned potentials may also be combined with parametric, analytical potentials, for example to include known physics such as the screened Coulomb repulsion,[61] or to impose physical constraints on the predictions.[62]
Since the interatomic potentials are approximations, they by necessity all involve parameters that need to be adjusted to some reference values. In simple potentials such as the Lennard-Jones and Morse ones, the parameters are interpretable and can be set to match e.g. the equilibrium bond length and bond strength of a dimer molecule or the surface energy of a solid.[63][64] The Lennard-Jones potential can typically describe the lattice parameters, surface energies, and approximate mechanical properties.[65] Many-body potentials often contain tens or even hundreds of adjustable parameters with limited interpretability and no compatibility with common interatomic potentials for bonded molecules. Such parameter sets can be fit to a larger set of experimental data, or to materials properties derived from less reliable data such as density-functional theory.[66][67] For solids, a many-body potential can often describe well the lattice constant of the equilibrium crystal structure, the cohesive energy, and linear elastic constants, as well as basic point defect properties of all the elements and stable compounds, although deviations in surface energies often exceed 50%.[30][43][45][46][65][50][68][69][70] Non-parametric potentials in turn contain hundreds or even thousands of independent parameters to fit. For any but the simplest model forms, sophisticated optimization and machine learning methods are necessary for useful potentials.
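For the interpretable simple-potential case, the parameter setting can be done in closed form. The sketch below (hypothetical function names) matches Lennard-Jones ε and σ to a given dimer bond length and well depth, using the fact that the LJ minimum of depth -ε lies at r = 2^(1/6)σ:

```python
def lj(r, eps, sig):
    """Lennard-Jones pair energy, V = 4*eps*((sig/r)**12 - (sig/r)**6)."""
    sr6 = (sig / r) ** 6
    return 4.0 * eps * (sr6 ** 2 - sr6)

def fit_lj_to_dimer(r_min, well_depth):
    """Set LJ parameters from dimer reference data in closed form:
    the minimum of the LJ potential sits at r = 2**(1/6)*sigma with
    depth -epsilon, so matching the dimer bond length r_min and bond
    (well) depth well_depth > 0 fixes both parameters directly."""
    epsilon = well_depth
    sigma = r_min / 2 ** (1.0 / 6.0)
    return epsilon, sigma
```

Many-body and non-parametric potentials do not admit such closed-form relations, which is why their fitting requires numerical optimization against large reference datasets.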
The aim of most potential functions and fitting is to make the potential transferable, i.e. able to describe materials properties that are clearly different from those it was fitted to (for examples of potentials explicitly aiming for this, see e.g.[71][72][73][74][75]). Key aspects here are the correct representation of chemical bonding, validation of structures and energies, as well as interpretability of all parameters.[51] Full transferability and interpretability is reached with the Interface force field (IFF).[50] As an example of partial transferability, a review of interatomic potentials of Si notes that the Stillinger-Weber and Tersoff III potentials for Si can describe several (but not all) materials properties they were not fitted to.[14]
The NIST interatomic potential repository provides a collection of fitted interatomic potentials, either as fitted parameter values or numericaltables of the potential functions.[76] The OpenKIM[77] project also provides a repository of fitted potentials, along with collections of validation tests and a software framework for promoting reproducibility in molecular simulations using interatomic potentials.
Since the 1990s, machine learning programs have been employed to construct interatomic potentials, mapping atomic structures to their potential energies. These are generally referred to as 'machine learning potentials' (MLPs)[78] or as 'machine-learned interatomic potentials' (MLIPs).[79] Such machine learning potentials help fill the gap between highly accurate but computationally intensive simulations like density functional theory and computationally lighter, but much less precise, empirical potentials. Early neural networks showed promise, but their inability to systematically account for interatomic energy interactions limited their applications to smaller, low-dimensional systems, keeping them largely within the confines of academia. However, with continuous advancements in artificial intelligence technology, machine learning methods have become significantly more accurate, increasing the use of machine learning in the field.[80][81][79]
Modern neural networks have revolutionized the construction of highly accurate and computationally light potentials by integrating theoretical understanding of materials science into their architectures and preprocessing. Almost all are local, accounting for all interactions between an atom and its neighbors within some cutoff radius. These neural networks usually take atomic coordinates as input and output potential energies. Atomic coordinates are sometimes transformed with atom-centered symmetry functions or pair symmetry functions before being fed into the networks. Encoding symmetry has been pivotal in enhancing machine learning potentials by drastically constraining the networks' search space.[80][82]
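An atom-centered radial symmetry function of the Behler-Parrinello type can be sketched as follows (function names and default parameter values are our own illustrative choices). Its output is invariant to translation, rotation, and permutation of neighbours, which is exactly the symmetry encoding described above:

```python
import numpy as np

def cutoff_fn(r, r_c):
    """Smooth cutoff f_c(r) = 0.5*(cos(pi*r/r_c) + 1) for r < r_c, else 0,
    so contributions fade continuously to zero at the cutoff radius."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def radial_symmetry_fn(positions, i, eta=1.0, r_s=0.0, r_c=6.0):
    """Behler-Parrinello-style radial symmetry function for atom i:
    G_i = sum_j exp(-eta*(r_ij - r_s)**2) * f_c(r_ij).
    Depends only on interatomic distances, hence invariant to
    translation, rotation and neighbour permutation."""
    r = np.linalg.norm(np.delete(positions, i, axis=0) - positions[i], axis=1)
    return float(np.sum(np.exp(-eta * (r - r_s) ** 2) * cutoff_fn(r, r_c)))
```

In practice a whole set of such functions with different (eta, r_s) values, plus angular variants, forms the input vector fed to the network.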
Conversely, message-passing neural networks (MPNNs), a form of graph neural networks, learn their own descriptors and symmetry encodings. They treat molecules as three-dimensional graphs and iteratively update each atom's feature vectors as information about neighboring atoms is processed through message functions and convolutions. These feature vectors are then used to directly predict the final potentials. In 2017, the first-ever MPNN model, a deep tensor neural network, was used to calculate the properties of small organic molecules.[83][80][84]
Another class of machine-learned interatomic potential is the Gaussian approximation potential (GAP),[85][86][87] which combines compact descriptors of local atomic environments[88] with Gaussian process regression[89] to machine-learn the potential energy surface of a given system. To date, the GAP framework has been used to successfully develop a number of MLIPs for various systems, including for elemental systems such as carbon,[90][91] silicon,[92] and tungsten,[93] as well as for multicomponent systems such as Ge2Sb2Te5[94] and austenitic stainless steel, Fe7Cr2Ni.[95]
Classical interatomic potentials often exceed the accuracy of simplified quantum mechanical methods such as density functional theory at a million times lower computational cost.[51] The use of interatomic potentials is recommended for the simulation of nanomaterials, biomacromolecules, and electrolytes from atoms up to millions of atoms at the 100 nm scale and beyond. As a limitation, electron densities and quantum processes at the local scale of hundreds of atoms are not included. When of interest, higher-level quantum chemistry methods can be locally used.[96]
The robustness of a model at different conditions other than those used in the fitting process is often measured in terms of transferability of the potential.
J. F. Ziegler, J. P. Biersack, and U. Littmark. The Stopping and Range of Ions in Matter. Pergamon, New York, 1985.
Nordlund, K.; Runeberg, N.; Sundholm, D. (1997). "Repulsive interatomic potentials calculated using Hartree-Fock and density-functional theory methods". Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms. 132 (1): 45–54. Bibcode:1997NIMPB.132...45N. doi:10.1016/s0168-583x(97)00447-3. ISSN 0168-583X.
Ichimura, M. (16 February 1996). "Stillinger-Weber potentials for III–V compound semiconductors and their application to the critical thickness calculation for InAs/GaAs". Physica Status Solidi A. 153 (2): 431–437. Bibcode:1996PSSAR.153..431I. doi:10.1002/pssa.2211530217. ISSN 0031-8965.
Juslin, N.; Erhart, P.; Träskelin, P.; Nord, J.; Henriksson, K. O. E.; Nordlund, K.; Salonen, E.; Albe, K. (15 December 2005). "Analytical interatomic potential for modeling nonequilibrium processes in the W–C–H system". Journal of Applied Physics. 98 (12): 123520. Bibcode:2005JAP....98l3520J. doi:10.1063/1.2149492. ISSN 0021-8979. S2CID 8090449.
Heinz, H.; Lin, T. J.; Mishra, R. K.; Emami, F. S. (February 2013). "Thermodynamically consistent force fields for the assembly of inorganic, organic, and biological nanostructures: the INTERFACE force field". Langmuir. 29 (6): 1754–65. doi:10.1021/la3038846. PMID 23276161.
Heinz, H.; Ramezani-Dakhel, H. (January 2016). "Simulations of inorganic-bioorganic interfaces to discover new materials: insights, comparisons to experiment, challenges, and opportunities". Chemical Society Reviews. 45 (2): 412–48. doi:10.1039/c5cs00890e. PMID 26750724.
Heinz, Hendrik; Vaia, R. A.; Farmer, B. L.; Naik, R. R. (2008-10-09). "Accurate Simulation of Surfaces and Interfaces of Face-Centered Cubic Metals Using 12−6 and 9−6 Lennard-Jones Potentials". The Journal of Physical Chemistry C. 112 (44): 17281–17290. doi:10.1021/jp801931d. ISSN 1932-7447.
Liu, Juan; Tennessen, Emrys; Miao, Jianwei; Huang, Yu; Rondinelli, James M.; Heinz, Hendrik (2018-05-31). "Understanding Chemical Bonding in Alloys and the Representation in Atomistic Simulations". The Journal of Physical Chemistry C. 122 (26): 14996–15009. doi:10.1021/acs.jpcc.8b01891. ISSN 1932-7447. S2CID 51855788.
Mishra, Ratan K.; Flatt, Robert J.; Heinz, Hendrik (2013-04-19). "Force Field for Tricalcium Silicate and Insight into Nanoscale Properties: Cleavage, Initial Hydration, and Adsorption of Organic Molecules". The Journal of Physical Chemistry C. 117 (20): 10417–10432. doi:10.1021/jp312815g. ISSN 1932-7447.
Ramezani-Dakhel, Hadi; Ruan, Lingyan; Huang, Yu; Heinz, Hendrik (2015-01-21). "Molecular Mechanism of Specific Recognition of Cubic Pt Nanocrystals by Peptides and of the Concentration-Dependent Formation from Seed Crystals". Advanced Functional Materials. 25 (9): 1374–1384. Bibcode:2015AdvFM..25.1374R. doi:10.1002/adfm.201404136. ISSN 1616-301X. S2CID 94001655.
Blank, T. B.; Brown, S. D.; Calhoun, A. W.; Doren, D. J. (1995). "Neural network models of potential energy surfaces". The Journal of Chemical Physics. 103 (10): 4129–37. Bibcode:1995JChPh.103.4129B. doi:10.1063/1.469597.
Rasmussen, Carl Edward; Williams, Christopher K. I. (2008). Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning (3rd print ed.). Cambridge, Mass.: MIT Press. ISBN 978-0-262-18253-9.