Article: GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations

Title: GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations
Authors: Fan, ZY; Wang, YZ; Ying, PH; Song, KK; Wang, JJ; Wang, Y; Zeng, ZZ; Ke, X; Lindgren, E; Rahm, JM; Gabourie, AJ; Liu, JH; Dong, HK; Wu, JY; Yue, C; Zheng, Z; Jian, S; Erhart, P; Su, YJ; Ala-Nissila, T
Issue Date: 21-Sep-2022
Publisher: American Institute of Physics
Citation: The Journal of Chemical Physics, 2022, v. 157, n. 11
Abstract: We present our latest advancements of machine-learned potentials (MLPs) based on the neuroevolution potential (NEP) framework introduced in [Fan et al., Phys. Rev. B 104, 104309 (2021)] and their implementation in the open-source package GPUMD. We increase the accuracy of NEP models both by improving the radial functions in the atomic-environment descriptor using a linear combination of Chebyshev basis functions and by extending the angular descriptor with some four-body and five-body contributions as in the atomic cluster expansion approach. We also detail our efficient implementation of the NEP approach in graphics processing units as well as our workflow for the construction of NEP models, and we demonstrate their application in large-scale atomistic simulations. By comparing to state-of-the-art MLPs, we show that the NEP approach not only achieves above-average accuracy but also is far more computationally efficient. These results demonstrate that the GPUMD package is a promising tool for solving challenging problems requiring highly accurate, large-scale atomistic simulations. To enable the construction of MLPs using a minimal training set, we propose an active-learning scheme based on the latent space of a pre-trained NEP model. Finally, we introduce three separate Python packages, GPYUMD, CALORINE, and PYNEP, which enable the integration of GPUMD into Python workflows.
Persistent Identifier: http://hdl.handle.net/10722/331091
ISSN: 0021-9606
2021 Impact Factor: 4.304
2020 SCImago Journal Rankings: 1.071
ISI Accession Number ID: WOS:000861198300001
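
The abstract mentions radial descriptor functions built from a linear combination of Chebyshev basis functions. The following is a minimal NumPy sketch of that construction, using the functional forms given in the NEP papers as I understand them (cosine cutoff fc(r) = ½[1 + cos(πr/rc)] and basis f_k(r) = ½[T_k(x) + 1]fc(r) with x = 2(r/rc − 1)² − 1); the function names, coefficient values, and distance grid are illustrative and not taken from the GPUMD code.

```python
# Minimal sketch of Chebyshev-based radial functions in the NEP style.
# In a real NEP model the expansion coefficients c[n, k] are trainable;
# here they are random placeholders.
import numpy as np

def cutoff(r, rc):
    """Smooth cosine cutoff: 0.5*[1 + cos(pi*r/rc)] for r <= rc, else 0."""
    fc = 0.5 * (1.0 + np.cos(np.pi * r / rc))
    return np.where(r <= rc, fc, 0.0)

def chebyshev_basis(r, rc, k_max):
    """Basis functions f_k(r) = 0.5*[T_k(x) + 1]*fc(r), x = 2*(r/rc - 1)^2 - 1."""
    x = 2.0 * (r / rc - 1.0) ** 2 - 1.0
    fc = cutoff(r, rc)
    return np.array([0.5 * (np.polynomial.chebyshev.Chebyshev.basis(k)(x) + 1.0) * fc
                     for k in range(k_max + 1)])

def radial_functions(r, rc, c):
    """g_n(r) = sum_k c[n, k] * f_k(r); c has shape (n_max+1, k_max+1)."""
    f = chebyshev_basis(r, rc, c.shape[1] - 1)
    return c @ f

# Example: evaluate a few radial functions on a grid of pair distances.
rng = np.random.default_rng(0)
r = np.linspace(0.0, 5.0, 11)   # pair distances (Angstrom), up to the cutoff
c = rng.normal(size=(4, 9))     # hypothetical expansion coefficients
g = radial_functions(r, rc=5.0, c=c)
print(g.shape)                  # (4, 11): four radial functions on the grid
```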

 

DC Field | Value | Language
dc.contributor.author | Fan, ZY | -
dc.contributor.author | Wang, YZ | -
dc.contributor.author | Ying, PH | -
dc.contributor.author | Song, KK | -
dc.contributor.author | Wang, JJ | -
dc.contributor.author | Wang, Y | -
dc.contributor.author | Zeng, ZZ | -
dc.contributor.author | Ke, X | -
dc.contributor.author | Lindgren, E | -
dc.contributor.author | Rahm, JM | -
dc.contributor.author | Gabourie, AJ | -
dc.contributor.author | Liu, JH | -
dc.contributor.author | Dong, HK | -
dc.contributor.author | Wu, JY | -
dc.contributor.author | Yue, C | -
dc.contributor.author | Zheng, Z | -
dc.contributor.author | Jian, S | -
dc.contributor.author | Erhart, P | -
dc.contributor.author | Su, YJ | -
dc.contributor.author | Ala-Nissila, T | -
dc.date.accessioned | 2023-09-21T06:52:40Z | -
dc.date.available | 2023-09-21T06:52:40Z | -
dc.date.issued | 2022-09-21 | -
dc.identifier.citation | The Journal of Chemical Physics, 2022, v. 157, n. 11 | -
dc.identifier.issn | 0021-9606 | -
dc.identifier.uri | http://hdl.handle.net/10722/331091 | -
dc.description.abstract | We present our latest advancements of machine-learned potentials (MLPs) based on the neuroevolution potential (NEP) framework introduced in [Fan et al., Phys. Rev. B 104, 104309 (2021)] and their implementation in the open-source package GPUMD. We increase the accuracy of NEP models both by improving the radial functions in the atomic-environment descriptor using a linear combination of Chebyshev basis functions and by extending the angular descriptor with some four-body and five-body contributions as in the atomic cluster expansion approach. We also detail our efficient implementation of the NEP approach in graphics processing units as well as our workflow for the construction of NEP models, and we demonstrate their application in large-scale atomistic simulations. By comparing to state-of-the-art MLPs, we show that the NEP approach not only achieves above-average accuracy but also is far more computationally efficient. These results demonstrate that the GPUMD package is a promising tool for solving challenging problems requiring highly accurate, large-scale atomistic simulations. To enable the construction of MLPs using a minimal training set, we propose an active-learning scheme based on the latent space of a pre-trained NEP model. Finally, we introduce three separate Python packages, GPYUMD, CALORINE, and PYNEP, which enable the integration of GPUMD into Python workflows. | -
dc.language | eng | -
dc.publisher | American Institute of Physics | -
dc.relation.ispartof | The Journal of Chemical Physics | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | -
dc.title | GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations | -
dc.type | Article | -
dc.identifier.doi | 10.1063/5.0106617 | -
dc.identifier.pmid | 36137808 | -
dc.identifier.scopus | eid_2-s2.0-85138439619 | -
dc.identifier.volume | 157 | -
dc.identifier.issue | 11 | -
dc.identifier.eissn | 1089-7690 | -
dc.identifier.isi | WOS:000861198300001 | -
dc.publisher.place | MELVILLE | -
dc.identifier.issnl | 0021-9606 | -
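
The abstract also proposes an active-learning scheme based on the latent space of a pre-trained NEP model. The paper's exact selection criterion is not reproduced here; the sketch below only illustrates the general idea of such schemes, greedy farthest-point selection on latent-space descriptors. All function names and the random stand-in data are my own illustration, not part of GPUMD or its Python packages.

```python
# Minimal sketch of latent-space active learning: given latent descriptors of
# candidate structures from a pre-trained model, greedily pick the candidates
# that are farthest (max-min distance) from the current training set.
import numpy as np

def select_by_latent_distance(train_latent, candidate_latent, n_select):
    """Greedy farthest-point selection of candidates in latent space.

    train_latent:     (N_train, D) latent descriptors of the current training set
    candidate_latent: (N_cand, D) latent descriptors of candidate structures
    n_select:         number of candidates to add to the training set
    """
    selected = []
    # Distance of each candidate to its nearest already-covered point.
    dmin = np.min(np.linalg.norm(
        candidate_latent[:, None, :] - train_latent[None, :, :], axis=-1), axis=1)
    for _ in range(n_select):
        i = int(np.argmax(dmin))          # most "novel" candidate so far
        selected.append(i)
        # Update nearest-covered distances with the newly selected point.
        d_new = np.linalg.norm(candidate_latent - candidate_latent[i], axis=-1)
        dmin = np.minimum(dmin, d_new)
    return selected

# Hypothetical usage with random stand-ins for latent descriptors.
rng = np.random.default_rng(1)
train = rng.normal(size=(100, 30))
cands = rng.normal(size=(500, 30))
print(select_by_latent_distance(train, cands, n_select=5))
```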
