Article: Accurate Neural Network Fine-Tuning Approach for Transferable Ab Initio Energy Prediction across Varying Molecular and Crystalline Scales

Title: Accurate Neural Network Fine-Tuning Approach for Transferable Ab Initio Energy Prediction across Varying Molecular and Crystalline Scales
Authors: Ng, Wai Pan; Zhang, Zili; Yang, Jun
Issue Date: 1-Jan-2025
Publisher: American Chemical Society
Citation: Journal of Chemical Theory and Computation, 2025
Abstract: Existing machine learning models attempt to predict the energies of large molecules by training on small molecules, but eventually fail to retain high accuracy as the errors grow with system size. Through an orbital pairwise decomposition of the correlation energy, a neural network model pretrained on hundred-scale data containing small molecules is demonstrated to be sufficiently transferable to accurately predict large systems, including molecules and crystals. Our model introduces a residual connection to explicitly learn the pairwise energy corrections and employs various low-rank retraining techniques to modestly adjust the learned network parameters. We demonstrate that by retraining the base model, originally trained only on small molecules of (H2O)6, with as few as one larger molecule, the MP2 correlation energy of large liquid water (H2O)64 in a periodic supercell can be predicted to chemical accuracy. Similar performance is observed for large protonated clusters and periodic polyglycine chains. A demonstrative application is presented to predict the energy ordering of symmetrically inequivalent sublattices for distinct hydrogen orientations in the ice XV phase. Our work represents an important step forward in the quest for cost-effective, highly accurate, and transferable neural network models in quantum chemistry, bridging the electronic structure patterns between small and large systems.
Persistent Identifier: http://hdl.handle.net/10722/354566
ISSN: 1549-9618
2023 Impact Factor: 5.7
2023 SCImago Journal Rankings: 1.457
ISI Accession Number ID: WOS:001413179800001
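The abstract above describes two ingredients of the model: a residual connection, so the network only learns a pairwise correction on top of a reference orbital-pair energy, and low-rank retraining that modestly adjusts the pretrained weights. The following is a minimal, illustrative Python sketch of that general idea, not the authors' implementation: the feature dimensions, the pair_energy/correlation_energy names, and the rank-2 adapters are all assumptions made for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes: a d-dimensional feature vector per occupied orbital pair (i, j).
    d, h = 16, 32

    # Pretrained ("base") weights of a tiny two-layer network mapping
    # pair features x_ij to a pairwise correlation-energy correction.
    W1, b1 = rng.normal(scale=0.1, size=(h, d)), np.zeros(h)
    W2, b2 = rng.normal(scale=0.1, size=(1, h)), np.zeros(1)

    # Low-rank adapters (rank r << min(h, d)); during retraining only these
    # are updated, so the base parameters are adjusted modestly.
    r = 2
    A1, B1 = np.zeros((h, r)), rng.normal(scale=0.01, size=(r, d))
    A2, B2 = np.zeros((1, r)), rng.normal(scale=0.01, size=(r, h))

    def pair_energy(x_ij, e_ref_ij):
        """Predict one pair energy e_ij from features x_ij.

        The residual connection adds the network output as a correction
        to a cheap reference pair energy e_ref_ij, so the network only has
        to learn the (small) pairwise correction, not the full value.
        """
        W1_eff = W1 + A1 @ B1          # base weights plus low-rank update
        W2_eff = W2 + A2 @ B2
        hidden = np.tanh(W1_eff @ x_ij + b1)
        return e_ref_ij + (W2_eff @ hidden + b2).item()

    def correlation_energy(pair_features, ref_pair_energies):
        """Total correlation energy as a sum over occupied orbital pairs."""
        return sum(pair_energy(x, e) for x, e in zip(pair_features, ref_pair_energies))

    # Toy usage: 10 orbital pairs with placeholder reference pair energies.
    feats = [rng.normal(size=d) for _ in range(10)]
    refs = [-0.01] * 10
    print(correlation_energy(feats, refs))

Because the prediction is assembled pair by pair, the same retrained network can be applied to systems with many more orbital pairs than any molecule seen in training, which is the transferability the abstract refers to.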

 

DC Field: Value
dc.contributor.author: Ng, Wai Pan
dc.contributor.author: Zhang, Zili
dc.contributor.author: Yang, Jun
dc.date.accessioned: 2025-02-17T00:35:10Z
dc.date.available: 2025-02-17T00:35:10Z
dc.date.issued: 2025-01-01
dc.identifier.citation: Journal of Chemical Theory and Computation, 2025
dc.identifier.issn: 1549-9618
dc.identifier.uri: http://hdl.handle.net/10722/354566
dc.description.abstract: Existing machine learning models attempt to predict the energies of large molecules by training on small molecules, but eventually fail to retain high accuracy as the errors grow with system size. Through an orbital pairwise decomposition of the correlation energy, a neural network model pretrained on hundred-scale data containing small molecules is demonstrated to be sufficiently transferable to accurately predict large systems, including molecules and crystals. Our model introduces a residual connection to explicitly learn the pairwise energy corrections and employs various low-rank retraining techniques to modestly adjust the learned network parameters. We demonstrate that by retraining the base model, originally trained only on small molecules of (H2O)6, with as few as one larger molecule, the MP2 correlation energy of large liquid water (H2O)64 in a periodic supercell can be predicted to chemical accuracy. Similar performance is observed for large protonated clusters and periodic polyglycine chains. A demonstrative application is presented to predict the energy ordering of symmetrically inequivalent sublattices for distinct hydrogen orientations in the ice XV phase. Our work represents an important step forward in the quest for cost-effective, highly accurate, and transferable neural network models in quantum chemistry, bridging the electronic structure patterns between small and large systems.
dc.language: eng
dc.publisher: American Chemical Society
dc.relation.ispartof: Journal of Chemical Theory and Computation
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.title: Accurate Neural Network Fine-Tuning Approach for Transferable Ab Initio Energy Prediction across Varying Molecular and Crystalline Scales
dc.type: Article
dc.identifier.doi: 10.1021/acs.jctc.4c01261
dc.identifier.scopus: eid_2-s2.0-85217015018
dc.identifier.eissn: 1549-9626
dc.identifier.isi: WOS:001413179800001
dc.identifier.issnl: 1549-9618
