{"id":13698104,"url":"https://github.com/atomistic-machine-learning/schnetpack","last_synced_at":"2025-10-21T19:43:31.917Z","repository":{"id":37849580,"uuid":"147224878","full_name":"atomistic-machine-learning/schnetpack","owner":"atomistic-machine-learning","description":"SchNetPack - Deep Neural Networks for Atomistic Systems","archived":false,"fork":false,"pushed_at":"2025-04-23T13:29:55.000Z","size":44173,"stargazers_count":840,"open_issues_count":6,"forks_count":226,"subscribers_count":30,"default_branch":"master","last_synced_at":"2025-04-23T14:32:11.768Z","etag":null,"topics":["condensed-matter","machine-learning","molecular-dynamics","neural-network","quantum-chemistry"],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/atomistic-machine-learning.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2018-09-03T15:44:35.000Z","updated_at":"2025-04-18T06:26:09.000Z","dependencies_parsed_at":"2023-10-24T11:36:06.374Z","dependency_job_id":"bd9eb849-3363-4c28-8d27-5eb14d7e0964","html_url":"https://github.com/atomistic-machine-learning/schnetpack","commit_stats":{"total_commits":1163,"total_committers":34,"mean_commits":"34.205882352941174","dds":0.6723989681857265,"last_synced_commit":"7c8bd0376bb311c3f83665f45217a4e0ea15b302"},"previous_names":[],"tags_count":14,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/atomistic-machine-learning%2Fschnetpack","tags_url":"https://repos.ecosyste.ms/a
pi/v1/hosts/GitHub/repositories/atomistic-machine-learning%2Fschnetpack/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/atomistic-machine-learning%2Fschnetpack/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/atomistic-machine-learning%2Fschnetpack/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/atomistic-machine-learning","download_url":"https://codeload.github.com/atomistic-machine-learning/schnetpack/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":252272944,"owners_count":21721831,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["condensed-matter","machine-learning","molecular-dynamics","neural-network","quantum-chemistry"],"created_at":"2024-08-02T19:00:40.086Z","updated_at":"2025-10-21T19:43:26.863Z","avatar_url":"https://github.com/atomistic-machine-learning.png","language":"Python","readme":"# SchNetPack - Deep Neural Networks for Atomistic Systems\n[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/python/black)\n[![](https://shields.io/badge/-Lightning--Hydra--Template-017F2F?style=flat\u0026logo=github\u0026labelColor=303030)](https://github.com/hobogalaxy/lightning-hydra-template)\n\n\nSchNetPack is a toolbox for the development and application of deep neural networks to the prediction of potential energy surfaces and other quantum-chemical properties of molecules and materials. 
It contains basic building blocks of atomistic neural networks, manages their training, and provides simple access to common benchmark datasets. This allows for easy implementation and evaluation of new models.\n\nThe documentation can be found [here](https://schnetpack.readthedocs.io).\n\n##### Features\n\n- SchNet - an end-to-end continuous-filter CNN for molecules and materials [1-3]\n- PaiNN - equivariant message-passing for molecules and materials [4]\n- Output modules for dipole moments, polarizability, stress, and general response properties\n- Modules for electrostatics, Ewald summation, ZBL repulsion\n- GPU-accelerated molecular dynamics code including path-integral MD, thermostats, and barostats\n\n## Installation\n\n### Install with pip\n\nThe simplest way to install SchNetPack is through pip, which will automatically get the source code from PyPI:\n```\npip install schnetpack\n```\n\n### Install from source\n\nYou can also install the most recent code from our repository:\n\n```\ngit clone https://github.com/atomistic-machine-learning/schnetpack.git\ncd schnetpack\npip install .\n```\n\n### Visualization with Tensorboard\n\nSchNetPack supports multiple logging backends via PyTorch Lightning. The default logger is Tensorboard. 
SchNetPack also supports TensorboardX.\n\n\n## Getting started\n\nThe best place to get started is training a SchNetPack model on a common benchmark dataset via the command line\ninterface (CLI).\nWhen you install SchNetPack, the training script `spktrain` is added to your PATH.\nThe CLI uses [Hydra](https://hydra.cc/) and is based on the PyTorch Lightning/Hydra template that can be found\n[here](https://github.com/ashleve/lightning-hydra-template).\nThis enables a flexible configuration of the model, data, and training process.\nTo fully take advantage of these features, it might be helpful to have a look at the Hydra and PyTorch Lightning docs.\n\n### Example 1: QM9\n\nIn the following, we focus on using the CLI to train on the QM9 dataset, but the same\nprocedure applies to the other benchmark datasets as well.\nFirst, create a working directory, where all data and runs will be stored:\n\n```\nmkdir spk_workdir\ncd spk_workdir\n```\n\nThen, the training of a SchNet model with default settings for QM9 can be started with:\n\n```\nspktrain experiment=qm9_atomwise\n```\n\nThe script prints the defaults for the experiment config `qm9_atomwise`.\nThe dataset will be downloaded automatically to `spk_workdir/data` if it does not exist yet.\nThen, the training will be started.\n\nAll values of the config can be changed from the command line, including the directories for run and data.\nBy default, the model is stored in a directory with a unique run id hash as a subdirectory of `spk_workdir/runs`.\nThis can be changed as follows:\n\n```\nspktrain experiment=qm9_atomwise run.data_dir=/my/data/dir run.path=~/all_my_runs run.id=this_run\n```\n\nIf you call `spktrain experiment=qm9_atomwise --help`, you can see the full config with all the parameters\nthat can be changed.\nNested parameters can be changed as follows:\n\n```\nspktrain experiment=qm9_atomwise run.data_dir=\u003cpath\u003e data.batch_size=64\n```\n\nHydra organizes parameters in config groups, which allows 
hierarchical configurations consisting of multiple\nyaml files. This makes it easy to change the whole dataset, model, or representation.\nFor instance, to change from the default SchNet representation to PaiNN, use:\n\n```\nspktrain experiment=qm9_atomwise run.data_dir=\u003cpath\u003e model/representation=painn\n```\n\nAt first, it can be confusing when to use \".\" or \"/\". The slash is used if you are loading a preconfigured config\ngroup, while the dot is used when changing individual values. For example, the config group \"model/representation\"\ncorresponds to the following part of the config:\n\n```\n    model:\n      representation:\n        _target_: schnetpack.representation.PaiNN\n        n_atom_basis: 128\n        n_interactions: 3\n        shared_interactions: false\n        shared_filters: false\n        radial_basis:\n          _target_: schnetpack.nn.radial.GaussianRBF\n          n_rbf: 20\n          cutoff: ${globals.cutoff}\n        cutoff_fn:\n          _target_: schnetpack.nn.cutoff.CosineCutoff\n          cutoff: ${globals.cutoff}\n```\n\nIf you additionally want to change a value of this group, you could use:\n\n```\nspktrain experiment=qm9_atomwise run.data_dir=\u003cpath\u003e model/representation=painn model.representation.n_interactions=5\n```\n\nFor more details on config groups, have a look at the\n[Hydra docs](https://hydra.cc/docs/tutorials/basic/your_first_app/config_groups/).\n\n\n### Example 2: Potential energy surfaces\n\nThe example above uses `AtomisticModel` internally, which is a\n`pytorch_lightning.LightningModule`, to predict single properties.\nThe following example will use the same class to predict potential energy surfaces,\nin particular energies with the appropriate derivatives to obtain forces and stress tensors.\nThis works since the pre-defined configuration for the MD17 dataset,\nprovided on the command line by `experiment=md17`, selects the representation and output modules that\n`AtomisticModel` is 
using.\nA more detailed description of the configuration and how to build your custom configs can be\nfound [here](https://schnetpack.readthedocs.io/en/latest/userguide/configs.html).\n\nThe `spktrain` script can be used to train a model for a molecule from the MD17 datasets:\n\n```\nspktrain experiment=md17 data.molecule=uracil\n```\n\nIn the case of MD17, reference calculations of energies and forces are available.\nTherefore, one needs to set weights for the losses of those properties.\nThe losses are defined as part of the output definitions in the `task` config group:\n\n```\n    task:\n      outputs:\n        - _target_: schnetpack.task.ModelOutput\n          name: ${globals.energy_key}\n          loss_fn:\n            _target_: torch.nn.MSELoss\n          metrics:\n            mae:\n              _target_: torchmetrics.regression.MeanAbsoluteError\n            mse:\n              _target_: torchmetrics.regression.MeanSquaredError\n          loss_weight: 0.005\n        - _target_: schnetpack.task.ModelOutput\n          name: ${globals.forces_key}\n          loss_fn:\n            _target_: torch.nn.MSELoss\n          metrics:\n            mae:\n              _target_: torchmetrics.regression.MeanAbsoluteError\n            mse:\n              _target_: torchmetrics.regression.MeanSquaredError\n          loss_weight: 0.995\n```\n\nFor training on *energies* and *forces*, we recommend putting a stronger\nweight on the loss of the force prediction during training.\nBy default, the loss weights are set to 0.005 for the energy and 0.995 for the forces.\nThis can be changed as follows:\n\n```\nspktrain experiment=md17 data.molecule=uracil task.outputs.0.loss_weight=0.005 task.outputs.1.loss_weight=0.995\n```\n\n### Logging\n\nBeyond the output of the command line, SchNetPack supports multiple logging backends via PyTorch Lightning.\nBy default, the Tensorboard logger is activated.\nIf TensorBoard is installed, the results can be shown by calling:\n\n```\ntensorboard 
--logdir=\u003crundir\u003e\n```\n\nFurthermore, SchNetPack comes with configs for a CSV logger and [Aim](https://github.com/aimhubio/aim).\nThese can be selected as follows:\n\n```\nspktrain experiment=md17 logger=csv\n```\n\n## LAMMPS interface\n\nSchNetPack comes with an interface to LAMMPS. A detailed installation guide is linked in the [How-To section of our documentation](https://schnetpack.readthedocs.io/en/latest/howtos/lammps.html).\n\n## Extensions\n\nSchNetPack can be used as a base for implementations of advanced atomistic neural networks and training tasks.\nFor example, there exists an [extension package](https://github.com/atomistic-machine-learning/schnetpack-gschnet) called `schnetpack-gschnet` for the most recent version of cG-SchNet [5], a conditional generative model for molecules.\nIt demonstrates how a complex training task can be implemented in a few custom classes while leveraging the hierarchical configuration and automated training procedure of the SchNetPack framework.\n\n\n## Citation\n\nIf you are using SchNetPack in your research, please cite:\n\nK.T. Schütt, S.S.P. Hessmann, N.W.A. Gebauer, J. Lederer, M. Gastegger.\nSchNetPack 2.0: A neural network toolbox for atomistic machine learning.\nJ. Chem. Phys. 2023, 158 (14): 144801.\n[10.1063/5.0138367](https://doi.org/10.1063/5.0138367).\n\nK.T. Schütt, P. Kessel, M. Gastegger, K. Nicoli, A. Tkatchenko, K.-R. Müller.\nSchNetPack: A Deep Learning Toolbox For Atomistic Systems.\nJ. Chem. Theory Comput. 2019, 15 (1): 448-455.\n[10.1021/acs.jctc.8b00908](http://dx.doi.org/10.1021/acs.jctc.8b00908).\n\n    @article{schutt2023schnetpack,\n        author = {Sch{\\\"u}tt, Kristof T. and Hessmann, Stefaan S. P. and Gebauer, Niklas W. A. 
and Lederer, Jonas and Gastegger, Michael},\n        title = \"{SchNetPack 2.0: A neural network toolbox for atomistic machine learning}\",\n        journal = {The Journal of Chemical Physics},\n        volume = {158},\n        number = {14},\n        pages = {144801},\n        year = {2023},\n        month = {04},\n        issn = {0021-9606},\n        doi = {10.1063/5.0138367},\n        url = {https://doi.org/10.1063/5.0138367},\n        eprint = {https://pubs.aip.org/aip/jcp/article-pdf/doi/10.1063/5.0138367/16825487/144801\\_1\\_5.0138367.pdf},\n    }\n    @article{schutt2019schnetpack,\n        author = {Sch{\\\"u}tt, Kristof T. and Kessel, Pan and Gastegger, Michael and Nicoli, Kim A. and Tkatchenko, Alexandre and Müller, Klaus-Robert},\n        title = \"{SchNetPack: A Deep Learning Toolbox For Atomistic Systems}\",\n        journal = {Journal of Chemical Theory and Computation},\n        volume = {15},\n        number = {1},\n        pages = {448-455},\n        year = {2019},\n        doi = {10.1021/acs.jctc.8b00908},\n        URL = {https://doi.org/10.1021/acs.jctc.8b00908},\n        eprint = {https://doi.org/10.1021/acs.jctc.8b00908},\n    }\n\n\n\n## Acknowledgements\n\nCLI and hydra configs for PyTorch Lightning are adapted from this template: [![](https://shields.io/badge/-Lightning--Hydra--Template-017F2F?style=flat\u0026logo=github\u0026labelColor=303030)](https://github.com/hobogalaxy/lightning-hydra-template)\n\n\n## References\n\n* [1] K.T. Schütt. F. Arbabzadah. S. Chmiela, K.-R. Müller, A. Tkatchenko.\n*Quantum-chemical insights from deep tensor neural networks.*\nNature Communications **8**. 13890 (2017) [10.1038/ncomms13890](http://dx.doi.org/10.1038/ncomms13890)\n\n* [2] K.T. Schütt. P.-J. Kindermans, H. E. Sauceda, S. Chmiela, A. Tkatchenko, K.-R. Müller.\n*SchNet: A continuous-filter convolutional neural network for modeling quantum interactions.*\nAdvances in Neural Information Processing Systems 30, pp. 
992-1002 (2017) [Paper](http://papers.nips.cc/paper/6700-schnet-a-continuous-filter-convolutional-neural-network-for-modeling-quantum-interactions)\n\n* [3] K.T. Schütt. P.-J. Kindermans, H. E. Sauceda, S. Chmiela, A. Tkatchenko, K.-R. Müller.\n*SchNet - a deep learning architecture for molecules and materials.*\nThe Journal of Chemical Physics 148(24), 241722 (2018) [10.1063/1.5019779](https://doi.org/10.1063/1.5019779)\n\n* [4] K. T. Schütt, O. T. Unke, M. Gastegger\n*Equivariant message passing for the prediction of tensorial properties and molecular spectra.*\nInternational Conference on Machine Learning (pp. 9377-9388). PMLR, [Paper](https://proceedings.mlr.press/v139/schutt21a.html).\n\n* [5] N. W. A. Gebauer, M. Gastegger, S. S. P. Hessmann, K.-R. Müller, K. T. Schütt\n*Inverse design of 3d molecular structures with conditional generative neural networks.*\nNature Communications **13**. 973 (2022) [10.1038/s41467-022-28526-y](https://doi.org/10.1038/s41467-022-28526-y)\n","funding_links":[],"categories":["Software","programs, ML","Representation Learning","Machine Learning"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fatomistic-machine-learning%2Fschnetpack","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fatomistic-machine-learning%2Fschnetpack","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fatomistic-machine-learning%2Fschnetpack/lists"}