<div align="center">

# Grad DFT: a software library for machine learning enhanced density functional theory

![Light Theme Image](media/README/light_logo.svg#gh-light-mode-only)

![Dark Theme Image](media/README/dark_logo.svg#gh-dark-mode-only)

[![build](https://img.shields.io/badge/build-passing-graygreen.svg "https://github.com/XanaduAI/GradDFT/actions")](https://github.com/XanaduAI/GradDFT/actions)
[![arXiv](http://img.shields.io/badge/arXiv-2309.15127-B31B1B.svg "Grad-DFT")](https://arxiv.org/abs/2309.15127)
[![arXiv](http://img.shields.io/badge/2024-JCP-273c75.svg "Grad-DFT")](https://doi.org/10.1063/5.0181037)
[![License](https://img.shields.io/badge/License-Apache%202.0-9F9F9F "https://github.com/XanaduAI/GradDFT/blob/main/LICENSE")](https://github.com/XanaduAI/GradDFT/blob/main/LICENSE)

</div>

Grad DFT is a JAX-based library enabling the differentiable design and experimentation of exchange-correlation functionals using machine learning techniques. The library provides significant functionality, including (but not limited to) training neural functionals with fully differentiable and just-in-time compilable self-consistent-field loops, direct optimization of the Kohn-Sham orbitals, and implementation of many of the known constraints of the exact functional.

## Functionality

The current version of the library has the following capabilities:

* Create any `NeuralFunctional` that follows the expression

```math
E_{xc,\theta} = \int d\mathbf{r}\, \mathbf{c}_\theta[\rho](\mathbf{r})\cdot\mathbf{e}[\rho](\mathbf{r}),
```

that is, any functional satisfying the locality assumption.

* Include (non-differentiable) range-separated Hartree-Fock components.
* Train neural functionals using fully differentiable and just-in-time (jit) compilable self-consistent iterative procedures.
* Perform DFT simulations with neural functionals using differentiable and just-in-time compilable [direct optimization of the Kohn-Sham orbitals](https://openreview.net/forum?id=aBWnqqsuot7).
* Train neural functionals using loss functions that include contributions from the total energy, the density, or both.
* Include regularization terms that prevent the divergence of the self-consistent iterative procedure for non-self-consistently trained functionals, including the regularization term suggested in the supplementary material of [DM21](https://www.science.org/doi/full/10.1126/science.abj6511).
* Use [15 constraints of the exact functional](https://www.annualreviews.org/doi/abs/10.1146/annurev-physchem-062422-013259), which can be added to existing loss functions.
* Train with the [Harris functional](https://en.wikipedia.org/wiki/Harris_functional) for higher-accuracy non-self-consistent training.
* Design neural functionals with a library of energy densities used in well-known functionals such as [B3LYP](https://pubs.acs.org/doi/abs/10.1021/j100096a001) and [DM21](https://www.science.org/doi/full/10.1126/science.abj6511).
* Include simple DFT-D dispersion tails with a neural parametrization.

Future capabilities will include [sharding](https://jax.readthedocs.io/en/latest/notebooks/Distributed_arrays_and_automatic_parallelization.html) the training across HPC systems and the implementation of periodic boundary conditions for training neural functionals designed for condensed matter systems.

## Install

A core dependency of Grad DFT is [PySCF](https://pyscf.org). Before installing this package with `pip`, please ensure that `cmake` is installed and that

```bash
which cmake
```

returns the correct path to the `cmake` binary. For instructions on installing `cmake`, visit https://cmake.org.

Now, in a fresh [conda environment](https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#activating-an-environment), navigate to the root directory of this repository and issue

```bash
pip install -e .
```

to install the base package. If you wish to run the examples in `~/examples`, you can run

```bash
pip install -e ".[examples]"
```

to install the additional dependencies.
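Before walking through the API, note that on a finite integration grid the locality expression above reduces to a weighted dot product over grid points. A minimal standalone sketch with synthetic arrays (names and shapes are illustrative, not the Grad DFT API; plain `numpy` stands in for `jax.numpy` here):

```python
import numpy as np

rng = np.random.default_rng(0)
n_grid, n_densities = 1000, 2           # illustrative sizes

weights = rng.random(n_grid)            # quadrature weights w_g
e = rng.random((n_grid, n_densities))   # energy densities e[rho](r_g)
c = rng.random((n_grid, n_densities))   # learned coefficients c_theta[rho](r_g)

# E_xc ~= sum_g w_g * c(r_g) . e(r_g)
E_xc = np.einsum("g,gd,gd->", weights, c, e)
print(E_xc)
```

The neural network only enters through `c`; the quadrature itself stays a simple contraction, which is what makes the whole expression differentiable end to end.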
## Use example

Using Grad DFT typically involves the following steps:

1. Specify an instance of `Molecule`, which has methods to compute the electronic density $\rho$ and derived quantities.
2. Define the function `energy_densities`, which computes $\mathbf{e}[\rho](\mathbf{r})$.
3. Implement the function `coefficients`, which may include a neural network, and computes $\mathbf{c}_{\theta}[\rho](\mathbf{r})$. If the function `coefficients` requires inputs, specify the function `coefficient_inputs` too.
4. Build the `Functional`, which has the method `functional.energy(molecule, params)`, computing the Kohn-Sham total energy according to

```math
E_{KS}[\rho] = \sum_{i=0}^{\text{occ}} \int d\mathbf{r}\; |\nabla \varphi_{i}(\mathbf{r})|^2  + \frac{1}{2}\int d\mathbf{r} d\mathbf{r}'\frac{\rho(\mathbf{r})\rho(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|} +\int d\mathbf{r}\, U(\mathbf{r}) \rho(\mathbf{r}) + E_{II} + E_{xc}[\rho],
```

with

```math
E_{xc,\theta}[\rho] = \int d\mathbf{r}\, \mathbf{c}_{\theta}[\rho](\mathbf{r})\cdot\mathbf{e}[\rho](\mathbf{r}),
```

where `params` denotes the neural network parameters $\theta$.

5. Train the neural functional using JAX autodifferentiation capabilities, in particular `jax.grad`.
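As a numerical aside, the `energy_densities` example further below uses a spin-resolved LDA exchange prefactor, $-\tfrac{3}{2}(3/4\pi)^{1/3}$, applied to each spin channel. Via the spin-scaling relation $E_x[\rho_\uparrow,\rho_\downarrow]=\tfrac12\left(E_x[2\rho_\uparrow]+E_x[2\rho_\downarrow]\right)$, this is equivalent to the textbook spin-unpolarized prefactor $-\tfrac{3}{4}(3/\pi)^{1/3}$. A quick standalone check (illustrative density values, plain `numpy`):

```python
import numpy as np

# Illustrative spin densities at a single grid point
rho_up, rho_dn = 0.3, 0.2

# Textbook LDA exchange prefactor, combined with the spin-scaling relation:
# (1/2) * (-C) * ((2*rho_up)**(4/3) + (2*rho_dn)**(4/3)) = -C * 2**(1/3) * (...)
C = (3/4) * (3/np.pi)**(1/3)
e_scaled = -C * 2**(1/3) * (rho_up**(4/3) + rho_dn**(4/3))

# Per-spin-channel prefactor as it appears in the example code
e_spin = -3/2 * (3/(4*np.pi))**(1/3) * (rho_up**(4/3) + rho_dn**(4/3))

print(e_scaled, e_spin)  # the two expressions agree
```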
Now let's see how we can complete the above steps with code in Grad DFT.

### Creating a molecule

The first step is to create a `Molecule` object.

```python
from grad_dft import (
    energy_predictor,
    simple_energy_loss,
    NeuralFunctional,
    molecule_from_pyscf
)
from pyscf import gto, dft

# Define a PySCF mol object for the H2 molecule
mol = gto.M(atom = [['H', (0, 0, 0)], ['H', (0.74, 0, 0)]], basis = 'def2-tzvp', spin = 0)
# Create a PySCF mean-field object and run the self-consistent-field calculation
mf = dft.UKS(mol)
mf.kernel()
# Create a Molecule from the converged mean-field object
molecule = molecule_from_pyscf(mf)
```

### Creating a neural functional

A more complex, neural functional can be created as follows:

```python
import jax.numpy as jnp
from jax.nn import gelu
from jax.random import PRNGKey
from flax import linen as nn
from optax import adam, apply_updates
from tqdm import tqdm

def energy_densities(molecule):
    rho = molecule.density()
    # Spin-resolved LDA exchange energy density
    lda_e = -3/2 * (3/(4*jnp.pi))**(1/3) * (rho**(4/3)).sum(axis = 1, keepdims = True)
    return lda_e

def coefficient_inputs(molecule):
    rho = molecule.density()
    kinetic = molecule.kinetic_density()
    return jnp.concatenate((rho, kinetic))

def coefficients(self, rhoinputs):
    x = nn.Dense(features=1)(rhoinputs)  # features=1 means it outputs a single weight
    x = nn.LayerNorm()(x)
    return gelu(x)

neuralfunctional = NeuralFunctional(coefficients, energy_densities, coefficient_inputs)
```

with the corresponding energy calculation

```python
key = PRNGKey(42)
cinputs = coefficient_inputs(molecule)
params = neuralfunctional.init(key, cinputs)

predicted_energy = neuralfunctional.energy(params, molecule)
```

### Training the neural functional

```python
learning_rate = 1e-5
momentum = 0.9
tx = adam(learning_rate=learning_rate, b1=momentum)
opt_state = tx.init(params)

# Implement the optimization loop
n_epochs = 20
compute_energy = energy_predictor(neuralfunctional)
ground_truth_energy = mf.e_tot  # placeholder reference; use a high-accuracy value in practice
for iteration in tqdm(range(n_epochs), desc="Training epoch"):
    (cost_value, predicted_energy), grads = simple_energy_loss(
        params, compute_energy, molecule, ground_truth_energy
    )
    print("Iteration", iteration, "Predicted energy:", predicted_energy, "Cost value:", cost_value)
    updates, opt_state = tx.update(grads, opt_state, params)
    params = apply_updates(params, updates)

# Save a checkpoint
neuralfunctional.save_checkpoints(params, tx, step=n_epochs)
```

For more detailed examples, check out the `~/examples` folder.

<p align="center">
<img src="/media/README/light_mode_disodium_animation.gif#gh-light-mode-only" width="45%" height="45%"/>
</p>

<p align="center">
<img src="/media/README/dark_mode_disodium_animation.gif#gh-dark-mode-only" width="45%" height="45%"/>
</p>

<p align="center"> Using a scaled-down version of the neural functional from the main Grad DFT article, we trained it on the total energies and densities derived from the experimental equilibrium geometries of Li<sub>2</sub> and K<sub>2</sub> at the Coupled Cluster Singles & Doubles (CCSD) level of accuracy. The animation shows that during this training, the neural functional also generalized to predict the CCSD density of Na<sub>2</sub>. </p>

## Acknowledgements

We thank Alain Delgado, Modjtaba Shokrian Zini, Stepan Fomichev, Soran Jahangiri, Diego Guala, Jay Soni, Utkarsh Azad, Kasra Hejazi, Vincent Michaud-Rioux, Maria Schuld and Nathan Wiebe for their helpful comments and insights.

Grad DFT often follows calculations and naming conventions similar to PySCF's, though adapted for our purposes. Only a few non-jittable DIIS procedures were taken directly from it.
Where this happens, it is conveniently referenced in the documentation. The tests were also implemented against PySCF results. The PySCF NOTICE file is included for these reasons.

## Bibtex

```latex
@article{casares2024graddft,
    author = {Moreno Casares, Pablo Antonio and Baker, Jack S. and Medvidović, Matija and Reis, Roberto dos and Arrazola, Juan Miguel},
    title = "{GradDFT. A software library for machine learning enhanced density functional theory}",
    journal = {The Journal of Chemical Physics},
    volume = {160},
    number = {6},
    pages = {062501},
    year = {2024},
    month = {02},
    issn = {0021-9606},
    doi = {10.1063/5.0181037},
    url = {https://doi.org/10.1063/5.0181037},
    eprint = {https://pubs.aip.org/aip/jcp/article-pdf/doi/10.1063/5.0181037/19663145/062501\_1\_5.0181037.pdf},
}
```