{"id":13448973,"url":"https://github.com/google-deepmind/graphcast","last_synced_at":"2025-05-14T12:08:23.830Z","repository":{"id":185081335,"uuid":"666376652","full_name":"google-deepmind/graphcast","owner":"google-deepmind","description":null,"archived":false,"fork":false,"pushed_at":"2025-01-31T09:22:20.000Z","size":1597,"stargazers_count":6034,"open_issues_count":56,"forks_count":757,"subscribers_count":93,"default_branch":"main","last_synced_at":"2025-05-09T20:50:42.623Z","etag":null,"topics":["weather","weather-forecast"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/google-deepmind.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2023-07-14T11:07:57.000Z","updated_at":"2025-05-08T10:55:59.000Z","dependencies_parsed_at":null,"dependency_job_id":"b5d8ef70-176f-4c26-8fc5-aa731e9affae","html_url":"https://github.com/google-deepmind/graphcast","commit_stats":{"total_commits":17,"total_committers":8,"mean_commits":2.125,"dds":0.7647058823529411,"last_synced_commit":"97d1ad50b0b7af4aaed7790167dffa769bae1f2c"},"previous_names":["deepmind/graphcast","google-deepmind/graphcast"],"tags_count":2,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fgraphcast","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fgraphcast/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fgraphcas
t/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fgraphcast/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/google-deepmind","download_url":"https://codeload.github.com/google-deepmind/graphcast/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253377267,"owners_count":21898938,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["weather","weather-forecast"],"created_at":"2024-07-31T06:00:26.751Z","updated_at":"2025-05-14T12:08:23.766Z","avatar_url":"https://github.com/google-deepmind.png","language":"Python","readme":"# Google DeepMind GraphCast and GenCast\n\nThis package contains example code to run and train the weather models used in the research papers [GraphCast](https://www.science.org/doi/10.1126/science.adi2336) and [GenCast](https://arxiv.org/abs/2312.15796).\n\nIt also provides pretrained model weights, normalization statistics and example input data on [Google Cloud Bucket](https://console.cloud.google.com/storage/browser/dm_graphcast).\n\nFull model training requires downloading the\n[ERA5](https://www.ecmwf.int/en/forecasts/datasets/reanalysis-datasets/era5)\ndataset, available from [ECMWF](https://www.ecmwf.int/). 
This can best be\naccessed as Zarr from [Weatherbench2's ERA5 data](https://weatherbench2.readthedocs.io/en/latest/data-guide.html#era5).\n\nData for operational fine-tuning can similarly be accessed at [Weatherbench2's HRES 0th frame data](https://weatherbench2.readthedocs.io/en/latest/data-guide.html#ifs-hres-t-0-analysis).\n\nThese datasets may be governed by separate terms and conditions or license provisions. Your use of such third-party materials is subject to any such terms and you should check that you can comply with any applicable restrictions or terms and conditions before use.\n\n## Overview of files common to models\n\n*   `autoregressive.py`: Wrapper used to run (and train) the one-step predictions\n    to produce a sequence of predictions by auto-regressively feeding the\n    outputs back as inputs at each step, in a JAX-differentiable way.\n*   `checkpoint.py`: Utils to serialize and deserialize trees.\n*   `data_utils.py`: Utils for data preprocessing.\n*   `deep_typed_graph_net.py`: General purpose deep graph neural network (GNN)\n    that operates on `TypedGraph`'s where both inputs and outputs are flat\n    vectors of features for each of the nodes and edges.\n*   `grid_mesh_connectivity.py`: Tools for converting between regular grids on a\n    sphere and triangular meshes.\n*   `icosahedral_mesh.py`: Definition of an icosahedral multi-mesh.\n*   `losses.py`: Loss computations, including latitude-weighting.\n*   `mlp.py`: Utils for building MLPs with norm conditioning layers.\n*   `model_utils.py`: Utilities to produce flat node and edge vector features\n    from input grid data, and to manipulate the node output vectors back\n    into multilevel grid data.\n*   `normalization.py`: Wrapper used to normalize inputs according to historical\n    values, and targets according to historical time differences.\n*   `predictor_base.py`: Defines the interface of the predictor, which models\n    and all of the wrappers implement.\n*   `rollout.py`: 
Similar to `autoregressive.py` but used only at inference time\n    using a Python loop to produce longer, but non-differentiable trajectories.\n*   `typed_graph.py`: Definition of `TypedGraph`'s.\n*   `typed_graph_net.py`: Implementation of simple graph neural network\n    building blocks defined over `TypedGraph`'s that can be combined to build\n    deeper models.\n*   `xarray_jax.py`: A wrapper to let JAX work with `xarray`s.\n*   `xarray_tree.py`: An implementation of tree.map_structure that works with\n    `xarray`s.\n\n## GenCast: Diffusion-based ensemble forecasting for medium-range weather\n\nThis package provides four pretrained models:\n\n1.  `GenCast 0p25deg \u003c2019`, GenCast model at 0.25deg resolution with 13\npressure levels and a 6 times refined icosahedral mesh. This model is trained on\nERA5 data from 1979 to 2018 (inclusive), and can be causally evaluated on 2019\nand later years. This model was described in the paper\n`GenCast: Diffusion-based ensemble forecasting for medium-range weather`\n(https://arxiv.org/abs/2312.15796).\n\n2.  `GenCast 0p25deg Operational \u003c2022`, GenCast model at 0.25deg resolution, with 13 pressure levels and a 6\ntimes refined icosahedral mesh. This model is trained on ERA5 data from\n1979 to 2018, and fine-tuned on HRES-fc0 data from\n2016 to 2021, and can be causally evaluated on 2022 and later years.\nThis model can make predictions in an operational setting (i.e., initialised\nfrom HRES-fc0).\n\n3.  `GenCast 1p0deg \u003c2019`, GenCast model at 1deg resolution, with 13 pressure\nlevels and a 5 times refined icosahedral mesh. This model is\ntrained on ERA5 data from 1979 to 2018, and can be causally evaluated on 2019 and later years.\nThis model has a smaller memory footprint than the 0.25deg models.\n\n4.  `GenCast 1p0deg Mini \u003c2019`, GenCast model at 1deg resolution, with 13 pressure levels and a\n4 times refined icosahedral mesh. 
This model is trained on ERA5 data\nfrom 1979 to 2018, and can be causally evaluated on 2019 and later years.\nThis model has the smallest memory footprint of those provided and has been\nprovided to enable low cost demonstrations (for example, it is runnable in a free Colab notebook).\nWhile its performance is reasonable, it is not representative of the performance\nof the GenCast models (1-3) above. For reference, a scorecard comparing its performance to ENS can be found in [docs/](https://github.com/google-deepmind/graphcast/blob/main/docs/GenCast_1p0deg_Mini_ENS_scorecard.png). Note that in this scorecard,\nGenCast Mini only uses 8 member ensembles (vs. ENS' 50) so we use the fair (unbiased)\nCRPS to allow for fair comparison.\n\nThe best starting point is to open `gencast_mini_demo.ipynb` in [Colaboratory](https://colab.research.google.com/github/deepmind/graphcast/blob/master/gencast_mini_demo.ipynb), which gives an\nexample of loading data, generating random weights or loading a `GenCast 1p0deg Mini \u003c2019`\nsnapshot, generating predictions, computing the loss and computing gradients.\nThe one-step implementation of GenCast architecture is provided in\n`gencast.py` and the relevant data, weights and statistics are in the `gencast/`\nsubdir of the Google Cloud Bucket.\n\n### Instructions for running GenCast on Google Cloud compute\n\n[cloud_vm_setup.md](https://github.com/google-deepmind/graphcast/blob/main/docs/cloud_vm_setup.md)\ncontains detailed instructions on launching a Google Cloud TPU VM. This provides\na means of running models (1-3) in the separate `gencast_demo_cloud_vm.ipynb` through [Colaboratory](https://colab.research.google.com/github/deepmind/graphcast/blob/master/gencast_demo_cloud_vm.ipynb).\n\nThe document also provides [instructions](https://github.com/google-deepmind/graphcast/blob/main/docs/cloud_vm_setup.md#running-inference-on-gpu) for running GenCast on a GPU. 
This requires using a different attention implementation.\n\n### Brief description of relevant library files\n\n*   `denoiser.py`: The GenCast denoiser for one step predictions.\n*   `denoisers_base.py`: Defines the interface of the denoiser.\n*   `dpm_solver_plus_plus_2s.py`: Sampler using DPM-Solver++ 2S from [1].\n*   `gencast.py`: Combines the GenCast model architecture, wrapped as a\n    denoiser, with a sampler to generate predictions.\n*   `nan_cleaning.py`: Wraps a predictor to allow it to work with data\n    cleaned of NaNs. Used to remove NaNs from sea surface temperature.\n*   `samplers_base.py`: Defines the interface of the sampler.\n*   `samplers_utils.py`: Utility methods for the sampler.\n*   `sparse_transformer.py`: General purpose sparse transformer that\n    operates on `TypedGraph`'s where both inputs and outputs are flat vectors of\n    features for each of the nodes and edges. `predictor.py` uses one of these\n    for the mesh GNN.\n*   `sparse_transformer_utils.py`: Utility methods for the sparse\n    transformer.\n*   `transformer.py`: Wraps the mesh transformer, swapping the leading\n    two axes of the nodes in the input graph.\n\n[1] DPM-Solver++: Fast Solver for Guided Sampling of Diffusion Probabilistic\n  Models, https://arxiv.org/abs/2211.01095\n\n## GraphCast: Learning skillful medium-range global weather forecasting\n\nThis package provides three pretrained models:\n\n1.  `GraphCast`, the high-resolution model used in the GraphCast paper (0.25 degree\nresolution, 37 pressure levels), trained on ERA5 data from 1979 to 2017,\n\n2.  `GraphCast_small`, a smaller, low-resolution version of GraphCast (1 degree\nresolution, 13 pressure levels, and a smaller mesh), trained on ERA5 data from\n1979 to 2015, useful to run a model with lower memory and compute constraints,\n\n3.  
`GraphCast_operational`, a high-resolution model (0.25 degree resolution, 13\npressure levels) pre-trained on ERA5 data from 1979 to 2017 and fine-tuned on\nHRES data from 2016 to 2021. This model can be initialized from HRES data (does\nnot require precipitation inputs).\n\nThe best starting point is to open `graphcast_demo.ipynb` in [Colaboratory](https://colab.research.google.com/github/deepmind/graphcast/blob/master/graphcast_demo.ipynb), which gives an\nexample of loading data, generating random weights or loading a pre-trained\nsnapshot, generating predictions, computing the loss and computing gradients.\nThe one-step implementation of the GraphCast architecture is provided in\n`graphcast.py` and the relevant data, weights and statistics are in the `graphcast/`\nsubdir of the Google Cloud Bucket.\n\nWARNING: For backwards compatibility, we have also left GraphCast data in the top level of the bucket. These will eventually be deleted in favour of the `graphcast/` subdir.\n\n### Brief description of relevant library files:\n\n*   `casting.py`: Wrapper used around GraphCast to make it work using\n    BFloat16 precision.\n*   `graphcast.py`: The main GraphCast model architecture for one step of\n    predictions.\n*   `solar_radiation.py`: Computes Top-Of-the-Atmosphere (TOA) incident solar\n    radiation compatible with ERA5. 
This is used as a forcing variable and thus\n    needs to be computed for target lead times in an operational setting.\n\n## Dependencies.\n\n[Chex](https://github.com/deepmind/chex),\n[Dask](https://github.com/dask/dask),\n[Dinosaur](https://github.com/google-research/dinosaur),\n[Haiku](https://github.com/deepmind/dm-haiku),\n[JAX](https://github.com/google/jax),\n[JAXline](https://github.com/deepmind/jaxline),\n[Jraph](https://github.com/deepmind/jraph),\n[Numpy](https://numpy.org/),\n[Pandas](https://pandas.pydata.org/),\n[Python](https://www.python.org/),\n[SciPy](https://scipy.org/),\n[Tree](https://github.com/deepmind/tree),\n[Trimesh](https://github.com/mikedh/trimesh),\n[XArray](https://github.com/pydata/xarray) and\n[XArray-TensorStore](https://github.com/google/xarray-tensorstore).\n\n\n## License and Disclaimers\n\nThe Colab notebooks and the associated code are licensed under the Apache License, Version 2.0. You may obtain a copy of the License at: https://www.apache.org/licenses/LICENSE-2.0.\n\nThe model weights are made available for use under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0). You may obtain a copy of the License at: https://creativecommons.org/licenses/by-nc-sa/4.0/.\n\nThis is not an officially supported Google product.\n\nUnless required by applicable law or agreed to in writing, all software and materials distributed here under the Apache 2.0 or CC-BY-NC-SA 4.0 licenses are distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the licenses for the specific language governing permissions and limitations under those licenses.\n\nGenCast and GraphCast are part of an experimental research project. 
You are solely responsible for determining the appropriateness of using or distributing GenCast, GraphCast or any outputs generated and assume all risks associated with your use or distribution of GenCast, GraphCast and outputs and your exercise of rights and permissions granted by Google to you under the relevant License. Use discretion before relying on, publishing, downloading or otherwise using GenCast, GraphCast or any outputs generated. GenCast, GraphCast or any outputs generated (i) are not based on data published by; (ii) have not been produced in collaboration with; and (iii) have not been endorsed by any government meteorological agency or department and in no way replace official alerts, warnings or notices published by such agencies.\n\nCopyright 2024 DeepMind Technologies Limited.\n\n\n## Citations\n\nIf you use this work, consider citing our papers ([blog post](https://deepmind.google/discover/blog/graphcast-ai-model-for-faster-and-more-accurate-global-weather-forecasting/), [Science](https://www.science.org/doi/10.1126/science.adi2336), [arXiv](https://arxiv.org/abs/2212.12794), [arXiv GenCast](https://arxiv.org/abs/2312.15796)):\n\n```latex\n@article{lam2023learning,\n  title={Learning skillful medium-range global weather forecasting},\n  author={Lam, Remi and Sanchez-Gonzalez, Alvaro and Willson, Matthew and Wirnsberger, Peter and Fortunato, Meire and Alet, Ferran and Ravuri, Suman and Ewalds, Timo and Eaton-Rosen, Zach and Hu, Weihua and others},\n  journal={Science},\n  volume={382},\n  number={6677},\n  pages={1416--1421},\n  year={2023},\n  publisher={American Association for the Advancement of Science}\n}\n```\n\n\n```latex\n@article{price2023gencast,\n  title={GenCast: Diffusion-based ensemble forecasting for medium-range weather},\n  author={Price, Ilan and Sanchez-Gonzalez, Alvaro and Alet, Ferran and Andersson, Tom R and El-Kadi, Andrew and Masters, Dominic and Ewalds, Timo and Stott, Jacklynn and Mohamed, Shakir and Battaglia, Peter and 
Lam, Remi and Willson, Matthew},\n  journal={arXiv preprint arXiv:2312.15796},\n  year={2023}\n}\n```\n\n## Acknowledgements\n\nGenCast and GraphCast communicate with and/or reference the following separate libraries and packages, and the Colab notebooks include a few examples of ECMWF’s ERA5 and HRES data that can be used as input to the models.\nData and products of the European Centre for Medium-range Weather Forecasts (ECMWF), as modified by Google.\nModified Copernicus Climate Change Service information 2023. Neither the European Commission nor ECMWF is responsible for any use that may be made of the Copernicus information or data it contains.\nECMWF HRES datasets\nCopyright statement: Copyright \"© 2023 European Centre for Medium-Range Weather Forecasts (ECMWF)\".\nSource: www.ecmwf.int\nLicense Statement: ECMWF open data is published under a Creative Commons Attribution 4.0 International (CC BY 4.0). https://creativecommons.org/licenses/by/4.0/\nDisclaimer: ECMWF does not accept any liability whatsoever for any error or omission in the data, their availability, or for any loss or damage arising from their use.\n\nUse of the third-party materials referred to above may be governed by separate terms and conditions or license provisions. 
Your use of the third-party materials is subject to any such terms and you should check that you can comply with any applicable restrictions or terms and conditions before use.\n\n\n## Contact\n\nFor feedback and questions, contact us at gencast@google.com.\n","funding_links":[],"categories":["Models","Python","🔧 Utilities \u0026 Miscellaneous","Repos","HarmonyOS","🔬 Domain-Specific Applications","Atmosphere","AI Related"],"sub_categories":["✨Official implements","Windows Manager","🌍 Earth \u0026 Climate Science","Meteorological Observation and Forecast"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgoogle-deepmind%2Fgraphcast","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgoogle-deepmind%2Fgraphcast","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgoogle-deepmind%2Fgraphcast/lists"}