{"id":13586572,"url":"https://github.com/google/caliban","last_synced_at":"2025-04-07T18:34:05.146Z","repository":{"id":43651964,"uuid":"268875034","full_name":"google/caliban","owner":"google","description":"Research workflows made easy, locally and in the Cloud.","archived":false,"fork":false,"pushed_at":"2024-06-06T22:38:20.000Z","size":2459,"stargazers_count":500,"open_issues_count":25,"forks_count":68,"subscribers_count":18,"default_branch":"main","last_synced_at":"2025-03-22T00:41:14.199Z","etag":null,"topics":["ai-platform","docker","google-cloud","python3","research-tool"],"latest_commit_sha":null,"homepage":"https://caliban.readthedocs.io","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/google.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":"codemeta.json"}},"created_at":"2020-06-02T18:12:50.000Z","updated_at":"2025-02-26T09:54:38.000Z","dependencies_parsed_at":"2024-01-25T17:56:38.499Z","dependency_job_id":"9d1938c5-643e-4d63-bba7-0b509e829be1","html_url":"https://github.com/google/caliban","commit_stats":{"total_commits":231,"total_committers":10,"mean_commits":23.1,"dds":0.316017316017316,"last_synced_commit":"56f96e7e05b1d33ebdebc01620dc867f7ec54df3"},"previous_names":[],"tags_count":27,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google%2Fcaliban","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google%2Fcaliban/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/go
ogle%2Fcaliban/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google%2Fcaliban/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/google","download_url":"https://codeload.github.com/google/caliban/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247707702,"owners_count":20982829,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai-platform","docker","google-cloud","python3","research-tool"],"created_at":"2024-08-01T15:05:39.551Z","updated_at":"2025-04-07T18:34:04.557Z","avatar_url":"https://github.com/google.png","language":"Python","readme":"# Caliban\n\n[![Build status](https://github.com/google/caliban/workflows/build/badge.svg?branch=main)](https://github.com/google/caliban/actions?query=workflow%3Abuild+branch%3Amain)\n[![Codecov branch](https://img.shields.io/codecov/c/github/google/caliban/main.svg?maxAge=3600)](https://codecov.io/github/google/caliban/tree/main)\n[![JOSS](https://joss.theoj.org/papers/c33c8b464103b2fb3b641878722bf8f3/status.svg)](https://joss.theoj.org/papers/c33c8b464103b2fb3b641878722bf8f3)\n[![readthedocs](https://img.shields.io/readthedocs/caliban?maxAge=3600)](https://caliban.readthedocs.io/en/latest/?badge=latest)\n[![caliban version](https://img.shields.io/pypi/v/caliban?maxAge=3600)](https://pypi.org/project/caliban)\n\nCaliban is a tool that helps researchers launch and track their numerical\nexperiments in an isolated, reproducible computing environment. 
It was developed\nby machine learning researchers and engineers, and makes it easy to go from a\nsimple prototype running on a workstation to thousands of experimental jobs\nrunning on Cloud.\n\nWith Caliban, you can:\n\n- Develop your experimental code locally and test it inside an isolated (Docker)\n  environment\n- Easily sweep over experimental parameters\n- Submit your experiments as Cloud jobs, where they will run in the same\n  isolated environment\n- Control and keep track of jobs\n\n## Quickstart\n\n[Install Docker](#docker), make sure it's running, then install Caliban (you'll need [Python \u003e= 3.6](#python-36)):\n\n```bash\npip install caliban\n```\n\nTrain a simple deep learning model on your local machine:\n\n```bash\ngit clone https://github.com/google/caliban.git \u0026\u0026 cd caliban/tutorials/basic\ncaliban run --nogpu mnist.py\n```\n\nSweep over learning rates to find the best one (flags are specified in JSON format):\n\n```bash\necho '{\"learning_rate\": [0.01, 0.001, 0.0001]}' | caliban run --experiment_config stdin --nogpu mnist.py\n```\n\n**Next**:\n\n- See how to submit the experiment to Cloud and use other Caliban features in [\"Getting Started with Caliban\"](#getting-started-with-caliban)\n- See [Installation](#installation-and-prerequisites) for detailed installation instructions\n- Read the [Command Overview](#command-overview) for info on Caliban commands.\n\nFull documentation for Caliban lives at [Read The Docs](https://caliban.readthedocs.io/en/latest).\n\n### Dramatic Interlude\n\n\u003cp\u003e\n\u003cimg style=\"float: right;\" align=\"right\" src=\"https://upload.wikimedia.org/wikipedia/commons/a/ad/Stephano%2C_Trinculo_and_Caliban_dancing_from_The_Tempest_by_Johann_Heinrich_Ramberg.jpg\" width=\"350\"\u003e\n\n\u003e “Be not afeard; the isle is full of noises, \\\n\u003e Sounds, and sweet airs, that give delight and hurt not. 
\\\n\u003e Sometimes a thousand twangling instruments \\\n\u003e Will hum about mine ears; and sometime voices, \\\n\u003e That, if I then had waked after long sleep, \\\n\u003e Will make me sleep again: and then, in dreaming, \\\n\u003e The clouds methought would open, and show riches \\\n\u003e Ready to drop upon me; that, when I waked, \\\n\u003e I cried to dream again.”\n\u003e\n\u003e -- \u003ccite\u003eShakespeare, The Tempest\u003c/cite\u003e\n\u003c/p\u003e\n\n## Installation and Prerequisites\n\nCaliban's prerequisites are [Docker](#docker) and [Python \u003e= 3.6](#python-36).\n\nMake sure your Python is up to date:\n\n```bash\n$ python --version\nPython 3.6.9 # should be \u003e=3.6.0\n```\n\nIf not, visit [\"Installing Python 3.6\"](#python-36) before proceeding.\n\nNext, install Caliban via [pip](https://pypi.org/project/caliban/):\n\n```bash\npip install -U caliban\n```\n\nCheck that your installation worked by navigating to an empty folder and running\n`caliban --help`. You should see the usage dialogue:\n\n```bash\n$ caliban --help\nusage: caliban [-h] [--helpfull] [--version]\n               {shell,notebook,build,run,cloud,cluster,status,stop,resubmit}\n               ...\n```\n\n### Docker\n\nCaliban executes your code inside a \"container\", managed by\n[Docker](https://hub.docker.com/editions/community/docker-ce-desktop-mac). 
To get Docker:\n\n- On MacOS, follow the installation instructions at [Docker\n  Desktop](https://hub.docker.com/editions/community/docker-ce-desktop-mac) and\n  start the newly-installed Docker Desktop application.\n- On Linux, visit the [Docker installation\n  instructions](https://docs.docker.com/engine/install/ubuntu/#installation-methods).\n  (It's important that you configure [sudo-less\n  Docker](https://caliban.readthedocs.io/en/latest/getting_started/prerequisites.html#docker)\n  and start Docker running on your machine.)\n\nMake sure Docker is correctly installed, configured and running by executing the\nfollowing command:\n\n```bash\ndocker run hello-world\n```\n\nYou should see output that looks like this:\n\n```text\n...\nHello from Docker!\nThis message shows that your installation appears to be working correctly.\n...\n```\n\n### Python 3.6\n\nMake sure your Python version is up to date:\n\n```bash\n$ python --version\nPython 3.6.9 # should be \u003e=3.6.0\n```\n\nIf you need to upgrade:\n\n- On MacOS, install the latest Python version from\n  [python.org](https://www.python.org/downloads/mac-osx) ([direct\n  link](https://www.python.org/ftp/python/3.8.3/python-3.8.3-macosx10.9.pkg)).\n- On Linux, run `sudo apt-get update \u0026\u0026 sudo apt-get install python3.7`.\n\n### Cloud Submission and GPUs\n\nCaliban's [Read the Docs](https://caliban.readthedocs.io/) documentation has\ninstructions on:\n\n- [Installing the `nvidia-docker2`\n  runtime](https://caliban.readthedocs.io/en/latest/getting_started/prerequisites.html#docker-and-cuda),\n  so you can use Caliban to run jobs that use your Linux machine's GPU.\n- [Setting up a Google Cloud\n  account](https://caliban.readthedocs.io/en/latest/getting_started/cloud.html)\n  so you can submit your code to Google's [Cloud AI\n  Platform](https://cloud.google.com/ai-platform) with `caliban cloud`.\n\n## Getting Started with Caliban\n\nIn this section we will use Caliban to train an image classification 
network\n(implemented in\n[TensorFlow](https://www.tensorflow.org/tutorials/quickstart/beginner)). We\nwill:\n\n- Train a neural network on the local machine\n- Increase the model's accuracy by changing the [learning\n  rate](https://medium.com/octavian-ai/which-optimizer-and-learning-rate-should-i-use-for-deep-learning-5acb418f9b2)\n  with a command-line flag\n- Sweep across a range of learning rates with Caliban's [experiment\n  broadcasting](https://caliban.readthedocs.io/en/latest/explore/experiment_broadcasting.html)\n  feature\n- Train the model in the Cloud on Google's [AI\n  Platform](https://cloud.google.com/ai-platform)\n- Develop code interactively using `caliban shell` in the exact same\n  environment.\n\n### Preparing your Project\n\nCreate an empty directory and use `curl` to download a [python\nscript](https://github.com/google/caliban/blob/main/tutorials/basic/mnist.py#L16)\nthat trains a basic neural network.\n\n```\nmkdir demo \u0026\u0026 cd demo\ncurl --output mnist.py https://raw.githubusercontent.com/google/caliban/main/tutorials/basic/mnist.py\n```\n\nCreate a file called `requirements.txt` to declare `tensorflow-cpu` as a dependency:\n\n```bash\necho \"tensorflow-cpu\" \u003e requirements.txt\n```\n\nCaliban will automatically make any entry in `requirements.txt` available when\nyou run your code. 
See [\"Declaring\nRequirements\"](https://caliban.readthedocs.io/en/latest/explore/declaring_requirements.html)\nfor more information.\n\n### Training the Network\n\nRun this command to train your first ML model:\n\n```bash\ncaliban run --nogpu mnist.py\n```\n\nYou should see a stream of output ending in this:\n\n```text\nTraining model with learning rate=0.1 for 3 epochs.\nEpoch 1/3\n1875/1875 - 3s - loss: 2.0989 - accuracy: 0.2506\nEpoch 2/3\n1875/1875 - 3s - loss: 1.9222 - accuracy: 0.2273\nEpoch 3/3\n1875/1875 - 3s - loss: 2.0777 - accuracy: 0.1938\nModel performance:\n313/313 - 0s - loss: 2.0973 - accuracy: 0.1858\n```\n\nYour model was able to recognize digits from the\n[MNIST](https://en.wikipedia.org/wiki/MNIST_database) dataset with 18.58%\naccuracy. Can we do better?\n\n### Improving the Model\n\nThe default learning rate is `0.1`. Run the code again with a smaller learning\nrate by passing a command-line flag, separated from your original command by\n`--`:\n\n```bash\n$ caliban run --nogpu mnist.py -- --learning_rate 0.01\n\n\u003c\u003celided\u003e\u003e\n\nTraining model with learning rate=0.01 for 3 epochs.\nEpoch 1/3\n1875/1875 - 4s - loss: 0.2676 - accuracy: 0.9221\nEpoch 2/3\n1875/1875 - 4s - loss: 0.1863 - accuracy: 0.9506\nEpoch 3/3\n1875/1875 - 4s - loss: 0.1567 - accuracy: 0.9585\nModel performance:\n313/313 - 0s - loss: 0.1410 - accuracy: 0.9642\n```\n\n96% accuracy! Much better! 
Can we do better still?\n\n### Experiment Broadcasting\n\nCaliban's [experiment\nbroadcasting](https://caliban.readthedocs.io/en/latest/explore/experiment_broadcasting.html)\nfeature will allow us to run many jobs with different sets of arguments.\n\nCreate a file called `experiment.json` with a\n[JSON](https://www.json.org/json-en.html) dictionary of the format\n`{\"flag_name\": [\"list\", \"of\", \"values\"]}`:\n\n```bash\necho '{\"learning_rate\": [0.01, 0.001, 0.0001]}' \u003e experiment.json\n```\n\nPass the config with `--experiment_config` and run again:\n\n```bash\ncaliban run --experiment_config experiment.json --nogpu mnist.py\n```\n\nYou should see accuracies of roughly `0.9493`, `0.9723` and `0.9537`. Looks like\n`0.001` is a nice choice.\n\n### Submitting to Cloud AI Platform\n\nNow it's time to submit the job to [Cloud AI\nPlatform](https://cloud.google.com/ai-platform).\n\n(**NOTE**: This section requires a Google Cloud account. You can create a free\naccount with $300 of credit to get started. 
Follow Caliban's [\"Getting Started\nwith Google\nCloud\"](https://caliban.readthedocs.io/en/latest/getting_started/cloud.html)\ndocumentation, then come back here to proceed.)\n\nSubmit the job to AI Platform by changing the word `run` to `cloud`:\n\n```bash\ncaliban cloud --nogpu mnist.py -- --learning_rate 0.01\n```\n\nYou should see output like this:\n\n```bash\nI0615 19:57:43.354172 4563361216 core.py:161] Job 1 - jobId: caliban_totoro_1, image: gcr.io/research-3141/974a776e6037:latest\nI0615 19:57:43.354712 4563361216 core.py:161] Job 1 - Accelerator: {'count': 0, 'type': 'ACCELERATOR_TYPE_UNSPECIFIED'}, machine: 'n1-highcpu-32', region: 'us-central1'\nI0615 19:57:43.355082 4563361216 core.py:161] Job 1 - Experiment arguments: ['--learning_rate', '0.01']\nI0615 19:57:43.355440 4563361216 core.py:161] Job 1 - labels: {'gpu_enabled': 'false', 'tpu_enabled': 'false', 'job_name': 'caliban_totoro', 'learning_rate': '0_01'}\n\nI0615 19:57:43.356621 4563361216 core.py:324] Submitting request!\nI0615 19:57:45.078382 4563361216 core.py:97] Request for job 'caliban_totoro_20200615_195743_1' succeeded!\nI0615 19:57:45.078989 4563361216 core.py:98] Job URL: https://console.cloud.google.com/ai-platform/jobs/caliban_totoro_20200615_195743_1?projectId=totoro-project\nI0615 19:57:45.079524 4563361216 core.py:100] Streaming log CLI command: $ gcloud ai-platform jobs stream-logs caliban_totoro_20200615_195743_1\nSubmitting caliban_totoro_1: 100%|####################################################################################################################################################################################| 1/1 [00:02\u003c00:00,  2.65s/requests]\nI0615 19:57:45.405600 4563361216 core.py:673]\nI0615 19:57:45.405819 4563361216 core.py:676] Visit https://console.cloud.google.com/ai-platform/jobs/?projectId=research-3141 to see the status of all jobs.\nI0615 19:57:45.405959 4563361216 core.py:677]\n```\n\nThis output means that Caliban has:\n\n- built a Docker 
container with all of your code\n- Pushed that container up to Google Cloud's [Container\n  Registry](https://cloud.google.com/container-registry)\n- Submitted the job to [AI Platform](https://cloud.google.com/ai-platform).\n\nYou can now visit the link in the output that looks like:\nhttps://console.cloud.google.com/ai-platform/jobs/caliban_totoro_20200615_195743_1?projectId=totoro-project\nto see all of your job's logs.\n\n#### Why do I need Cloud?\n\nWith Google Cloud, you can use on-demand\n[GPUs](https://caliban.readthedocs.io/en/latest/cloud/gpu_specs.html) and\n[TPUs](https://caliban.readthedocs.io/en/latest/cloud/ai_platform_tpu.html) and\ntrain models on large datasets at very high speeds. You can also customize the\n[machine\ntype](https://caliban.readthedocs.io/en/latest/cloud/gpu_specs.html#custom-machine-types)\nthat AI Platform uses to run your job. You might need high memory or more CPU,\nfor example.\n\nSee Caliban's [\"Customizing Machines and\nGPUs\"](https://caliban.readthedocs.io/en/latest/cloud/gpu_specs.html#) for more\ninformation.\n\n### Interactive Development with `caliban shell`\n\n[`caliban\nshell`](https://caliban.readthedocs.io/en/latest/cli/caliban_shell.html) lets\nyou develop code interactively inside of the exact same environment that your\ncode will have available, locally during `caliban run` or in the Cloud with\n`caliban cloud`.\n\nRun the following command to activate the shell:\n\n```bash\ncaliban shell --nogpu\n```\n\nYou should see Caliban's terminal:\n\n```\nI0611 12:33:17.551121 4500135360 docker.py:911] Running command: docker run --ipc host -w /usr/app -u 735994:89939 -v /Users/totoro/code/example:/usr/app -it --entrypoint /bin/bash -v /Users/totoro:/home/totoro ab8a7d7db868\n   _________    __    ________  ___    _   __  __  __\n  / ____/   |  / /   /  _/ __ )/   |  / | / /  \\ \\ \\ \\\n / /   / /| | / /    / // __  / /| | /  |/ /    \\ \\ \\ \\\n/ /___/ ___ |/ /____/ // /_/ / ___ |/ /|  /     / / / /\n\\____/_/  
|_/_____/___/_____/_/  |_/_/ |_/     /_/ /_/\n\nYou are running caliban shell as user with ID 735994 and group 89939,\nwhich should map to the ID and group for your user on the Docker host. Great!\n\n[totoro@6a9b28990757 /usr/app]$\n```\n\nYou're now living in an isolated [Docker\ncontainer](https://www.docker.com/resources/what-container) with your\n`tensorflow-cpu` dependency available (and any others [you've\ndeclared](https://caliban.readthedocs.io/en/latest/explore/declaring_requirements.html)).\n\nRun the `python` command and check that `tensorflow` is installed:\n\n```bash\n$ python\nPython 3.6.9 (default, Nov  7 2019, 10:44:02)\n[GCC 8.3.0] on linux\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n\u003e\u003e\u003e import tensorflow as tf\n\u003e\u003e\u003e tf.__version__\n'2.2.0'\n```\n\nYour home directory and the folder where you ran the command are both mounted\ninto this isolated environment, so any changes you make to either of those\ndirectories will be reflected immediately.\n\nAny code you add to the current folder and edit on your computer will be\navailable in this special Caliban shell. 
Run the example from before like this:\n\n```\npython mnist.py --learning_rate 0.01\n```\n\nIf your code runs in `caliban shell`, you can be almost certain that your code\nwill execute in a Cloud environment, with potentially many GPUs attached and\nmuch larger machines available.\n\n### What next?\n\nRead the [Command Overview](#command-overview) for more information on Caliban's subcommands,\nthen head over to [Caliban's documentation\nsite](https://caliban.readthedocs.io/en/latest/) and check out the links on the\nsidebar.\n\nIf you find anything confusing, please feel free to [create an\nissue](https://github.com/google/caliban/issues) on our [GitHub Issues\npage](https://github.com/google/caliban/issues), and we'll get you sorted out.\n\n## Command Overview\n\nCaliban provides seven subcommands that you run inside some project directory on\nyour machine:\n\n* [`caliban\n  shell`](https://caliban.readthedocs.io/en/latest/cli/caliban_shell.html)\n  generates a Docker image containing any dependencies you've declared in a\n  `requirements.txt` and/or `setup.py` in the directory and opens an interactive\n  shell in that directory. The `caliban shell` environment is ~identical to the\n  environment that will be available to your code when you submit it to AI\n  Platform; the difference is that your current directory is live-mounted into\n  the container, so you can develop interactively.\n\n* [`caliban\n  notebook`](https://caliban.readthedocs.io/en/latest/cli/caliban_notebook.html)\n  starts a Jupyter notebook or lab instance inside of a Docker image containing\n  your dependencies; the guarantee about an environment identical to AI Platform\n  applies here as well.\n\n* [`caliban run`](https://caliban.readthedocs.io/en/latest/cli/caliban_run.html)\n  packages your directory's code into the Docker image and executes it locally\n  using `docker run`. If you have a GPU, the instance will attach to it by\n  default - no need to install the CUDA toolkit. 
The Docker environment takes\n  care of all that. This environment is truly identical to the AI Platform\n  environment. The Docker image that runs locally is the same image that will\n  run in AI Platform.\n\n* [`caliban\n  cloud`](https://caliban.readthedocs.io/en/latest/cli/caliban_cloud.html)\n  allows you to [submit jobs to AI\n  Platform](https://caliban.readthedocs.io/en/latest/getting_started/cloud.html)\n  that will run inside the same Docker image you used with `caliban run`. You\n  can submit hundreds of jobs at once. Any machine type, GPU count, and GPU type\n  combination you specify will be validated client side, so you'll see an\n  immediate error with suggestions, rather than having to debug by submitting\n  jobs over and over.\n\n* [`caliban\n  build`](https://caliban.readthedocs.io/en/latest/cli/caliban_build.html) builds\n  the Docker image used in `caliban cloud` and `caliban run` without actually\n  running the container or submitting any code.\n\n* [`caliban\n  cluster`](https://caliban.readthedocs.io/en/latest/cli/caliban_cluster.html)\n  creates GKE clusters and submits jobs to GKE clusters.\n\n* [`caliban\n  status`](https://caliban.readthedocs.io/en/latest/cli/caliban_status.html)\n  displays information about all jobs submitted by Caliban, and makes it easy to\n  interact with large groups of experiments. Use `caliban status` when you need\n  to cancel pending jobs, or re-build a container and resubmit a batch of\n  experiments after fixing a bug.\n\n## Disclaimer\n\nThis is a research project, not an official Google product. Expect bugs and\nsharp edges. Please help by trying out Caliban, [reporting\nbugs](https://github.com/google/caliban/issues), and letting us know what you\nthink!\n\n## Get Involved + Get Support\n\nPull requests and bug reports are always welcome! 
Check out our [Contributor's\nGuide](CONTRIBUTING.md) for information on how to get started contributing to\nCaliban.\n\nThe TL;DR is:\n\n- send us a pull request,\n- iterate on the feedback + discussion, and\n- get a +1 from a [Committer](COMMITTERS.md)\n\nin order to get your PR accepted.\n\nIssues should be reported on the [GitHub issue\ntracker](https://github.com/google/caliban/issues).\n\nIf you want to discuss an idea for a new feature or ask us a question,\ndiscussion occurs primarily in the body of [GitHub\nIssues](https://github.com/google/caliban/issues), though the project is growing\nlarge enough that we may start a Gitter channel soon.\n\nThe current list of active committers (who can +1 a pull request) can be found\nhere: [COMMITTERS.md](COMMITTERS.md)\n\nA list of contributors to the project can be found at the project's\n[Contributors](https://github.com/google/caliban/graphs/contributors) page.\n\n## Citing Caliban\n\nIf Caliban helps you in your research, please consider citing Caliban's\nassociated academic paper:\n\n```\n@article{Ritchie2020,\n  doi = {10.21105/joss.02403},\n  url = {https://doi.org/10.21105/joss.02403},\n  year = {2020},\n  publisher = {The Open Journal},\n  volume = {5},\n  number = {53},\n  pages = {2403},\n  author = {Sam Ritchie and Ambrose Slone and Vinay Ramasesh},\n  title = {Caliban: Docker-based job manager for reproducible workflows},\n  journal = {Journal of Open Source Software}\n}\n```\n\n## License\n\nCopyright 2020 Google LLC.\n\nLicensed under the [Apache License, Version 2.0](http://www.apache.org/licenses/LICENSE-2.0).\n","funding_links":[],"categories":["Python","Other_Machine Learning and Deep Learning"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgoogle%2Fcaliban","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgoogle%2Fcaliban","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgoogle%2Fcaliban/lists"}