{"id":13418387,"url":"https://github.com/facebookresearch/nevergrad","last_synced_at":"2025-05-13T15:07:29.866Z","repository":{"id":37396568,"uuid":"158468845","full_name":"facebookresearch/nevergrad","owner":"facebookresearch","description":"A Python toolbox for performing gradient-free optimization","archived":false,"fork":false,"pushed_at":"2025-04-14T08:10:35.000Z","size":114727,"stargazers_count":4047,"open_issues_count":127,"forks_count":364,"subscribers_count":56,"default_branch":"main","last_synced_at":"2025-04-20T09:05:19.103Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://facebookresearch.github.io/nevergrad/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/facebookresearch.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":".github/CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-11-21T00:33:17.000Z","updated_at":"2025-04-18T09:51:27.000Z","dependencies_parsed_at":"2023-09-21T18:49:53.018Z","dependency_job_id":"8a059e13-8f1a-405f-aefb-cf985362ff63","html_url":"https://github.com/facebookresearch/nevergrad","commit_stats":{"total_commits":1111,"total_committers":61,"mean_commits":18.21311475409836,"dds":0.5562556255625563,"last_synced_commit":"bfc2c23330c99ec8947280449a09d7055a9e4e49"},"previous_names":[],"tags_count":66,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/facebookresearch%2Fnevergrad","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/facebookresearch%2Fnevergrad/tags","releases_url":"
https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/facebookresearch%2Fnevergrad/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/facebookresearch%2Fnevergrad/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/facebookresearch","download_url":"https://codeload.github.com/facebookresearch/nevergrad/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250229132,"owners_count":21396058,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-07-30T22:01:01.701Z","updated_at":"2025-04-22T11:23:06.752Z","avatar_url":"https://github.com/facebookresearch.png","language":"Python","readme":"[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat\u0026labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)\n\n# Nevergrad - A gradient-free optimization platform\n\n![Nevergrad](docs/resources/Nevergrad-LogoMark.png)\n\n\n`nevergrad` is a Python 3.8+ library. 
It can be installed with:\n\n```\npip install nevergrad\n```\n\nMore installation options, including Windows installation, and complete instructions are available in the \"Getting started\" section of the [**documentation**](https://facebookresearch.github.io/nevergrad/).\n\nYou can join the Nevergrad users Facebook group [here](https://www.facebook.com/groups/nevergradusers/).\n\nMinimizing a function using an optimizer (here `NGOpt`) is straightforward:\n\n```python\nimport nevergrad as ng\n\ndef square(x):\n    return sum((x - .5)**2)\n\noptimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)\nrecommendation = optimizer.minimize(square)\nprint(recommendation.value)  # recommended value\n\u003e\u003e\u003e [0.49971112 0.5002944]\n```\n\n`nevergrad` also supports bounded continuous variables, discrete variables, and mixtures of the two.\nTo do so, specify the input space:\n\n```python\nimport nevergrad as ng\n\ndef fake_training(learning_rate: float, batch_size: int, architecture: str) -\u003e float:\n    # optimal for learning_rate=0.2, batch_size=4, architecture=\"conv\"\n    return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == \"conv\" else 10)\n\n# the Instrumentation class is used for functions with multiple inputs\n# (positional and/or keyword arguments)\nparametrization = ng.p.Instrumentation(\n    # a log-distributed scalar between 0.001 and 1.0\n    learning_rate=ng.p.Log(lower=0.001, upper=1.0),\n    # an integer from 1 to 12\n    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),\n    # either \"conv\" or \"fc\"\n    architecture=ng.p.Choice([\"conv\", \"fc\"])\n)\n\noptimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)\nrecommendation = optimizer.minimize(fake_training)\n\n# show the recommended keyword arguments of the function\nprint(recommendation.kwargs)\n\u003e\u003e\u003e {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}\n```\n\nLearn more about parametrization 
in the [**documentation**](https://facebookresearch.github.io/nevergrad/)!\n\n![Example of optimization](docs/resources/TwoPointsDE.gif)\n\n*Convergence of a population of points to the minimum with two-points DE.*\n\n\n## Documentation\n\nCheck out our [**documentation**](https://facebookresearch.github.io/nevergrad/)! It's still a work in progress, so don't hesitate to submit issues and/or pull requests (PRs) to update it and make it clearer!\nThe latest versions of our [**data**](https://drive.google.com/file/d/1p8d1bMCDlvWrDIMXP7fT9pJa1cgjH3NM/view?usp=sharing) and of our [**PDF report**](https://tinyurl.com/dagstuhloid) are also available.\n\n## Citing\n\n```bibtex\n@misc{nevergrad,\n    author = {J. Rapin and O. Teytaud},\n    title = {{Nevergrad - A gradient-free optimization platform}},\n    year = {2018},\n    publisher = {GitHub},\n    journal = {GitHub repository},\n    howpublished = {\\url{https://GitHub.com/FacebookResearch/Nevergrad}},\n}\n```\n\n## License\n\n`nevergrad` is released under the MIT license. See [LICENSE](LICENSE) for additional details.\nSee also our [Terms of Use](https://opensource.facebook.com/legal/terms) and [Privacy Policy](https://opensource.facebook.com/legal/privacy).\n","funding_links":[],"categories":["Optimization","Python","Neural Networks (NN) and Deep Neural Networks (DNN)","Parameter Optimization","Machine Learning Framework","Optimizations and fine-tuning","Hyperparameter Optimization and AutoML","AutoML","Resources","Computation and Communication Optimisation","Programming (learning)"],"sub_categories":["NN/DNN Techniques Misc","Hyperparameter Search \u0026 Gradient-Free Optimization","Developer's Tools"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffacebookresearch%2Fnevergrad","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ffacebookresearch%2Fnevergrad","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffacebookresearch%2Fnevergrad/lists"}