{"id":13570051,"url":"https://github.com/dragonfly/dragonfly","last_synced_at":"2025-04-04T06:31:43.232Z","repository":{"id":39526211,"uuid":"130418835","full_name":"dragonfly/dragonfly","owner":"dragonfly","description":"An open source python library for scalable Bayesian optimisation.","archived":false,"fork":false,"pushed_at":"2023-06-19T20:23:17.000Z","size":1087,"stargazers_count":871,"open_issues_count":43,"forks_count":235,"subscribers_count":29,"default_branch":"master","last_synced_at":"2025-03-06T18:03:13.532Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/dragonfly.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":"AUTHORS.txt"}},"created_at":"2018-04-20T22:19:50.000Z","updated_at":"2025-03-03T07:44:32.000Z","dependencies_parsed_at":"2024-01-07T03:51:29.370Z","dependency_job_id":"4777d10f-9293-454f-a9ab-3b71408dac93","html_url":"https://github.com/dragonfly/dragonfly","commit_stats":{"total_commits":350,"total_committers":13,"mean_commits":"26.923076923076923","dds":0.5428571428571429,"last_synced_commit":"3eef7d30bcc2e56f2221a624bd8ec7f933f81e40"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dragonfly%2Fdragonfly","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dragonfly%2Fdragonfly/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dragonfly%2Fdragonfly/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/
dragonfly%2Fdragonfly/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/dragonfly","download_url":"https://codeload.github.com/dragonfly/dragonfly/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247134458,"owners_count":20889399,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-01T14:00:47.685Z","updated_at":"2025-04-04T06:31:38.224Z","avatar_url":"https://github.com/dragonfly.png","language":"Python","readme":"\n\u003cimg src=\"https://dragonfly.github.io/images/dragonfly_bigwords.png\"/\u003e\n\n---\n\n\nDragonfly is an open source python library for scalable Bayesian optimisation.\n\nBayesian optimisation is used for optimising black-box functions whose evaluations are\nusually expensive. 
Beyond vanilla optimisation techniques, Dragonfly provides an array of tools to\nscale up Bayesian optimisation to expensive large scale problems.\nThese include features/functionality that are especially suited for\nhigh dimensional optimisation (optimising for a large number of variables),\nparallel evaluations in synchronous or asynchronous settings (conducting multiple\nevaluations in parallel), multi-fidelity optimisation (using cheap approximations\nto speed up the optimisation process), and multi-objective optimisation (optimising\nmultiple functions simultaneously).\n\nDragonfly is compatible with Python2 (\u003e= 2.7) and Python3 (\u003e= 3.5) and has been tested\non Linux, macOS, and Windows platforms.\nFor documentation, installation, and a getting started guide, see our\n[readthedocs page](https://dragonfly-opt.readthedocs.io). For more details, see\nour [paper](https://arxiv.org/abs/1903.06694).\n\n\u0026nbsp;\n\n## Installation\n\nSee \n[here](https://dragonfly-opt.readthedocs.io/en/master/install/)\nfor detailed instructions on installing Dragonfly and its dependencies.\n\n**Quick Installation:**\nIf you have done this kind of thing before, you should be able to install\nDragonfly via `pip`.\n\n```bash\n$ sudo apt-get install python-dev python3-dev gfortran # On Ubuntu/Debian\n$ pip install numpy\n$ pip install dragonfly-opt -v\n```\n\n\n**Testing the Installation**:\nYou can import Dragonfly in python to test if it was installed properly.\nIf you have installed via source, make sure that you move to a different directory \n to avoid naming conflicts.\n```bash\n$ python\n\u003e\u003e\u003e from dragonfly import minimise_function\n\u003e\u003e\u003e # The first argument below is the function, the second is the domain, and the third is the budget.\n\u003e\u003e\u003e min_val, min_pt, history = minimise_function(lambda x: x ** 4 - x**2 + 0.1 * x, [[-10, 10]], 10);  \n...\n\u003e\u003e\u003e min_val, min_pt\n(-0.32122746026750953, 
array([-0.7129672]))\n```\nDue to stochasticity in the algorithms, the above values for `min_val`, `min_pt` may be\ndifferent. If you run it for longer (e.g.\n`min_val, min_pt, history = minimise_function(lambda x: x ** 4 - x**2 + 0.1 * x, [[-10, 10]], 100)`),\nyou should get more consistent values for the minimum. \n\n\nIf the installation fails or if there are warning messages, see detailed instructions\n[here](https://dragonfly-opt.readthedocs.io/en/master/install/).\n\n\n\u0026nbsp;\n\n## Quick Start\n\nDragonfly can be\nused directly in the command line by calling\n[`dragonfly-script.py`](bin/dragonfly-script.py)\nor be imported in python code via the `maximise_function` function in the main library\nor in \u003cem\u003eask-tell\u003c/em\u003e mode.\nTo help get started, we have provided some examples in the\n[`examples`](examples) directory.\nSee our readthedocs getting started pages\n([command line](https://dragonfly-opt.readthedocs.io/en/master/getting_started_cli/),\n[Python](https://dragonfly-opt.readthedocs.io/en/master/getting_started_py/),\n[Ask-Tell](https://dragonfly-opt.readthedocs.io/en/master/getting_started_ask_tell/))\nfor examples and use cases.\n\n**Command line**:\nBelow is an example usage in the command line.\n```bash\n$ cd examples\n$ dragonfly-script.py --config synthetic/branin/config.json --options options_files/options_example.txt\n```\n\n**In Python code**:\nThe main APIs for Dragonfly are defined in\n[`dragonfly/apis`](dragonfly/apis).\nFor their definitions and arguments, see\n[`dragonfly/apis/opt.py`](dragonfly/apis/opt.py) and\n[`dragonfly/apis/moo.py`](dragonfly/apis/moo.py).\nYou can import the main API in python code via,\n```python\nfrom dragonfly import minimise_function, maximise_function\nfunc = lambda x: x ** 4 - x**2 + 0.1 * x\ndomain = [[-10, 10]]\nmax_capital = 100\nmin_val, min_pt, history = minimise_function(func, domain, max_capital)\nprint(min_val, min_pt)\nmax_val, max_pt, history = maximise_function(lambda x: 
-func(x), domain, max_capital)\nprint(max_val, max_pt)\n```\nHere, `func` is the function to be optimised,\n`domain` is the domain over which `func` is to be optimised,\nand `max_capital` is the capital available for optimisation.\nThe domain can be specified via a JSON file or in code.\nSee\n[here](examples/synthetic/branin/in_code_demo.py),\n[here](examples/synthetic/hartmann6_4/in_code_demo.py),\n[here](examples/synthetic/discrete_euc/in_code_demo_1.py),\n[here](examples/synthetic/discrete_euc/in_code_demo_2.py),\n[here](examples/synthetic/hartmann3_constrained/in_code_demo.py),\n[here](examples/synthetic/park1_constrained/in_code_demo.py),\n[here](examples/synthetic/borehole_constrained/in_code_demo.py),\n[here](examples/synthetic/multiobjective_branin_currinexp/in_code_demo.py),\n[here](examples/synthetic/multiobjective_hartmann/in_code_demo.py),\n[here](examples/tree_reg/in_code_demo.py),\nand\n[here](examples/nas/demo_nas.py)\nfor more detailed examples.\n\n**In Ask-Tell Mode**:\nAsk-tell mode gives you more control over your experiments: you can supply past results\nto our API in order to obtain a recommendation.\nSee the [following example](examples/detailed_use_cases/in_code_demo_ask_tell.py) for more details.\n\n\nFor a comprehensive list of use cases, including multi-objective optimisation,\nmulti-fidelity optimisation, neural architecture search, and other optimisation\nmethods (besides Bayesian optimisation), see our readthedocs pages\n([command line](https://dragonfly-opt.readthedocs.io/en/master/getting_started_cli/),\n[Python](https://dragonfly-opt.readthedocs.io/en/master/getting_started_py/),\n[Ask-Tell](https://dragonfly-opt.readthedocs.io/en/master/getting_started_ask_tell/)).\n\n\n\u0026nbsp;\n\n### Contributors\n\nKirthevasan Kandasamy: [github](https://github.com/kirthevasank),\n[webpage](http://www.cs.cmu.edu/~kkandasa/)  \nKarun Raju Vysyaraju: 
[github](https://github.com/karunraju),\n[linkedin](https://www.linkedin.com/in/karunrajuvysyaraju)  \nAnthony Yu: [github](https://github.com/anthonyhsyu),\n[linkedin](https://www.linkedin.com/in/anthony-yu-5239a877/)  \nWillie Neiswanger: [github](https://github.com/willieneis),\n[webpage](http://www.cs.cmu.edu/~wdn/)  \nBiswajit Paria: [github](https://github.com/biswajitsc),\n[webpage](https://biswajitsc.github.io/)  \nChris Collins: [github](https://github.com/crcollins/),\n[webpage](https://www.crcollins.com/)  \n\n\n### Acknowledgements\nResearch and development of the methods in this package were funded by\nDOE grant DESC0011114, NSF grant IIS1563887, the DARPA D3M program, and AFRL.\n\n\n### Citation\nIf you use any part of this code in your work, please cite our\n[JMLR paper](http://jmlr.org/papers/v21/18-223.html).\n\n```\n@article{JMLR:v21:18-223,\n  author  = {Kirthevasan Kandasamy and Karun Raju Vysyaraju and Willie Neiswanger and Biswajit Paria and Christopher R. Collins and Jeff Schneider and Barnabas Poczos and Eric P. Xing},\n  title   = {Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly},\n  journal = {Journal of Machine Learning Research},\n  year    = {2020},\n  volume  = {21},\n  number  = {81},\n  pages   = {1-27},\n  url     = {http://jmlr.org/papers/v21/18-223.html}\n}\n```\n\n### License\nThis software is released under the MIT license. For more details, please refer to\n[LICENSE.txt](https://github.com/dragonfly/dragonfly/blob/master/LICENSE.txt).\n\nFor questions, please email kandasamy@cs.cmu.edu.\n\n\"Copyright 2018-2019 Kirthevasan Kandasamy\"\n\n\n","funding_links":[],"categories":["Python","Software","AutoML","Parameter Optimisation","Profiling","Machine Learning Framework","Scheduling","Tools and projects","Hyperparameter Optimisation and AutoML","Libraries","Uncategorized","4.) 
Hyperparameter Optimization"],"sub_categories":["Optimization","Profiling","Hyperparameter Search \u0026 Gradient-Free Optimization","LLM","Uncategorized","**[Papers]**"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdragonfly%2Fdragonfly","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdragonfly%2Fdragonfly","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdragonfly%2Fdragonfly/lists"}