# OptNet: Differentiable Optimization as a Layer in Neural Networks

This repository is by [Brandon Amos](http://bamos.github.io)
and [J. Zico Kolter](http://zicokolter.com)
and contains the [PyTorch](https://pytorch.org) source code to
reproduce the experiments in our ICML 2017 paper
[OptNet: Differentiable Optimization as a Layer in Neural Networks](https://arxiv.org/abs/1703.00443).

If you find this repository helpful in your publications,
please consider citing our paper.

```
@InProceedings{amos2017optnet,
  title = {{O}pt{N}et: Differentiable Optimization as a Layer in Neural Networks},
  author = {Brandon Amos and J. Zico Kolter},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages = {136--145},
  year = {2017},
  volume = {70},
  series = {Proceedings of Machine Learning Research},
  publisher = {PMLR},
}
```

# Informal Introduction

[Mathematical optimization](https://en.wikipedia.org/wiki/Mathematical_optimization)
is a well-studied language for expressing solutions to many real-life problems
that arise in machine learning and in many other fields, such as mechanics,
economics, electrical engineering, operations research, control engineering,
geophysics, and molecular modeling.
As we build machine learning systems that interact with real
data from these fields, we often **cannot** (though sometimes can)
simply "learn away" the optimization sub-problems by adding more
layers to our network. Well-defined optimization problems can be added by hand
if you have a thorough understanding of your feature space, but
oftentimes we **don't** have this understanding and instead resort to
automatic feature learning for our tasks.

Before this repository, **no** modern deep learning library provided
a way of adding a learnable optimization layer to a model formulation
(other than unrolling an optimization procedure, which is inefficient
and inexact) so that we could quickly try it and see whether it is a
good way of expressing our data.

See our paper
[OptNet: Differentiable Optimization as a Layer in Neural Networks](https://arxiv.org/abs/1703.00443)
and the code at
[locuslab/optnet](https://github.com/locuslab/optnet)
if you are interested in learning more about our initial exploration
of automatically learning quadratic program layers
for signal denoising and Sudoku.

## Setup and Dependencies

+ Python/numpy/[PyTorch](https://pytorch.org)
+ [qpth](https://github.com/locuslab/qpth):
  *Our fast QP solver for PyTorch, released in conjunction with this paper.*
+ [bamos/block](https://github.com/bamos/block):
  *Our intelligent block matrix library for numpy, PyTorch, and beyond.*
+ Optional: [bamos/setGPU](https://github.com/bamos/setGPU):
  A small library to set `CUDA_VISIBLE_DEVICES` on multi-GPU systems.

# Denoising Experiments

```
denoising
├── create.py - Script to create the denoising dataset.
├── plot.py - Plot the results from any experiment.
├── main.py - Run the FC baseline and OptNet denoising experiments. (See arguments.)
├── main.tv.py - Run the TV baseline denoising experiment.
└── run-exps.sh - Run all experiments. (May need to uncomment some lines.)
```

# Sudoku Experiments

+ The dataset we used in our experiments is available in `sudoku/data`.

```
sudoku
├── create.py - Script to create the dataset.
├── plot.py - Plot the results from any experiment.
├── main.py - Run the FC baseline and OptNet Sudoku experiments. (See arguments.)
└── models.py - Models used for Sudoku.
```

# Classification Experiments

```
cls
├── train.py - Run the FC baseline and OptNet classification experiments. (See arguments.)
├── plot.py - Plot the results from any experiment.
└── models.py - Models used for classification.
```

# Acknowledgments

The rapid development of this work would not have been possible without
the immense amount of help from the [PyTorch](https://pytorch.org) team,
particularly [Soumith Chintala](http://soumith.ch/) and
[Adam Paszke](https://github.com/apaszke).

# Licensing

Unless otherwise stated, the source code is copyright
Carnegie Mellon University and licensed under the
[Apache 2.0 License](./LICENSE).
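
# A Minimal Sketch of the QP Layer Idea

The actual OptNet layer (implemented in [qpth](https://github.com/locuslab/qpth)) handles inequality constraints, GPU batching, and learnable problem data. As a rough, self-contained illustration of the core idea from the paper, here is a minimal numpy sketch for the *equality-constrained* special case: the QP solution is characterized by a linear KKT system, so gradients with respect to the problem data can be obtained by re-solving that same (symmetric) system against the incoming gradient instead of unrolling a solver. Function names here are illustrative and are **not** part of qpth's API.

```python
import numpy as np

def qp_layer_forward(Q, p, A, b):
    """Solve min_z 1/2 z^T Q z + p^T z  s.t.  A z = b via its KKT system.

    Returns the primal solution z, the dual variables nu, and the KKT
    matrix K (saved for the backward pass).
    """
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T],
                  [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-p, b]))
    return sol[:n], sol[n:], K

def qp_layer_backward(K, n, grad_z):
    """Given dL/dz at the QP solution, return dL/dp and dL/db.

    Because z*(p, b) satisfies K [z; nu] = [-p; b], the Jacobians are
    blocks of K^{-1}; since K is symmetric, one solve against
    [grad_z; 0] yields both gradients.
    """
    d = np.linalg.solve(K, np.concatenate([grad_z,
                                           np.zeros(K.shape[0] - n)]))
    dz, dnu = d[:n], d[n:]
    return -dz, dnu  # dL/dp, dL/db
```

For example, projecting onto the simplex-like set {z : z1 + z2 = 1} with Q = I and p = 0 gives z = (0.5, 0.5), and the backward pass returns the analytic sensitivities of z with respect to p and b. Extending this to inequality constraints (as OptNet does) adds complementarity terms to the differentiated KKT system but follows the same pattern.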