{"id":24098451,"url":"https://github.com/matlab-deep-learning/constrained-deep-learning","last_synced_at":"2025-05-07T19:24:55.916Z","repository":{"id":229692364,"uuid":"777369010","full_name":"matlab-deep-learning/constrained-deep-learning","owner":"matlab-deep-learning","description":"Constrained deep learning is an advanced approach to training deep neural networks by incorporating domain-specific constraints into the learning process.","archived":false,"fork":false,"pushed_at":"2025-04-28T11:39:02.000Z","size":15612,"stargazers_count":51,"open_issues_count":1,"forks_count":3,"subscribers_count":7,"default_branch":"main","last_synced_at":"2025-04-28T12:45:45.291Z","etag":null,"topics":["ai-verification","convex","convex-neural-network","deep-learning","deep-learning-algorithms","lipschitz","lipschitz-network","monotonic","monotonicity","neural-networks"],"latest_commit_sha":null,"homepage":"","language":"MATLAB","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/matlab-deep-learning.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-03-25T18:11:35.000Z","updated_at":"2025-04-28T11:39:05.000Z","dependencies_parsed_at":null,"dependency_job_id":"5c2beee8-74f1-4b97-85cc-8e0ed93a3fc7","html_url":"https://github.com/matlab-deep-learning/constrained-deep-learning","commit_stats":null,"previous_names":["matlab-deep-learning/constrained-deep-learning"],"tags_count":4,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/matlab-deep-learning%2Fco
nstrained-deep-learning","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/matlab-deep-learning%2Fconstrained-deep-learning/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/matlab-deep-learning%2Fconstrained-deep-learning/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/matlab-deep-learning%2Fconstrained-deep-learning/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/matlab-deep-learning","download_url":"https://codeload.github.com/matlab-deep-learning/constrained-deep-learning/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":252942315,"owners_count":21829047,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai-verification","convex","convex-neural-network","deep-learning","deep-learning-algorithms","lipschitz","lipschitz-network","monotonic","monotonicity","neural-networks"],"created_at":"2025-01-10T14:45:55.848Z","updated_at":"2025-05-07T19:24:55.904Z","avatar_url":"https://github.com/matlab-deep-learning.png","language":"MATLAB","readme":"# AI Verification: Constrained Deep Learning\n\n[![Open in MATLAB\nOnline](https://www.mathworks.com/images/responsive/global/open-in-matlab-online.svg)](https://matlab.mathworks.com/open/github/v1?repo=matlab-deep-learning/constrained-deep-learning)\n\nConstrained deep learning is an advanced approach to training deep neural\nnetworks by incorporating domain-specific constraints into the learning process.\nBy integrating these constraints into the construction 
and training of neural\nnetworks, you can guarantee desirable behaviour in safety-critical scenarios\nwhere such guarantees are paramount.\n\nThis project aims to develop and evaluate deep learning models that adhere to\npredefined constraints, which could be in the form of physical laws, logical\nrules, or any other domain-specific knowledge. In the context of AI\nverification, constrained deep learning provides guarantees that certain\ndesirable properties are present in the trained neural network by design. These\ndesirable properties could include monotonicity, boundedness, and robustness\namongst others.\n\n\u003cfigure\u003e\n\u003cp align=\"center\"\u003e\n    \u003cimg src=\"./documentation/figures/constrained_learning.svg\"\u003e\n\u003c/p\u003e\n\u003c/figure\u003e\n\nBy bringing together the concepts of monotonicity, convexity, and Lipschitz\ncontinuity, this repository serves as a comprehensive resource for embedding\nessential constraints into deep learning models, addressing the complex needs of\nsafety-critical systems and fostering the convergence of theoretical principles\nwith practical AI verification applications.\n\nYou can learn more about monotonicity and Lipschitz continuity in the context of\naerospace applications in the \"Formal Methods Use for Learning Assurance\" report\nfrom EASA and Collins Aerospace [1].\n\n## Get Started\n\nDownload or clone this repository to your machine and open it in MATLAB\u0026reg;.\nAdd the conslearn directory and subfolders to the search path. 
Go to the\nlocation of the repository and run the command: `addpath(genpath(\"conslearn\"))`.\n\n### Requirements\n\n- [MATLAB](http://www.mathworks.com) R2024a or later\n- [Deep Learning\n  Toolbox\u0026trade;](https://www.mathworks.com/products/deep-learning.html)\n- [Parallel Computing\n  Toolbox\u0026trade;](https://uk.mathworks.com/products/parallel-computing.html)\n  (recommended)\n- [Optimization\n  Toolbox\u0026trade;](https://www.mathworks.com/products/optimization.html)\n- [Reinforcement Learning\n  Toolbox\u0026trade;](https://www.mathworks.com/products/reinforcement-learning.html)\n- [Image Processing\n  Toolbox\u0026trade;](https://www.mathworks.com/products/image-processing.html)\n- [Deep Learning Toolbox Verification\n  Library](https://uk.mathworks.com/products/deep-learning-verification-library.html)\n\n## Examples\n\nThe repository contains several introductory, interactive examples as well as\nlonger, real-world use case applications of constrained deep learning in the\ncontext of AI verification. 
In the same directory as the markdown files, you can\nfind the Live Script (MLX) file that you can open in MATLAB and run\ninteractively to work through the example.\n\n### Introductory Examples (Short)\n\nBelow are links for markdown versions of MATLAB Live Scripts that you can view\nin GitHub\u0026reg;.\n\n- [Fully input convex neural networks in\n  1-dimension](examples/convex/introductory/PoC_Ex1_1DFICNN.md)\n- [Fully input convex neural networks in\n  n-dimensions](examples/convex/introductory/PoC_Ex2_nDFICNN.md)\n- [Partially input convex neural networks in\n  n-dimensions](examples/convex/introductory/PoC_Ex3_nDPICNN.md)\n- [Fully input monotonic neural networks in\n  1-dimension](examples/monotonic/introductory/PoC_Ex1_1DFMNN.md)\n- [Fully input monotonic neural networks in\n  n-dimensions](examples/monotonic/introductory/PoC_Ex2_nDFMNN.md)\n- [Lipschitz continuous neural networks in\n  1-dimension](examples/lipschitz/introductory/PoC_Ex1_1DLNN.md)\n\nThese examples make use of [custom training\nloops](https://uk.mathworks.com/help/deeplearning/deep-learning-custom-training-loops.html)\nand the\n[`arrayDatastore`](https://uk.mathworks.com/help/matlab/ref/matlab.io.datastore.arraydatastore.html)\nobject. To learn more, click the links.\n\n### Workflow Examples (Long)\n\n- [Dynamical System Modeling Using Convex Neural\nODE](examples/convex/neuralODE/TrainConvexNeuralODENetworkWithEulerODESolverExample.md)\nThis example works through the modeling of a dynamical system using a neural\nODE, where the underlying dynamics are captured by a fully input convex neural\nnetwork and the ODE solver uses a convex update method, for example, the Euler\nmethod. 
The example shows how the network is expressive enough to capture\nnonlinear dynamics and also provides boundedness guarantees on the solution\ntrajectories owing to the convex constraint of the underlying network and ODE\nsolver.\n\n- [Train Fully Convex Neural Networks for CIFAR-10 Image\nClassification](examples/convex/classificationCIFAR10/TrainICNNOnCIFAR10Example.md)\nThis example shows the expressive capabilities of fully convex networks by\nobtaining high training accuracy on image classification on the natural image\ndataset, CIFAR-10.\n\n- [Remaining Useful Life Estimation Using Monotonic Neural\nNetworks](examples/monotonic/RULEstimateUsingMonotonicNetworks/RULEstimationUsingMonotonicNetworksExample.md)\nThis example shows how to guarantee monotonically decreasing predictions on a\nremaining useful life (RUL) task by combining partially and fully monotonic\nnetworks. This example looks at predicting the RUL for turbofan engine\ndegradation.\n\n- [Battery State of Charge Estimation Using Monotonic Neural Networks](examples/monotonic/BSOCEstimateUsingMonotonicNetworks/BatteryStateOfChargeEstimationUsingMonotonicNeuralNetworks.md)\nThis example shows how to train two monotonic neural networks to estimate the state of charge (SOC) of a battery, one to model the charging behavior, and one to model the discharging behavior. In this example, you train the networks to predict the rate of change of the state of charge and force the output to be positive or negative for the charging and discharging networks, respectively. 
This way, you enforce monotonicity of the battery state of charge by constraining its derivative to be positive or negative.\n\n- [Train Image Classification Lipschitz Constrained Networks and Measure\nRobustness to Adversarial\nExamples](examples/lipschitz/classificationDigits/LipschitzClassificationNetworksRobustToAdversarialExamples.md)\nThis example shows how Lipschitz continuous constrained networks improve the\nrobustness of neural networks against adversarial attacks. In this example, you\nuse formal verification methods to compute the number of robust images in the\ntest set against adversarial perturbation for several networks with decreasing\nupper-bound Lipschitz constants. You find that a smaller Lipschitz constant gives a\nmore robust classification network.\n\n## Functions\n\nThis repository introduces the following functions that are used throughout the examples:\n- [`buildConstrainedNetwork`](conslearn/buildConstrainedNetwork.m) - Build a multi-layer perceptron (MLP) with specific constraints on the architecture and initialization of the weights.\n- [`buildConvexCNN`](conslearn/buildConvexCNN.m) - Build a convolutional neural network (CNN) with convex constraints on the architecture and initialization of the weights.\n- [`trainConstrainedNetwork`](conslearn/trainConstrainedNetwork.m) - Train a constrained network and maintain the constraint during training.\n- [`lipschitzUpperBound`](conslearn/lipschitzUpperBound.m) - Compute an upper bound on the Lipschitz constant for a Lipschitz neural network.\n- [`convexNetworkOutputBounds`](conslearn/convexNetworkOutputBounds.m) - Compute guaranteed upper and lower bounds on hypercubic grids for convex networks.\n\n## Tests\n\nThis repository also contains tests for the software in the conslearn package.\n\nAs discussed in [1] (see 3.4.1.5), in certain situations, small violations in\nthe constraints may be admissible. 
For example, a small violation in\nmonotonicity may be admissible if the non-monotonic behaviour is kept below a\npre-defined threshold. In the system tests, you will see examples of tests that\nincorporate an admissibility constant. This can account, for instance, for\nviolations owing to floating-point error.\n\n## Technical Articles\n\nThis repository focuses on the development and evaluation of deep learning\nmodels that adhere to constraints crucial for safety-critical applications, such\nas predictive maintenance for industrial machinery and equipment. Specifically,\nit focuses on enforcing monotonicity, convexity, and Lipschitz continuity within\nneural networks to ensure predictable and controlled behavior. By emphasizing\nconstraints like monotonicity, constrained neural networks ensure that\npredictions of the Remaining Useful Life (RUL) of components behave intuitively:\nas a machine's condition deteriorates, the estimated RUL should monotonically\ndecrease. This is crucial in applications like aerospace or manufacturing, where\nan accurate and reliable estimation of RUL can prevent failures and save costs.\nAlongside monotonicity, Lipschitz continuity is also enforced to guarantee model\nrobustness and controlled behavior. This is essential in environments where\nsafety and precision are paramount, such as control systems in autonomous\nvehicles or precision equipment in healthcare. Convexity is especially\nbeneficial for control systems as it inherently provides boundedness properties.\nFor instance, by ensuring that the output of a neural network lies within a\nconvex hull, it is possible to guarantee that the control commands remain within\na safe and predefined operational space, preventing erratic or unsafe system\nbehaviors. 
This boundedness property, derived from the convex nature of the\nmodel's output space, is critical for maintaining the integrity and safety of\ncontrol systems under various conditions.\n\nThese technical articles explain key concepts of AI verification in the context\nof constrained deep learning. They include discussions on how to achieve the\nspecified constraints in neural networks at construction and training time, as\nwell as deriving and proving useful properties of constrained networks in AI\nverification applications. It is not necessary to go through these articles in\norder to explore this repository; however, you can find references and more\nin-depth discussion here.\n\n- [AI Verification: Monotonicity](documentation/AI-Verification-Monotonicity.md) -\n  Discussion on fully and partially monotonic neural networks and provable\n  trends. This article introduces monotonic network architectures and\n  restrictions on weights to guarantee monotonic behaviour.\n- [AI Verification: Convexity](documentation/AI-Verification-Convexity.md) -\n  Discussion on fully and partially convex neural networks and provable\n  guarantees of boundedness over hypercubic grids. 
This article shows\n  how to prove boundedness properties of a convex neural network on\n  hypercubic grids by analyzing the network and its derivative at the vertices.\n- [AI Verification: Lipschitz\n  Continuity](documentation/AI-Verification-Lipschitz.md) - Discussion on\n  Lipschitz continuous neural networks and provable guarantees of robustness.\n  This article introduces Lipschitz continuity and how to compute an upper bound\n  on the Lipschitz constant for a set of network architectures.\n\n## References\n\n- [1] EASA and Collins Aerospace, Formal Methods use for Learning Assurance\n  (ForMuLA), April 2023,\n  \u003chttps://www.easa.europa.eu/en/newsroom-and-events/news/easa-and-collins-aerospace-release-joint-innovation-partnership-contract\u003e,\n  \u003chttps://www.easa.europa.eu/en/downloads/137878/en\u003e\n- [2] Amos, Brandon, et al. Input Convex Neural Networks. arXiv:1609.07152,\n  arXiv, 14 June 2017. arXiv.org, \u003chttps://doi.org/10.48550/arXiv.1609.07152\u003e.\n- [3] Gouk, Henry, et al. “Regularisation of Neural Networks by Enforcing\n  Lipschitz Continuity.” Machine Learning, vol. 110, no. 2, Feb. 2021, pp.\n  393–416. DOI.org (Crossref), \u003chttps://doi.org/10.1007/s10994-020-05929-w\u003e\n- [4] Kitouni, Ouail, et al. Expressive Monotonic Neural Networks.\n  arXiv:2307.07512, arXiv, 14 July 2023. arXiv.org,\n  \u003chttp://arxiv.org/abs/2307.07512\u003e.\n\nCopyright © 2024, The MathWorks, Inc.\n","funding_links":[],"categories":["Artificial Intelligence and Machine Learning"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmatlab-deep-learning%2Fconstrained-deep-learning","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmatlab-deep-learning%2Fconstrained-deep-learning","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmatlab-deep-learning%2Fconstrained-deep-learning/lists"}