{"id":24388860,"url":"https://github.com/igorjakus/nomad","last_synced_at":"2025-07-16T17:33:18.237Z","repository":{"id":268366006,"uuid":"903357639","full_name":"igorjakus/nomad","owner":"igorjakus","description":"Numerical Optimization, Mathematics and Automatic Differentiation library in OCaml!🐫","archived":false,"fork":false,"pushed_at":"2025-01-29T23:14:27.000Z","size":672,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-07-06T14:42:38.439Z","etag":null,"topics":["automatic-differentiation","gradient-descent","mathematical-expressions","mathematics","numerical-methods","numerical-optimization","ocaml"],"latest_commit_sha":null,"homepage":"","language":"OCaml","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/igorjakus.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-12-14T11:58:01.000Z","updated_at":"2025-01-29T23:14:31.000Z","dependencies_parsed_at":"2025-07-06T14:37:42.717Z","dependency_job_id":null,"html_url":"https://github.com/igorjakus/nomad","commit_stats":null,"previous_names":["igorjakus/nnomad","igorjakus/nomad"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/igorjakus/nomad","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/igorjakus%2Fnomad","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/igorjakus%2Fnomad/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/igorjakus%2Fnomad/releases","manifests_url":"https://repos
.ecosyste.ms/api/v1/hosts/GitHub/repositories/igorjakus%2Fnomad/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/igorjakus","download_url":"https://codeload.github.com/igorjakus/nomad/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/igorjakus%2Fnomad/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":265527565,"owners_count":23782480,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["automatic-differentiation","gradient-descent","mathematical-expressions","mathematics","numerical-methods","numerical-optimization","ocaml"],"created_at":"2025-01-19T14:57:55.113Z","updated_at":"2025-07-16T17:33:18.222Z","avatar_url":"https://github.com/igorjakus.png","language":"OCaml","readme":"# Nomad\n\n![License](https://img.shields.io/badge/license-MIT-blue.svg)\n![OCaml](https://img.shields.io/badge/OCaml-4.14.2%2B-blue.svg)\n![Dune](https://img.shields.io/badge/Dune-3.17.0-blue.svg)\n\n**N**umerical **O**ptimization, **M**athematics and **A**utomatic **D**ifferentiation library in OCaml!🐫\n\nNomad is a comprehensive OCaml library designed for symbolic expression manipulation and optimization. It provides functionalities for evaluating expressions, computing derivatives, and performing various optimization algorithms such as Gradient Descent, Newton-Raphson, and the Bisection method. 
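As a self-contained sketch of the iteration behind the Newton-Raphson solver (plain OCaml floats here; Nomad itself works on symbolic expressions, and this is not its API):

```ocaml
(* Standalone Newton-Raphson sketch: repeat x <- x -. f x /. f' x
   until |f x| < tol or max_iter is reached. *)
let newton_raphson ~f ~f' ~x0 ~tol ~max_iter =
  let rec go x i =
    let fx = f x in
    if Float.abs fx < tol || i >= max_iter then x
    else go (x -. fx /. f' x) (i + 1)
  in
  go x0 0

(* The positive root of x^2 - 2 is sqrt 2 (about 1.414214). *)
let () =
  let root =
    newton_raphson
      ~f:(fun x -> (x *. x) -. 2.0)
      ~f':(fun x -> 2.0 *. x)
      ~x0:1.0 ~tol:1e-10 ~max_iter:100
  in
  Printf.printf "root = %.6f\n" root
```

Nomad's derivative computation removes the need to hand-code `f'` as in this sketch.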
This project serves as a practical assignment for the Functional Programming course at the University of Wrocław.\n\n## Table of Contents\n\n- [Features](#features)\n- [Requirements](#requirements)\n- [Installation](#installation)\n- [Usage](#usage)\n- [Testing](#testing)\n- [Contributing](#contributing)\n- [License](#license)\n- [Acknowledgments](#acknowledgments)\n\n## Features\n\n- **Symbolic Expression Handling**\n  - Define and manipulate mathematical expressions symbolically.\n  - Simplify expressions using algebraic rules.\n  - Convert expressions to their string representations.\n\n- **Evaluation and Environment Management**\n  - Evaluate expressions given a set of variable assignments.\n  - Update and manage environments for variable values.\n\n- **Derivatives and Gradients**\n  - Compute first and higher-order derivatives of expressions.\n  - Calculate gradients for multivariable functions.\n\n- **Optimization Algorithms**\n  - **Gradient Descent**: Optimize expressions by iteratively moving towards the minimum.\n  - **Newton-Raphson**: Solve equations using the Newton-Raphson method.\n  - **Bisection Method**: Find roots of continuous functions.\n\n- **Comprehensive Testing**\n  - Robust test suite to ensure correctness of all functionalities.\n\n## Requirements\n\n- **OCaml**: Version 4.14.2 or newer\n- **Dune**: Version 3.17.0 or newer\n\n## Installation\n\n1. **Install Dune**\n```bash\nopam install dune.3.17.0\n```\n\n2. **Clone the Repository**\n```bash\ngit clone https://github.com/igorjakus/nomad.git\ncd nomad\n```\n\n3. 
**Build the project**\n```bash\ndune build\n```\n\n## Usage\n\n### Evaluating Expressions\n\n```ocaml\nopen Nomad.Expr\nopen Nomad.Eval\n\nlet x, y = Var \"x\", Var \"y\"\n\nlet env = [(\"x\", 2.0); (\"y\", 3.0)]\nlet expression = x +: y\nlet result = eval env expression\n(* result = 5.0 *)\n```\n\n### Simplifying Expressions\n\n```ocaml\nopen Nomad.Expr\nopen Nomad.Simplify\n\nlet x = Var \"x\"\nlet expr = x +: Float 0.\nlet simplified = simplify expr\n(* simplified = x *)\n```\n\n### Computing Derivatives\n\n```ocaml\nopen Nomad.Expr\nopen Nomad.Derivatives\n\nlet x = Var \"x\"\nlet expr = x *: x\nlet dx = derivative \"x\" expr\n(* dx = Float 2. *: x *)\n```\n\n### Performing Gradient Descent\n\n```ocaml\nopen Nomad.Expr\nopen Nomad.Gradient_descent\n\nlet x, y = Var \"x\", Var \"y\"\nlet expr = x *: x +: y *: y\nlet initial_env = [(\"x\", 1.0); (\"y\", 1.0)]\nlet learning_rate = 0.1\nlet result = gradient_descent ~expr ~env:initial_env ~learning_rate ~max_iter:100\n```\n\n## Testing\n\nNomad includes a comprehensive test suite to validate its functionality. To run the tests, execute:\n\n```bash\ndune runtest\n```\n\nThis will execute all test cases defined in `test.ml`, ensuring that each component behaves as expected.\n\n## Contributing\n\nContributions are welcome! Please submit issues and pull requests for any enhancements or bug fixes.\n\n1. Fork the repository.\n2. Create a new branch: `git checkout -b feature/YourFeature`\n3. Commit your changes: `git commit -m 'Add some feature'`\n4. Push to the branch: `git push origin feature/YourFeature`\n5. Submit a pull request.\n\n## License\n\n
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.\n\n## Acknowledgments\n\n- Developed as part of the Functional Programming course at the University of Wrocław.\n- Inspired by various symbolic mathematics and optimization libraries.","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Figorjakus%2Fnomad","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Figorjakus%2Fnomad","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Figorjakus%2Fnomad/lists"}