{"id":33003017,"url":"https://github.com/auto-differentiation/xad","last_synced_at":"2026-02-06T23:11:49.724Z","repository":{"id":43743801,"uuid":"511549872","full_name":"auto-differentiation/xad","owner":"auto-differentiation","description":"Powerful automatic differentiation in C++ and Python","archived":false,"fork":false,"pushed_at":"2025-10-27T07:17:55.000Z","size":1520,"stargazers_count":386,"open_issues_count":6,"forks_count":47,"subscribers_count":12,"default_branch":"main","last_synced_at":"2025-11-13T03:02:54.675Z","etag":null,"topics":["automatic-differentiation","biotechnology","computer-graphics","derivatives","machine-learning","meteorology","numerical-analysis","optimisation","quant-finance","risk-management","robotics","scientific-computing"],"latest_commit_sha":null,"homepage":"https://auto-differentiation.github.io","language":"C++","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"agpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/auto-differentiation.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE.md","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":".github/SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2022-07-07T14:00:21.000Z","updated_at":"2025-11-03T07:28:57.000Z","dependencies_parsed_at":"2024-01-16T02:47:07.404Z","dependency_job_id":"92474ee3-4784-428b-bc7c-c78bd1194780","html_url":"https://github.com/auto-differentiation/xad","commit_stats":{"total_commits":171,"total_committers":10,"mean_commits":17.1,"dds":0.6374269005847953,"last_synced_commit":"f6ed5bbebbdbe2315271997ecddf56b8b203ba5b"},"previous_n
ames":["auto-differentiation/xad","xcelerit/xad"],"tags_count":12,"template":false,"template_full_name":null,"purl":"pkg:github/auto-differentiation/xad","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/auto-differentiation%2Fxad","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/auto-differentiation%2Fxad/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/auto-differentiation%2Fxad/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/auto-differentiation%2Fxad/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/auto-differentiation","download_url":"https://codeload.github.com/auto-differentiation/xad/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/auto-differentiation%2Fxad/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":284580580,"owners_count":27029575,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-11-15T02:00:06.050Z","response_time":57,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["automatic-differentiation","biotechnology","computer-graphics","derivatives","machine-learning","meteorology","numerical-analysis","optimisation","quant-finance","risk-management","robotics","scientific-computing"],"created_at":"2025-11-13T14:00:38.907Z","updated_at":"202
6-02-06T23:11:49.717Z","avatar_url":"https://github.com/auto-differentiation.png","language":"C++","readme":"\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://auto-differentiation.github.io\" target=\"_blank\"\u003e\n    \u003cimg src=\"https://auto-differentiation.github.io/images/logo.svg\" height=\"80\" alt=\"XAD\"\u003e\n  \u003c/a\u003e\n\u003c/p\u003e\n\n# XAD: Fast, easy automatic differentiation in C++\n\nXAD is a high-performance C++ automatic differentiation library designed for large-scale, performance-critical systems.\n\nIt provides forward and adjoint (reverse) mode automatic differentiation via operator overloading, with a strong focus on:\n\n* Low runtime overhead\n* Minimal memory footprint\n* Straightforward integration into existing C++ codebases\n\nFor Monte Carlo and other repetitive workloads, XAD also offers optional JIT backend support,\nenabling record-once / replay-many execution for an additional performance boost.\n\n\u003cp align=\"center\" dir=\"auto\"\u003e\n    \u003ca href=\"https://github.com/auto-differentiation/xad/releases/latest\"\u003e\n        \u003cimg src=\"https://img.shields.io/github/v/release/auto-differentiation/xad?label=Download\u0026sort=semver\" alt=\"Download\" style=\"max-width: 100%;\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://github.com/auto-differentiation/xad/blob/main/CONTRIBUTING.md\"\u003e\n        \u003cimg src=\"https://img.shields.io/badge/PRs%20-welcome-brightgreen.svg\" alt=\"PRs Welcome\" style=\"max-width: 100%;\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://github.com/auto-differentiation/xad/actions/workflows/ci.yml\"\u003e\n        \u003cimg src=\"https://img.shields.io/github/actions/workflow/status/auto-differentiation/xad/ci.yml?label=Build\u0026logo\" alt=\"Build Status\" style=\"max-width: 100%;\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://coveralls.io/github/auto-differentiation/xad?branch=main\"\u003e\n        \u003cimg 
src=\"https://coveralls.io/repos/github/auto-differentiation/xad/badge.svg?branch=main\" alt=\"Coverage\" style=\"max-width: 100%;\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://www.codacy.com/gh/auto-differentiation/xad/dashboard\"\u003e\n        \u003cimg src=\"https://img.shields.io/codacy/grade/1826d0a6c8ce4feb81ef3b482d65c7b4?logo=codacy\u0026label=Quality%20%28Codacy%29\" alt=\"Codacy Quality\" style=\"max-width: 100%;\"\u003e\n    \u003c/a\u003e\n\u003c/p\u003e\n\n\n## Key Features\n\n- **Forward \u0026 Reverse (Adjoint) Mode**: Supports any order using operator overloading.\n- **Vector mode**: Compute multiple derivatives at once.\n- **Checkpointing Support**: Efficient tape memory management for large-scale applications.\n- **External Function Interface**: Seamlessly connect with external libraries.\n- **Eigen support**: Works with the popular linear algebra library [Eigen](https://eigen.tuxfamily.org/index.php?title=Main_Page).\n- **JIT Backend Support** *(optional)*: Infrastructure for pluggable JIT backends, enabling record-once/replay-many\n  workflows - with or without automatic differentiation. 
See [samples/jit_tutorial](samples/jit_tutorial).\n\n## Example\n\nCalculate first-order derivatives of an arbitrary function with two inputs and one output using XAD in adjoint mode.\n\n```c++\n#include \u003cXAD/XAD.hpp\u003e\n\nusing Adouble = xad::adj\u003cdouble\u003e::active_type;  // adjoint-mode active type\n\nxad::adj\u003cdouble\u003e::tape_type tape;  // tape recording operations for adjoints\n\nAdouble x0 = 1.3;              // initialise inputs\nAdouble x1 = 5.2;\ntape.registerInput(x0);        // register independent variables\ntape.registerInput(x1);        // with the tape\ntape.newRecording();           // start recording derivatives\nAdouble y = func(x0, x1);      // run main function\ntape.registerOutput(y);        // register the output variable\nderivative(y) = 1.0;           // seed output adjoint to 1.0\ntape.computeAdjoints();        // roll back adjoints to inputs\ncout \u003c\u003c \"dy/dx0=\" \u003c\u003c derivative(x0) \u003c\u003c \"\\n\"\n     \u003c\u003c \"dy/dx1=\" \u003c\u003c derivative(x1) \u003c\u003c \"\\n\";\n```\n\n## Getting Started\n\nBuild XAD from source using CMake:\n\n```bash\ngit clone https://github.com/auto-differentiation/xad.git\ncd xad\nmkdir build\ncd build\ncmake ..\nmake\n```\n\nFor more detailed guides,\nrefer to our [**Installation Guide**](https://auto-differentiation.github.io/installation/cxx/)\nand explore [**Tutorials**](https://auto-differentiation.github.io/tutorials/).\n\n## Documentation\n\nFull documentation, including API reference and usage examples, is available at:\n[**https://auto-differentiation.github.io/**](https://auto-differentiation.github.io/)\n\n## Contributing\n\nContributions are welcome. 
Please see the\n[**Contributing Guide**](CONTRIBUTING.md) for details, and feel free to start a\ndiscussion in our\n[**GitHub Discussions**](https://github.com/auto-differentiation/xad/discussions).\n\n## Found a Bug?\n\nPlease report bugs and issues via the\n[**GitHub Issue Tracker**](https://github.com/auto-differentiation/xad/issues).\n\n## Related Projects\n\n- [XAD-Py](https://github.com/auto-differentiation/xad-py): XAD in Python.\n- [QuantLibAAD](https://github.com/auto-differentiation/QuantLibAAD): AAD integration in [QuantLib](https://github.com/lballabio/QuantLib).\n- [xad-forge](https://github.com/da-roth/xad-forge): Forge JIT backends for XAD.\n","funding_links":[],"categories":["Math","CPP"],"sub_categories":["Data Visualization"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fauto-differentiation%2Fxad","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fauto-differentiation%2Fxad","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fauto-differentiation%2Fxad/lists"}