https://github.com/auto-differentiation/xad
Powerful automatic differentiation in C++ and Python
- Host: GitHub
- URL: https://github.com/auto-differentiation/xad
- Owner: auto-differentiation
- License: agpl-3.0
- Created: 2022-07-07T14:00:21.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2025-10-27T07:17:55.000Z (4 months ago)
- Last Synced: 2025-11-13T03:02:54.675Z (3 months ago)
- Topics: automatic-differentiation, biotechnology, computer-graphics, derivatives, machine-learning, meteorology, numerical-analysis, optimisation, quant-finance, risk-management, robotics, scientific-computing
- Language: C++
- Homepage: https://auto-differentiation.github.io
- Size: 1.45 MB
- Stars: 386
- Watchers: 12
- Forks: 47
- Open Issues: 6
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.md
- Code of conduct: CODE_OF_CONDUCT.md
- Security: .github/SECURITY.md
Awesome Lists containing this project
- fucking-awesome-cpp - XAD - Powerful Automatic Differentiation for C++. [AGPL] [website](https://auto-differentiation.github.io/) (Math)
- awesome-quant - XAD - Automatic Differentiation (AAD) Library (CPP / Data Visualization)
- awesome-cpp - XAD - Powerful Automatic Differentiation for C++. [AGPL] [website](https://auto-differentiation.github.io/) (Math)
README
# XAD: Fast, easy automatic differentiation in C++
XAD is a high-performance C++ automatic differentiation library designed for large-scale, performance-critical systems.
It provides forward and adjoint (reverse) mode automatic differentiation via operator overloading, with a strong focus on:
* Low runtime overhead
* Minimal memory footprint
* Straightforward integration into existing C++ codebases
For Monte Carlo and other repetitive workloads, XAD also offers optional JIT backend support,
enabling record-once/replay-many execution for an additional performance boost.
## Key Features
- **Forward & Reverse (Adjoint) Mode**: Supports derivatives of any order via operator overloading (see the forward-mode sketch after this list).
- **Vector mode**: Compute multiple derivatives at once.
- **Checkpointing Support**: Efficient tape memory management for large-scale applications.
- **External Function Interface**: Seamlessly connect with external libraries.
- **Eigen support**: Works with the popular linear algebra library [Eigen](https://eigen.tuxfamily.org/index.php?title=Main_Page).
- **JIT Backend Support** *(optional)*: Infrastructure for pluggable JIT backends, enabling record-once/replay-many workflows, with or without automatic differentiation. See [samples/jit_tutorial](samples/jit_tutorial).
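As a complement to the adjoint example in the next section, here is a minimal first-order forward-mode sketch. It assumes the `xad::fwd<double>` mode aliases documented for XAD; `f` is an illustrative function, not part of the library. No tape is needed in first-order forward mode: the derivative is seeded on the input and propagates alongside the value.

```c++
#include <XAD/XAD.hpp>
#include <iostream>

using mode = xad::fwd<double>;      // first-order forward mode over double
using Fdouble = mode::active_type;  // active (derivative-carrying) type

Fdouble f(Fdouble x) { return x * x + sin(x); }  // illustrative

int main()
{
    Fdouble x = 1.3;
    derivative(x) = 1.0;   // seed: differentiate with respect to x
    Fdouble y = f(x);      // value and derivative propagate together
    std::cout << "y     = " << value(y) << "\n"
              << "dy/dx = " << derivative(y) << "\n";
}
```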
## Example
Calculate first-order derivatives of a function with two inputs and one output using XAD in adjoint mode. The body of `func` below is illustrative; any C++ code over the active types works:
```c++
#include <XAD/XAD.hpp>
#include <iostream>

using mode = xad::adj<double>;      // adjoint (reverse) mode over double
using Adouble = mode::active_type;  // active (derivative-carrying) type
using Tape = mode::tape_type;       // tape recording the operations

Adouble func(Adouble a, Adouble b) { return a * sin(b); }  // illustrative

int main()
{
    Tape tape;
    Adouble x0 = 1.3;           // initialise inputs
    Adouble x1 = 5.2;
    tape.registerInput(x0);     // register independent variables
    tape.registerInput(x1);     //   with the tape
    tape.newRecording();        // start recording derivatives
    Adouble y = func(x0, x1);   // run main function
    tape.registerOutput(y);     // register the output variable
    derivative(y) = 1.0;        // seed output adjoint to 1.0
    tape.computeAdjoints();     // roll back adjoints to inputs
    std::cout << "dy/dx0=" << derivative(x0) << "\n"
              << "dy/dx1=" << derivative(x1) << "\n";
}
```
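A single call to `computeAdjoints()` yields the derivatives for all registered inputs at once; in forward mode the same result would take one evaluation per input. This is why adjoint mode is generally preferred when a function has many inputs and few outputs.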
## Getting Started
Build XAD from source using CMake:
```bash
git clone https://github.com/auto-differentiation/xad.git
cd xad
mkdir build
cd build
cmake ..
make
```
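Optionally, to install the headers and CMake package files after building (a standard CMake install step, assuming the Unix Makefiles generator used above; adjust the prefix via `CMAKE_INSTALL_PREFIX` if needed):

```bash
# run from the build directory; system-wide prefixes typically need sudo
sudo make install
```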
For more detailed guides,
refer to our [**Installation Guide**](https://auto-differentiation.github.io/installation/cxx/)
and explore [**Tutorials**](https://auto-differentiation.github.io/tutorials/).
## Documentation
Full documentation, including API reference and usage examples, is available at:
[**https://auto-differentiation.github.io/**](https://auto-differentiation.github.io/)
## Contributing
Contributions are welcome. Please see the
[**Contributing Guide**](CONTRIBUTING.md) for details, and feel free to start a
discussion in our
[**GitHub Discussions**](https://github.com/auto-differentiation/xad/discussions).
## Found a Bug?
Please report bugs and issues via the
[**GitHub Issue Tracker**](https://github.com/auto-differentiation/xad/issues).
## Related Projects
- [XAD-Py](https://github.com/auto-differentiation/xad-py): XAD in Python.
- [QuantLibAAD](https://github.com/auto-differentiation/QuantLibAAD): AAD integration in [QuantLib](https://github.com/lballabio/QuantLib).
- [xad-forge](https://github.com/da-roth/xad-forge): Forge JIT backends for XAD.