https://github.com/auto-differentiation/xad
Powerful automatic differentiation in C++ and Python
- Host: GitHub
- URL: https://github.com/auto-differentiation/xad
- Owner: auto-differentiation
- License: agpl-3.0
- Created: 2022-07-07T14:00:21.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2025-10-27T07:17:55.000Z (21 days ago)
- Last Synced: 2025-11-13T03:02:54.675Z (4 days ago)
- Topics: automatic-differentiation, biotechnology, computer-graphics, derivatives, machine-learning, meteorology, numerical-analysis, optimisation, quant-finance, risk-management, robotics, scientific-computing
- Language: C++
- Homepage: https://auto-differentiation.github.io
- Size: 1.45 MB
- Stars: 386
- Watchers: 12
- Forks: 47
- Open Issues: 6
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.md
- Code of conduct: CODE_OF_CONDUCT.md
- Security: .github/SECURITY.md
Awesome Lists containing this project
- fucking-awesome-cpp - XAD - Powerful Automatic Differentiation for C++. [AGPL] [website](https://auto-differentiation.github.io/) (Math)
- awesome-quant - XAD - Automatic Differentiation (AAD) Library (CPP / Data Visualization)
- awesome-cpp - XAD - Powerful Automatic Differentiation for C++. [AGPL] [website](https://auto-differentiation.github.io/) (Math)
README
# XAD: Powerful Automatic Differentiation for C++ & Python
**XAD** is the ultimate solution for automatic differentiation, combining **ease of use** with **high performance**.
It's designed to help you differentiate complex applications with speed and precision, whether
you're optimizing neural networks, solving scientific problems, or performing financial risk analysis.
## Why XAD?
XAD is trusted by professionals for its **speed**, **flexibility**, and **scalability** across various fields:
- **Machine Learning & Deep Learning**: Accelerate neural network training and model optimization.
- **Optimization in Engineering & Finance**: Solve complex problems with high precision.
- **Numerical Analysis**: Improve methods for solving differential equations efficiently.
- **Scientific Computing**: Simulate physical systems and processes with precision.
- **Risk Management & Quantitative Finance**: Assess and hedge risks in sophisticated financial models.
- **Computer Graphics**: Optimize rendering algorithms for high-quality graphics.
- **Robotics**: Enhance control and simulation for robotic systems.
- **Meteorology**: Improve accuracy in weather prediction models.
- **Biotechnology**: Model complex biological processes effectively.
### Key Features
- **Forward & Adjoint Mode**: Supports any order using operator overloading.
- **Vector modes**: Compute multiple derivatives at once.
- **Checkpointing Support**: Efficient tape memory management for large-scale applications.
- **External Function Interface**: Seamlessly connect with external libraries.
- **Thread-Safe Tape**: Ensure safe, concurrent operations.
- **High Performance**: Optimized for speed and efficiency.
- **Proven in Production**: Battle-tested in large-scale, mission-critical systems.
- **Exception-Safe**: Formal guarantees for stability and error handling.
- **Eigen support**: Works with the popular linear algebra library [Eigen](https://eigen.tuxfamily.org/index.php?title=Main_Page).
## Example
Calculate first-order derivatives of an arbitrary function `func` with two inputs and one output using XAD in adjoint mode.
```c++
#include <XAD/XAD.hpp>
#include <iostream>

using mode = xad::adj<double>;     // first-order adjoint mode
using Adouble = mode::active_type; // active (taped) double type

mode::tape_type tape;              // tape that records operations

Adouble x0 = 1.3;                  // initialise inputs
Adouble x1 = 5.2;
tape.registerInput(x0);            // register independent variables
tape.registerInput(x1);            // with the tape
tape.newRecording();               // start recording derivatives
Adouble y = func(x0, x1);          // run main function
tape.registerOutput(y);            // register the output variable
derivative(y) = 1.0;               // seed output adjoint to 1.0
tape.computeAdjoints();            // roll back adjoints to inputs
std::cout << "dy/dx0=" << derivative(x0) << "\n"
          << "dy/dx1=" << derivative(x1) << "\n";
```
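Forward (tangent) mode needs no tape and is often the better choice when a function has few inputs. Below is a minimal sketch under the same assumptions as above (the same user-supplied `func`); it uses XAD's forward-mode aliases (`xad::fwd<double>`), though consult the documentation for the exact spelling in your version.

```c++
#include <XAD/XAD.hpp>
#include <iostream>

using fmode = xad::fwd<double>;     // first-order forward (tangent) mode
using Fdouble = fmode::active_type; // active double carrying a tangent

Fdouble x0 = 1.3;         // initialise inputs
Fdouble x1 = 5.2;
derivative(x0) = 1.0;     // seed the tangent of x0
Fdouble y = func(x0, x1); // run main function (no tape needed)
std::cout << "dy/dx0=" << derivative(y) << "\n";
```

Note that one forward pass yields the derivative with respect to the single seeded input; repeating the computation with `derivative(x1) = 1.0` gives `dy/dx1`.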
## Getting Started
```bash
git clone https://github.com/auto-differentiation/xad.git
cd xad
mkdir build
cd build
cmake ..
make
```
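After a successful build, you can run the bundled tests and install the library. These are standard CMake-project steps; whether the tests are built depends on the configured options, and installation may need elevated privileges:

```bash
ctest             # run the unit tests (if they were built)
sudo make install # install headers and library to the default prefix
```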
For more detailed guides,
refer to our [**Installation Guide**](https://auto-differentiation.github.io/installation/)
and explore [**Tutorials**](https://auto-differentiation.github.io/tutorials/).
## Contributing
Want to get involved? We welcome contributors from all backgrounds! Check out
our [**Contributing Guide**](CONTRIBUTING.md) and join the conversation in our
[**Discussions**](https://github.com/auto-differentiation/xad/discussions).
## Found a Bug?
Please report any issues through our
[**Issue Tracker**](https://github.com/auto-differentiation/xad/issues).
---
## Related Projects
- [XAD-Py](https://github.com/auto-differentiation/xad-py): XAD in Python.
- [QuantLib-Risks](https://github.com/auto-differentiation/QuantLib-Risks-Cpp): Fast
risk evaluations in C++ and Python.
---