https://github.com/ruhaan838/anygrad
A Tensor module that allows a deep learning framework to switch seamlessly between different engines.
- Host: GitHub
- URL: https://github.com/ruhaan838/anygrad
- Owner: Ruhaan838
- License: apache-2.0
- Created: 2024-12-25T16:04:12.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2025-04-18T05:02:15.000Z (12 months ago)
- Last Synced: 2025-05-15T02:13:50.303Z (11 months ago)
- Topics: ai, autograd, cpp20, deep-learning, engine, framework, machine-learning, python, python3, tensor-algebra
- Language: C++
- Homepage:
- Size: 386 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
# 🚂 AnyGrad: A Flexible Engine for Tensors and Neural Networks
## Overview
AnyGrad is a simple tensor library that makes it easy to run forward and backward passes. It pairs a high-performance C++ backend with a user-friendly Python frontend, and the backend can be swapped out with minimal effort.
> **Note:** version `0.0.1` does not yet support any external engine. Integrations with engines such as `numpy` and `pytorch` are planned, after which you will be able to use them for anything from tensor operations to high-level transformer training.
## Installation
Install the library from PyPI:
```bash
pip install anygrad
```
If you'd like to work on the code:
```bash
git clone https://github.com/Ruhaan838/AnyGrad.git
cd AnyGrad
./setup.sh
```
## Getting Started
### Creating a Tensor
Create tensors by importing the library and instantiating `Tensor`. Gradients are not tracked unless you enable them with `requires_grad=True`:
```python
import anygrad
# A tensor that does not calculate gradients
a = anygrad.Tensor([1, 2, 3])
# A tensor with gradient tracking enabled
b = anygrad.Tensor([2, 3, 4], requires_grad=True)
# A tensor with a specific data type (float64)
c = anygrad.Tensor([2, 3, 4], dtype=anygrad.float64)
```
> Other supported dtypes: `anygrad.int32`, `anygrad.int64`, `anygrad.bool`
### Arithmetic Operations
#### Element-wise Operations
Perform calculations on tensors element by element:
```python
d = a + b # addition
d = a * d # multiplication
d = d / 10 # division
e = d - 10 # subtraction
```
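To make the element-wise semantics concrete, here is the same sequence of operations mirrored with plain Python lists (illustrative only, not anygrad code): each operation applies independently to every element, and a scalar operand is applied to all elements.

```python
# Plain-Python mirror of the element-wise ops above, using the
# example values a = [1, 2, 3] and b = [2, 3, 4].
a = [1, 2, 3]
b = [2, 3, 4]
d = [x + y for x, y in zip(a, b)]  # addition       -> [3, 5, 7]
d = [x * y for x, y in zip(a, d)]  # multiplication -> [3, 10, 21]
d = [x / 10 for x in d]            # division       -> [0.3, 1.0, 2.1]
e = [x - 10 for x in d]            # subtraction    -> approx. [-9.7, -9.0, -7.9]
```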
#### Matrix Multiplication
You can multiply matrices in two ways:
```python
# Using the @ operator:
a = anygrad.ones((1, 2, 3), requires_grad=True)
b = anygrad.ones((2, 3, 4), requires_grad=True)
c = a @ b # tensor of shape (2, 2, 4)
# Or using the function:
c = anygrad.matmul(a, b)
```
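The result shape follows the usual batched-matmul convention: the leading (batch) dimensions broadcast NumPy-style, and the trailing two dimensions contract as `(m, k) @ (k, n) -> (m, n)`, which is why `(1, 2, 3) @ (2, 3, 4)` yields `(2, 2, 4)`. A small sketch of that shape rule in plain Python (illustrative; independent of anygrad's internals):

```python
def batched_matmul_shape(s1, s2):
    """Result shape of a broadcast batched matrix multiply."""
    m, k1 = s1[-2], s1[-1]
    k2, n = s2[-2], s2[-1]
    assert k1 == k2, "inner dimensions must match"
    b1, b2 = s1[:-2], s2[:-2]
    batch = []
    # Broadcast the batch dimensions right-to-left: equal dims match,
    # and a dim of 1 stretches to the other operand's dim.
    for i in range(max(len(b1), len(b2))):
        d1 = b1[-1 - i] if i < len(b1) else 1
        d2 = b2[-1 - i] if i < len(b2) else 1
        assert d1 == d2 or 1 in (d1, d2), "batch dims must broadcast"
        batch.append(max(d1, d2))
    return tuple(reversed(batch)) + (m, n)

print(batched_matmul_shape((1, 2, 3), (2, 3, 4)))  # (2, 2, 4)
```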
### Gradient Calculation
AnyGrad automatically computes gradients, which you can access after running the backward pass:
```python
a = anygrad.Tensor([1, 2, 3], requires_grad=True)
b = anygrad.Tensor([2, 3, 4], requires_grad=True)
c = a * b
result = c.sum()
result.backward()
print(a.grad)
print(b.grad)
```
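For this particular graph the gradients can be derived by hand: with `loss = sum(a * b)`, the product rule gives `d(loss)/da_i = b_i` and `d(loss)/db_i = a_i`, so `a.grad` should come out as `[2, 3, 4]` and `b.grad` as `[1, 2, 3]`. A plain-Python sanity check of that math using central finite differences (illustrative; not using anygrad):

```python
def loss(a, b):
    # loss = sum_i a_i * b_i
    return sum(x * y for x, y in zip(a, b))

a = [1.0, 2.0, 3.0]
b = [2.0, 3.0, 4.0]

# Analytic gradients: d(loss)/da_i = b_i, d(loss)/db_i = a_i.
grad_a, grad_b = list(b), list(a)

# Confirm each analytic gradient against a numerical estimate.
eps = 1e-6
for i in range(len(a)):
    a_hi, a_lo = a[:], a[:]
    a_hi[i] += eps
    a_lo[i] -= eps
    numeric = (loss(a_hi, b) - loss(a_lo, b)) / (2 * eps)
    assert abs(numeric - grad_a[i]) < 1e-6
```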
## Contributing
Contributions are welcome! Whether you want to improve performance or enhance the documentation, please open an issue or submit a pull request.
## License
This project is licensed under the terms outlined in the [LICENSE](LICENSE) file.