https://github.com/deependujha/deeptensor
DeepTensor: A minimal PyTorch-like deep learning library focused on custom autograd and efficient tensor operations.
- Host: GitHub
- URL: https://github.com/deependujha/deeptensor
- Owner: deependujha
- License: mit
- Created: 2025-01-02T16:46:28.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2025-01-26T08:43:30.000Z (5 months ago)
- Last Synced: 2025-04-14T12:13:44.831Z (2 months ago)
- Topics: autograd-engine, computer-vision, ddp, deep-learning, distributed-systems, gpt, gpt-2, neural-networks, pytorch, transformer
- Language: C++
- Homepage: https://deependujha.github.io/DeepTensor/
- Size: 1.35 MB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 2
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
# DeepTensor 🔥
- **`DeepTensor`**: A minimal PyTorch-like **deep learning library** focused on custom autograd and efficient tensor operations.
---
## **Features at a Glance** 🚀
- **Automatic gradient computation** with a custom autograd engine (a minimal sketch of the idea follows this list).
- **Weight initialization schemes**:
  - `Xavier/Glorot` and `He` initialization, in both `uniform` and `normal` variants.
- **Activation functions**:
  - `ReLU`, `GeLU`, `Sigmoid`, `Tanh`, `SoftMax`, `LeakyReLU`, and more.
- **Built-in loss functions**:
  - `Mean Squared Error (MSE)`, `Cross Entropy`, and `Binary Cross Entropy`.
- **Optimizers**:
  - `SGD`, `Momentum`, `AdaGrad`, `RMSprop`, and `Adam`.
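For intuition about what an autograd engine does, here is a minimal scalar reverse-mode autograd sketch in plain Python. It illustrates the general technique only; it is not DeepTensor's C++ implementation, and the `Scalar` class below is made up for this example.

```python
# Minimal sketch of reverse-mode autograd on scalars (illustration only;
# DeepTensor's engine is written in C++ and exposes Tensor/Value instead).
class Scalar:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # set by the op that produced this node

    def __add__(self, other):
        out = Scalar(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Scalar(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # build a topological order, then apply the chain rule node by node
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# usage: y = a*b + a  ->  dy/da = b + 1, dy/db = a
a, b = Scalar(2.0), Scalar(3.0)
y = a * b + a
y.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

DeepTensor exposes the same idea through its `Tensor`/`Value` objects and `loss.backward()`, as shown in the Basic Usage section below.
---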
### **Why DeepTensor?**
DeepTensor offers a hands-on implementation of deep learning fundamentals with a focus on **customizability** and **learning the internals** of deep learning frameworks like PyTorch.
---
## Installation
```bash
pip install deeptensor
```
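As a quick, optional sanity check, importing a few of the names used in the Basic Usage section below should succeed once the package is installed:

```python
# Optional smoke test: these imports mirror the Basic Usage section below.
from deeptensor import Model, Tensor, Value, Adam  # noqa: F401
print("deeptensor imported OK")
```
---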
## Setup the project for development
```bash
git clone --recurse-submodules -j8 git@github.com:deependujha/DeepTensor.git
cd DeepTensor

# run ctests
make ctest

# install python package in editable mode
pip install -e .

# run pytest
make test
```
---
## Checkout Demo
- [play with latest demo](./demo/roboflow-demo.ipynb)

---
## Check Docs
- [visit docs](https://deependujha.github.io/DeepTensor)

---
## Basic Usage
```python
from deeptensor import (
    # model
    Model,
    # Layers
    Conv2D,
    MaxPooling2D,
    Flatten,
    LinearLayer,
    # activation layers
    GeLu,
    LeakyReLu,
    ReLu,
    Sigmoid,
    SoftMax,
    Tanh,
    # core objects
    Tensor,
    Value,
    # optimizers
    SGD,
    Momentum,
    AdaGrad,
    RMSprop,
    Adam,
    # losses
    mean_squared_error,
    cross_entropy,
    binary_cross_entropy,
)

model = Model(
    [
        LinearLayer(2, 16),
        ReLu(),
        LinearLayer(16, 16),
        LeakyReLu(0.1),
        LinearLayer(16, 1),
        Sigmoid(),
    ],
    False,  # using_cuda
)

opt = Adam(model, 0.01)  # learning rate

print(model)

tensor_input = Tensor([2])
tensor_input.set(0, Value(2.4))
tensor_input.set(1, Value(5.2))

out = model(tensor_input)
loss = mean_squared_error(out, YOUR_EXPECTED_OUTPUT)

# backprop
loss.backward()
opt.step()
opt.zero_grad()
```
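Building on the snippet above, a full training loop might look like the sketch below. The XOR-style toy data, the `to_tensor` helper, and the epoch count are assumptions made for illustration; only the calls already shown above (`Tensor`, `Value`, `model(...)`, `mean_squared_error`, `loss.backward()`, `opt.step()`, `opt.zero_grad()`) come from this README, and it is assumed here that the target passed to `mean_squared_error` is also a `Tensor`.

```python
# Sketch of a training loop using the model, optimizer, and loss from the
# snippet above. The to_tensor helper and the XOR toy dataset are
# illustrative assumptions, not part of DeepTensor's documented API.
def to_tensor(values):
    t = Tensor([len(values)])  # 1-D tensor, as in Tensor([2]) above
    for i, v in enumerate(values):
        t.set(i, Value(v))
    return t

xor_data = [
    ([0.0, 0.0], [0.0]),
    ([0.0, 1.0], [1.0]),
    ([1.0, 0.0], [1.0]),
    ([1.0, 1.0], [0.0]),
]

for epoch in range(100):
    for features, target in xor_data:
        out = model(to_tensor(features))                   # forward pass
        loss = mean_squared_error(out, to_tensor(target))  # compute loss
        loss.backward()                                     # backpropagate
        opt.step()                                          # Adam parameter update
        opt.zero_grad()                                     # clear gradients for the next step
```
---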
## Features expected to be added
- Save & Load model
- Train a character-level transformer model
- Add support for DDP
- Add support for CUDA execution ⭐️

---
## Open to Opportunities 🎅🏻🎁
I am actively seeking new opportunities to contribute to impactful projects in the deep learning and AI space.
If you are interested in collaborating or have a position that aligns with my expertise, feel free to reach out!
You can connect with me on [GitHub](https://github.com/deependujha), [X (formerly Twitter)](https://x.com/deependu__), or email me at `[email protected]`.