Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/kyegomez/localsoftmax
My own implementation/experiments with a local softmax
- Host: GitHub
- URL: https://github.com/kyegomez/localsoftmax
- Owner: kyegomez
- License: mit
- Created: 2023-09-29T02:55:21.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-03-11T16:55:40.000Z (8 months ago)
- Last Synced: 2024-10-08T09:57:08.721Z (about 1 month ago)
- Topics: artificial-intelligence, artificial-intelligence-algorithms, artificial-neural-networks, attention-mechanism, softmax, softmax-layer
- Language: Python
- Homepage: https://discord.gg/qUtxnK2NMf
- Size: 216 KB
- Stars: 6
- Watchers: 2
- Forks: 0
- Open Issues: 4
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
README
[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)
# LocalSoftmax
Local Softmax parallelizes the softmax computation by splitting a tensor into smaller sub-tensors and applying the softmax function to each of them independently. In other words, it computes a "local" softmax on each chunk of the tensor instead of on the entire tensor.
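To see the difference concretely, here is an illustrative comparison built only from PyTorch built-ins. The choice to split along the first dimension and normalize over that same dimension is an assumption for demonstration, not necessarily what the library does:

```python
import torch
import torch.nn.functional as F

x = torch.rand(10, 5)

# Global softmax over dim 0: each column is a single distribution
# across all 10 rows.
global_sm = F.softmax(x, dim=0)

# "Local" softmax: split the rows into 2 chunks and normalize each
# chunk independently, so each column sums to 1 *within each chunk*.
chunks = x.chunk(2, dim=0)
local_sm = torch.cat([F.softmax(c, dim=0) for c in chunks], dim=0)

print(global_sm[:, 0].sum())  # ~1.0 across all 10 rows
print(local_sm[:5, 0].sum())  # ~1.0 within the first 5-row chunk
```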
# Appreciation

* Lucidrains
* Agorians

# Install

`pip install local-sfmx`

## Usage

```python
import torch
from local_sfmx import local_softmax

# Build a random (10, 5) input and apply local softmax with 2 chunks.
tensor = torch.rand(10, 5)
result = local_softmax(tensor, 2)
print(result)
```

# Algorithm

```
function LocalSoftmax(tensor, num_chunks):
    split the tensor into num_chunks smaller tensors
    for each smaller tensor:
        apply standard softmax
    concatenate the results
    return concatenated tensor
```
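A minimal runnable sketch of this pseudocode is below. It assumes both the split and the softmax run along the first dimension, which may differ from the library's actual implementation:

```python
import torch
import torch.nn.functional as F

def local_softmax(tensor: torch.Tensor, num_chunks: int) -> torch.Tensor:
    # Split the tensor into num_chunks smaller tensors along dim 0.
    chunks = tensor.chunk(num_chunks, dim=0)
    # Apply the standard softmax to each chunk independently.
    outs = [F.softmax(chunk, dim=0) for chunk in chunks]
    # Concatenate the per-chunk results back into one tensor.
    return torch.cat(outs, dim=0)

# The output keeps the input's shape; only the normalization is chunk-local.
print(local_softmax(torch.rand(10, 5), 2).shape)  # torch.Size([10, 5])
```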
# License

MIT