https://github.com/lanl/lca-pytorch
Sparse coding in PyTorch via the Locally Competitive Algorithm (LCA)
- Host: GitHub
- URL: https://github.com/lanl/lca-pytorch
- Owner: lanl
- License: other
- Created: 2023-06-29T16:16:49.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2025-03-14T20:35:24.000Z (3 months ago)
- Last Synced: 2025-04-12T11:08:41.903Z (about 2 months ago)
- Topics: lca, lcanets, locally-competitive-algorithm, pytorch, sparse-coding
- Language: Python
- Homepage:
- Size: 2.27 MB
- Stars: 8
- Watchers: 1
- Forks: 5
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.md
- Code of conduct: CODE_OF_CONDUCT.md
- Security: SECURITY.md
# PyTorch Implementation of the LCA Sparse Coding Algorithm
[Build](https://github.com/lanl/lca-pytorch/actions/workflows/build.yml)
[Coverage](https://codecov.io/gh/lanl/lca-pytorch)

[Code style: black](https://github.com/psf/black)
[License: BSD-3-Clause](https://opensource.org/licenses/BSD-3-Clause)

LCA-PyTorch (lcapt) provides the ability to flexibly build single- or multi-layer convolutional sparse coding networks in PyTorch with the [Locally Competitive Algorithm (LCA)](https://bpb-us-e1.wpmucdn.com/blogs.rice.edu/dist/c/3448/files/2014/07/neco2008.pdf). LCA-PyTorch currently supports 1D, 2D, and 3D convolutional LCA layers, which maintain all the functionality and behavior of PyTorch convolutional layers. Linear (a.k.a. fully-connected) layers are not currently supported, but the equivalent of a Linear layer can be implemented with convolutions.
## Installation
### Dependencies
Required:
* Python (>= 3.8)

Recommended:
* GPU(s) with NVIDIA CUDA (>= 11.0) and NVIDIA cuDNN (>= v7)

### Pip Installation
```
pip install git+https://github.com/lanl/lca-pytorch.git
```

### Manual Installation
```
git clone [email protected]:lanl/lca-pytorch.git
cd lca-pytorch
pip install .
```

## Usage
LCA-PyTorch layers inherit all functionality of standard PyTorch layers.
```python
import torch
import torch.nn as nn

from lcapt.lca import LCAConv2D
# create a dummy input
inputs = torch.zeros(1, 3, 32, 32)

# 2D conv layer in PyTorch
pt_conv = nn.Conv2d(
    in_channels=3,
    out_channels=64,
    kernel_size=7,
    stride=2,
    padding=3,
)
pt_out = pt_conv(inputs)

# 2D conv layer in LCA-PyTorch
lcapt_conv = LCAConv2D(
    out_neurons=64,
    in_neurons=3,
    kernel_size=7,
    stride=2,
    pad='same',
)
lcapt_out = lcapt_conv(inputs)
```

## Locally Competitive Algorithm (LCA)
LCA solves the $\ell_1$-penalized reconstruction problem
$\underset{a}{\min} \; \lVert s - a * \Phi \rVert_2^2 + \lambda \lVert a \rVert_1$
where $s$ is an input, $a$ is a sparse (i.e. mostly zeros) representation of $s$, $*$ is the convolution operation, $\Phi$ is a dictionary of convolutional features, $a * \Phi$ is the reconstruction of $s$, and $\lambda$ determines the tradeoff between reconstruction fidelity and sparsity in $a$. The equation above is convex in $a$, and LCA solves it by implementing a dynamical system of leaky integrate-and-fire neurons
$\dot{u}(t) = \frac{1}{\tau} \big[b(t) - u(t) - a(t) * G \big]$
in which each neuron's membrane potential, $u(t)$, is charged up or down by the bottom-up drive from the stimulus, $b(t) = s(t) * \Phi$, and leaks via the term $-u(t)$. $u(t)$ can also be inhibited or excited by active surrounding neurons through the term $-a(t) * G$, where $a(t)=\Gamma_\lambda (u(t))$ is the neuron's activation, computed by applying a firing threshold $\lambda$ to $u(t)$, and $G=\Phi * \Phi - I$. This means that a given neuron modulates a neighboring neuron in proportion to the similarity between their receptive fields and to how active it is at that time.
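These dynamics can be sketched in a few lines of plain PyTorch. The snippet below is a minimal, fully-connected illustration (matrix products in place of convolutions), not the lcapt implementation: the dictionary `Phi`, the soft-threshold stand-in for $\Gamma_\lambda$, and all variable names are assumptions chosen for clarity.

```python
import torch

def soft_threshold(u, lam):
    # Transfer function: zero out small potentials, shrink the rest by lambda.
    return torch.sign(u) * torch.clamp(u.abs() - lam, min=0)

torch.manual_seed(0)
s = torch.randn(16)                        # stimulus s(t)
Phi = torch.randn(16, 32)
Phi = Phi / Phi.norm(dim=0, keepdim=True)  # unit-norm dictionary features

lam, tau, n_steps = 0.2, 100.0, 500
b = Phi.T @ s                              # bottom-up drive b(t)
G = Phi.T @ Phi - torch.eye(32)            # lateral interactions G
u = torch.zeros(32)                        # membrane potentials u(t)

for _ in range(n_steps):
    a = soft_threshold(u, lam)             # activations a(t)
    u = u + (1.0 / tau) * (b - u - G @ a)  # Euler step on the LCA ODE

a = soft_threshold(u, lam)
recon = Phi @ a                            # reconstruction of s
energy = torch.sum((s - recon) ** 2) + lam * a.abs().sum()
```

After the loop, `a` is a sparse code for `s`, and `energy` is the value of the $\ell_1$-penalized objective above, which the dynamics drive below the energy of the all-zero code.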
Below is a mapping between the variable names used in this implementation and those used in [Rozell et al.'s formulation](https://bpb-us-e1.wpmucdn.com/blogs.rice.edu/dist/c/3448/files/2014/07/neco2008.pdf) of LCA.
| **LCA-PyTorch Variable** | **Rozell Variable** | **Description** |
| --- | --- | --- |
| input_drive | $b(t)$ | Drive from the inputs/stimulus |
| states | $u(t)$ | Internal state/membrane potential |
| acts | $a(t)$ | Code/Representation/External Communication |
| lambda_ | $\lambda$ | Transfer function threshold value |
| weights | $\Phi$ | Dictionary/Features |
| inputs | $s(t)$ | Input data |
| recons | $\hat{s}(t)$ | Reconstruction of the input |
| tau | $\tau$ | LCA time constant |

## Examples
* Dictionary Learning Using Built-In Update Method
  * [Dictionary Learning on Cifar-10 Images](https://github.com/lanl/lca-pytorch/blob/main/examples/builtin_dictionary_learning_cifar.ipynb)
  * [Fully-Connected Dictionary Learning on MNIST](https://github.com/lanl/lca-pytorch/blob/main/examples/builtin_dictionary_learning_mnist_fc.ipynb)
* Dictionary Learning Using PyTorch Optimizer
  * [Dictionary Learning on Cifar-10 Images](https://github.com/lanl/lca-pytorch/blob/main/examples/pytorch_optim_dictionary_learning_cifar.ipynb)

## License
LCA-PyTorch is provided under a BSD license with a "modifications must be indicated" clause. See [the LICENSE file](https://github.com/lanl/lca-pytorch/blob/main/LICENSE) for the full text. Internally, the LCA-PyTorch package is known as LA-CC-23-064.