# Coordinate regression for event-based data

The repository demonstrates coordinate regression for event-based data with spiking neural networks.
Specifically, we contribute:

1. A dataset of event-based vision (EBV) videos for coordinate regression and pose estimation
2. A method for differentiable coordinate transforms for spiking neural networks (a minimal sketch follows this list)
3. Translation-invariant receptive fields that outperform comparable artificial neural network models
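
The second contribution can be made concrete with a small amount of PyTorch. The block below is a hedged sketch of a differentiable coordinate transform in the soft-argmax style; the function name `soft_argmax_2d` and the normalised [-1, 1] coordinate convention are our own assumptions, not the repository's implementation.

```python
# Hedged sketch: a soft-argmax style differentiable coordinate transform.
# The function name and the [-1, 1] coordinate convention are assumptions,
# not the repository's actual code.
import torch


def soft_argmax_2d(activity: torch.Tensor) -> torch.Tensor:
    """Map a (batch, H, W) activity map to (batch, 2) coordinates in [-1, 1]."""
    b, h, w = activity.shape
    # Softmax turns the activity map into a probability distribution over pixels,
    # which keeps the whole transform differentiable.
    probs = torch.softmax(activity.reshape(b, -1), dim=1).reshape(b, h, w)

    ys = torch.linspace(-1.0, 1.0, h, device=activity.device)
    xs = torch.linspace(-1.0, 1.0, w, device=activity.device)

    # Expected coordinate under the distribution: marginalise, then take the mean.
    y = (probs.sum(dim=2) * ys).sum(dim=1)
    x = (probs.sum(dim=1) * xs).sum(dim=1)
    return torch.stack((x, y), dim=1)


if __name__ == "__main__":
    activity = torch.randn(4, 64, 64, requires_grad=True)
    coords = soft_argmax_2d(activity)  # shape (4, 2)
    coords.sum().backward()            # gradients flow back into the activity map
```

Because every step is differentiable, a coordinate loss such as the Euclidean distance to the target point can be backpropagated through the network that produces the activity map.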

## Usage

To train the models, follow the steps below:

1. [Download the dataset via this link](https://kth-my.sharepoint.com/:u:/g/personal/jeped_ug_kth_se/EZS0BB9N5AlAo9uB9aq0ssYB1bnFNO7JDfv1LpQTqGAy7w?e=DoFiJZ) and unpack it to a folder you can recall, say `/tmp/eventdata`.
2. Ensure you have a Python installation with [PyTorch](https://pytorch.org) and [Norse](https://github.com/norse/norse) installed.
* After installing the necessary PyTorch version, you can install the dependencies from the `requirements.txt` file by running: `pip install -r requirements.txt`
3. Enter the `coordinate-regression` folder and run the `learn_shapes.py` file with the dataset directory and model type to start training
* As an example, run `python learn_shapes.py --data_root=/tmp/eventdata --model=snn`
* Four models are available: `ann`, `annsf`, `snn`, and `snnrf` (a receptive-field sketch follows these steps)
* For training parameter descriptions and help, type `python learn_shapes.py --help`
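
The receptive-field model variants presumably relate to the translation-invariant receptive fields listed among the contributions. As an illustration only, the sketch below realises such receptive fields as a fixed difference-of-Gaussians convolution bank in PyTorch; the kernel size and sigma pairs are assumptions, and the repository's `snnrf` model may be constructed differently.

```python
# Hedged sketch: fixed difference-of-Gaussians receptive fields as a convolution bank.
# Kernel size and sigma values are illustrative assumptions.
import torch
import torch.nn.functional as F


def gaussian_kernel(size: int, sigma: float) -> torch.Tensor:
    ax = torch.arange(size, dtype=torch.float32) - (size - 1) / 2
    yy, xx = torch.meshgrid(ax, ax, indexing="ij")
    kernel = torch.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return kernel / kernel.sum()


def dog_bank(size: int = 9, sigma_pairs=((1.0, 2.0), (2.0, 4.0))) -> torch.Tensor:
    """Stack difference-of-Gaussians kernels into a (out_channels, 1, size, size) weight."""
    kernels = [gaussian_kernel(size, s1) - gaussian_kernel(size, s2) for s1, s2 in sigma_pairs]
    return torch.stack(kernels).unsqueeze(1)


if __name__ == "__main__":
    frame = torch.rand(1, 1, 128, 128)            # one frame of accumulated events
    weight = dog_bank()                           # fixed, not learned
    features = F.conv2d(frame, weight, padding=4)
    print(features.shape)                         # torch.Size([1, 2, 128, 128])
```

Because convolution commutes with translation, the resulting feature maps shift with the input rather than changing shape, which is what lets a downstream coordinate transform track an object regardless of where it appears in the frame.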

## Authors and Contact

* Jens E. Pedersen ([Twitter @jensegholm](https://twitter.com/jensegholm))
* Juan P. Romero B.
* Jörg Conradt

## Acknowledgements

This work has been performed at the
[Neurocomputing Systems Lab](https://neurocomputing.systems) at
[KTH Royal Institute of Technology](https://kth.se) and funded by the
[Human Brain Project](https://www.humanbrainproject.eu/) and the
[AI Pioneer Centre](https://www.aicentre.dk).

Please cite the work as follows:

```
@inproceedings{Pedersen_Singhal_Conradt_2023,
  title      = {Translation and Scale Invariance for Event-Based Object Tracking},
  author     = {Pedersen, Jens Egholm and Singhal, Raghav and Conradt, Jorg},
  booktitle  = {Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference},
  series     = {NICE ’23},
  collection = {NICE ’23},
  publisher  = {Association for Computing Machinery},
  address    = {New York, NY, USA},
  year       = {2023},
  month      = apr,
  pages      = {79–85},
  ISBN       = {978-1-4503-9947-0},
  DOI        = {10.1145/3584954.3584996},
  url        = {https://dl.acm.org/doi/10.1145/3584954.3584996}
}
```

## License
This work is licensed under LGPLv3.