Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/russelljjarvis/spikingneuralnetworks_.jl
Julia has enough tools to support fitting spiking neural network models to data. Python's slow speed necessitates calls to external simulators (the two-language problem) to do network simulation. As much as possible, it would be nice to do fast, efficient data fitting of spike trains to network models in one language; let's try to do that here.
julia-language optimization simulation
Last synced: 6 days ago
- Host: GitHub
- URL: https://github.com/russelljjarvis/spikingneuralnetworks_.jl
- Owner: russelljjarvis
- License: mit
- Created: 2021-09-15T08:47:02.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2023-05-03T03:45:49.000Z (over 1 year ago)
- Last Synced: 2023-10-19T20:29:08.487Z (about 1 year ago)
- Topics: julia-language, optimization, simulation
- Language: Julia
- Homepage:
- Size: 1.37 MB
- Stars: 6
- Watchers: 0
- Forks: 5
- Open Issues: 11
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
Awesome Lists containing this project
README
Spiking Network Examples in Julia
Description • Outputs • Motivation • Install • Plans • Flow Chart • Why Not Optimize Small SNNs With Bigger SNNs

Julia has enough tools to support fitting spiking neural network models to data. Python's slow speed necessitates external simulators to do network simulation. As much as possible, it would be nice to do fast, efficient data fitting of spike trains to network models in one language; let's try to do that here.
### Acknowledgements
A reduced spiking neuronal model optimization package, directly derived from https://github.com/AStupidBear/SpikingNeuralNetworks.jl. This one uses BindsNet.py as the backend and Metaheuristics.jl to optimize.
### Getting Started
Install the Julia module
This is not yet an official (registered) package, so it needs to be added by URL or in developer mode. The short way to do this is as follows:
```
import Pkg
Pkg.add(url="https://github.com/russelljjarvis/SpikeNetOpt.jl.git")
```
or
```
] add https://github.com/russelljjarvis/SpikeNetOpt.jl.git
```
The long way involves:
```
git clone https://github.com/russelljjarvis/SpikeNetOpt.jl
cd SpikeNetOpt.jl
julia
]
(@v1.5) pkg> develop .
```
Or
```
Pkg.develop(PackageSpec(path=pwd()))
```
Entry Points
Optimize a spiking neural network by exploring the effect of a parameter that controls connectome graph structure:
```
julia
include("examples/run_net_opt.jl")
```
Or, from the shell:
```
cd examples
julia run_net_opt.jl
```
Single-cell data fitting against spike times:
```
cd test
julia single_cell_opt_adexp.jl
julia single_cell_opt_izhi.jl
```

### Motivation
Detailed motivation and previous work in data-driven optimization of spiking neurons (https://github.com/russelljjarvis/BluePyOpt/blob/neuronunit_reduced_cells/examples/neuronunit/OptimizationMulitSpikingIzhikevichModel.ipynb) was implemented in Python. The Python implementation of reduced model simulation sometimes called out to external simulators, and overall my previous implementation of reduced model optimization was slower and more complex than it needed to be, for language- and tool-specific reasons.
Reduced spiking neuron models have compact equations and should be fast to simulate, but Python often calls external code and programs (C, C++, NEURON, Brian2, NEST, PyNN) to achieve a speedup for network simulations. However, approaches for speeding up network simulations are not necessarily efficient or convenient for running single-cell simulations, as may be required for single-cell optimization. This strategy of calling external code causes intolerable code complexity and intolerable run-time cost for single-neuron simulations. The Python tool Numba (a JIT compiler) partially remedies this problem, but code from the Python optimization framework DEAP/BluePyOpt also adds overhead. An almost pure Julia SNN optimization routine is a better solution for efficiently optimizing reduced SNN models. In this package, two other packages, Evolutionary.jl and Metaheuristics.jl, provide the genetic algorithms used to optimize spiking neural networks.
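For illustration only, here is a minimal sketch of how an evolutionary search over model parameters might be wired up with Metaheuristics.jl. The loss function, parameter names, and bounds are hypothetical stand-ins, not this package's actual entry points:
```
using Metaheuristics

# Hypothetical target statistics; the real package compares spike trains with a spike-distance metric.
target_rates = [5.0, 12.0, 7.5]

# Placeholder for a real SNN simulation parameterized by `params`.
function simulate_rates(params)
    a, b, w = params
    return [a + w, b + 2w, (a + b) / 2]
end

# Sum-of-squares loss between simulated and target firing rates.
loss(params) = sum(abs2, simulate_rates(params) .- target_rates)

# Search bounds for the three hypothetical parameters (row 1: lower, row 2: upper).
bounds = [0.0 0.0 0.0; 20.0 20.0 5.0]

# ECA is one of the evolutionary algorithms shipped with Metaheuristics.jl.
result = optimize(loss, bounds, ECA())
@show minimizer(result) minimum(result)
```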
#### Other Info
[A Google Doc presentation](https://docs.google.com/presentation/d/1bWA5LhgAD8D4MGPQxf5P6jtb0spVEGeJKyXCHnh-aq0/edit?usp=sharing) that sets up the motivation for the project.
[Part of BrainHack](https://brainhack.org/global2021/project/project_98/)

#### Optimization Outputs
The loss function is constructed by computing the spike distance between all pairs of neurons; networks are optimized using this pairwise spike-distance metric on each pair of neurons.
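As a rough sketch of what such a pairwise spike-train loss could look like (assuming a simple van Rossum-style distance and index-matched neuron pairs; the package's actual metric and function names may differ):
```
# Illustrative names only; not the package API.
# Van Rossum-style distance: filter each spike train with an exponential kernel
# on a time grid and compare the filtered traces.
function vanrossum_distance(t1::Vector{Float64}, t2::Vector{Float64};
                            tau=10.0, dt=1.0, tmax=100.0)
    grid = 0.0:dt:tmax
    trace(spikes) = [sum(exp.(-(t .- spikes[spikes .<= t]) ./ tau)) for t in grid]
    return sqrt(sum(abs2, trace(t1) .- trace(t2)) * dt / tau)
end

# Sum the distance over corresponding neurons of two spike rasters,
# each given as a vector of spike-time vectors (one per neuron).
pairwise_loss(model_raster, target_raster) =
    sum(vanrossum_distance(m, t) for (m, t) in zip(model_raster, target_raster))

# Toy usage
model  = [[10.0, 50.0, 90.0], [20.0, 60.0]]
target = [[12.0, 48.0, 95.0], [25.0, 61.0]]
@show pairwise_loss(model, target)
```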
Python's NetworkUnit package is used to perform a post-hoc evaluation of the optimized network.

Example Outputs
See the figure below, where local variation and firing rates are compared for every neuron between the two model networks.
#### Network optimization
For example, here is a t-test of firing rates for a ground-truth model versus an optimized model:
```
Student's t-test
datasize: 200 200
t = 11.811 p value = 1.82e-25
```
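The statistics above come from Python's NetworkUnit. Purely as an illustration, a comparable two-sample t-test on per-neuron firing rates could be run in Julia with HypothesisTests.jl (the rate vectors below are hypothetical):
```
using HypothesisTests

# Hypothetical per-neuron firing rates (Hz) for the ground-truth and optimized networks.
rates_ground_truth = 5.0 .+ 2.0 .* randn(200)
rates_optimized    = 6.5 .+ 2.0 .* randn(200)

# Student's two-sample t-test assuming equal variances.
test = EqualVarianceTTest(rates_ground_truth, rates_optimized)
@show pvalue(test)
```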
#### Single Cell optimization
Note, for perspective, that 86% of spike times are matched in some of the best model-fitting competitions.
Output from a single cell optimization:
Output from a network spike-time optimization (note that the Unicode backend is the plotting method, and neuron synapses fire probabilistically):
#### Current Design Vs Intended Design Flow Chart:
Intended Future Design:
## Development Plans
#### DONE
- [x] Used spike distance and genetic algorithms to optimize network spike raster activity.
- [x] Used Python's NetworkUnit to validate results and run t-tests on the results.
- [x] Created single-cell model fitting to Allen Brain Observatory spike train data.
- [x] Ability to toggle between simulator backends (https://github.com/AStupidBear/SpikingNeuralNetworks.jl vs https://github.com/darsnack/SpikingNN.jl)

#### TODO
- [ ] Learning 2 learn
- [ ] Implement multi-processing of feature extraction/spike distance (partially done)
- [ ] Animation of genetic algorithm convergence (partially done; Metaheuristics.jl does this with minimal effort)
- [ ] ADAM-Opt predictions using the evolved population; see the file mwe.jl.
- [ ] Read in and optimize against FPGA event-stream data (AEDAT)

#### DON'T DO
- [ ] Use large SNNs to optimize smaller SNNs themselves, as this would be parsimonious.

#### Why Not Optimize Small SNNs With Bigger SNNs?
This is the long-term intended approach: to use a recurrent inhibitory population + excitatory population network to optimize smaller networks.

## Contributors ✨
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
Russell Jarvis
💻 📖 🤔 🎨 🚇
Mohit Saxena
⚠️
Páll Haraldsson
📖
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!