https://github.com/piras-s/tuningcurvesnestedbayesianinference
Bayesian inference of neural tuning curves using nested sampling (PyMultiNest), with theory, simulation, and diagnostic visualizations.
- Host: GitHub
- URL: https://github.com/piras-s/tuningcurvesnestedbayesianinference
- Owner: Piras-S
- License: GPL-3.0
- Created: 2025-03-27T08:45:57.000Z
- Default Branch: main
- Last Pushed: 2025-03-27T09:14:17.000Z
- Last Synced: 2025-09-05T13:57:31.572Z
- Topics: bayesian-inference, data-visualization, machine-learning, model-evaluation, nested-sampling, neuroscience, pymultinest, python3, simulation
- Language: Jupyter Notebook
- Size: 729 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Bayesian Inference of Neural Tuning Curves with Nested Sampling
This project demonstrates how to simulate and analyze neural tuning curve data using **Bayesian inference** via **nested sampling**. It combines theory, code, and visual intuition to explain how posterior distributions are obtained, and why they should be interpreted with care.
---
## Project Overview
I simulate the firing response of a neuron to different stimulus angles (e.g., orientation of a visual stimulus), assuming a Gaussian-shaped tuning curve. Using synthetic data, I then perform **Bayesian parameter estimation** with PyMultiNest to recover the model parameters:
- $r_{\text{max}}$: maximum firing rate,
- $s_{\text{max}}$: preferred stimulus orientation,
- $\sigma_f$: tuning width.
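As a concrete sketch of the simulation step (the parameter values and stimulus grid below are illustrative assumptions, not the project's actual settings), the model amounts to evaluating a Gaussian tuning curve and drawing Poisson spike counts around it:

```python
import numpy as np

def tuning_curve(s, r_max, s_max, sigma_f):
    """Gaussian tuning curve: mean firing rate as a function of stimulus angle s (degrees)."""
    return r_max * np.exp(-((s - s_max) ** 2) / (2.0 * sigma_f ** 2))

rng = np.random.default_rng(0)

# Hypothetical "true" parameters and a grid of stimulus orientations
r_max_true, s_max_true, sigma_f_true = 50.0, 90.0, 20.0
angles = np.linspace(0.0, 180.0, 37)

# Poisson-distributed spike counts scattered around the tuning curve
rates = tuning_curve(angles, r_max_true, s_max_true, sigma_f_true)
counts = rng.poisson(rates)
```

The curve peaks at $s_{\text{max}}$ with height $r_{\text{max}}$, and $\sigma_f$ controls how quickly the response falls off away from the preferred orientation.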
---
## Concepts Covered
- Bayesian inference: likelihood, priors, posteriors, evidence
- Nested sampling algorithm (with theory + toy implementation)
- Parameter estimation in a neural tuning model
- Posterior uncertainty, model mismatch, and identifiability
- Practical diagnostics for PyMultiNest fits
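PyMultiNest expects two callbacks: a prior transform that maps the unit hypercube to physical parameters, and a log-likelihood. A minimal sketch for this model, using a Poisson likelihood and uniform priors whose ranges are illustrative assumptions (not necessarily those used in `run_pymultinest_fit.py`):

```python
import numpy as np

# Illustrative data; in the project these would be loaded from tuning_data.csv
angles = np.linspace(0.0, 180.0, 37)
counts = np.round(40.0 * np.exp(-((angles - 90.0) ** 2) / (2.0 * 20.0 ** 2))).astype(int)

def prior(cube, ndim, nparams):
    """Map the unit hypercube to physical parameters (uniform priors; ranges are assumptions)."""
    cube[0] = cube[0] * 100.0          # r_max in [0, 100] Hz
    cube[1] = cube[1] * 180.0          # s_max in [0, 180] degrees
    cube[2] = 1.0 + cube[2] * 59.0     # sigma_f in [1, 60] degrees

def loglike(cube, ndim, nparams):
    """Poisson log-likelihood of the observed counts (up to an additive constant)."""
    r_max, s_max, sigma_f = cube[0], cube[1], cube[2]
    rates = r_max * np.exp(-((angles - s_max) ** 2) / (2.0 * sigma_f ** 2))
    rates = np.clip(rates, 1e-12, None)  # guard against log(0)
    return float(np.sum(counts * np.log(rates) - rates))

# With PyMultiNest installed, the fit would then be launched as:
# import pymultinest
# pymultinest.run(loglike, prior, 3, outputfiles_basename='chains/tuning_', resume=False)
```

Note that `prior` modifies `cube` in place, as the PyMultiNest callback interface requires, and the likelihood drops the constant $-\log(k!)$ term, which does not affect parameter estimates.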
---
## Visual Highlights
### Toy Nested Sampling Animation
A minimal 2D implementation of nested sampling illustrates the algorithm’s core idea: removing low-likelihood regions and progressively zooming in on high-probability space.
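A bare-bones version of that loop, under the standard textbook approximations (uniform prior box, geometric prior-volume shrinkage $X_i \approx e^{-i/N}$, and naive rejection sampling for replacement points), can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(1)

def loglike(xy):
    """Toy 2D Gaussian log-likelihood centred at the origin."""
    return -0.5 * np.sum(xy ** 2, axis=-1)

n_live, n_iter = 100, 700
live = rng.uniform(-5.0, 5.0, size=(n_live, 2))   # live points from a uniform prior on [-5, 5]^2
live_logL = loglike(live)

logZ_terms = []
for i in range(n_iter):
    worst = np.argmin(live_logL)
    L_star = live_logL[worst]
    # Prior volume shrinks geometrically: X_i ~ exp(-i / n_live),
    # so the weight of this shell is w_i = X_i - X_{i+1}
    log_w = -i / n_live + np.log(1.0 - np.exp(-1.0 / n_live))
    logZ_terms.append(L_star + log_w)
    # Replace the worst point by rejection-sampling a new point with L > L*
    while True:
        new = rng.uniform(-5.0, 5.0, size=2)
        new_logL = loglike(new)
        if new_logL > L_star:
            live[worst], live_logL[worst] = new, new_logL
            break

# Add the remaining live-point contribution at the final prior volume
log_X_final = -n_iter / n_live
logZ_terms += list(live_logL + log_X_final - np.log(n_live))

logZ = np.logaddexp.reduce(logZ_terms)  # log evidence
```

For this toy problem the evidence is known analytically ($Z \approx 2\pi/100$, i.e. $\log Z \approx -2.77$), which makes it a useful sanity check; real samplers like MultiNest replace the rejection step with far more efficient constrained sampling.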
---
**Dependencies**:
`numpy`, `matplotlib`, `pandas`, `pymultinest`, `corner`, `imageio`
**References**
- Dayan, P., & Abbott, L. F. (2001). *Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems*. MIT Press.
- Feroz, F., Hobson, M. P., & Bridges, M. (2009). MultiNest: an efficient and robust Bayesian inference tool for cosmology and particle physics. *Monthly Notices of the Royal Astronomical Society*, 398(4), 1601–1614.
---
## Project Structure
```bash
├── simulation.ipynb # Generate and visualize synthetic data
├── fitting.ipynb # Toy nested-sampling example, plus the full model fit and analysis
├── run_pymultinest_fit.py # Script to perform inference with PyMultiNest
├── tuning_data.csv # Simulated dataset
├── figures/ # Output figures
├── chains/ # PyMultiNest output files
└── nested_sampling.gif # Animation of toy nested sampling