{"id":26694474,"url":"https://github.com/deeplearnphysics/spine","last_synced_at":"2026-04-07T07:29:20.120Z","repository":{"id":242742090,"uuid":"778546563","full_name":"DeepLearnPhysics/spine","owner":"DeepLearnPhysics","description":"Scalable Particle Imaging with Neural Embeddings","archived":false,"fork":false,"pushed_at":"2026-04-05T21:14:46.000Z","size":8863,"stargazers_count":5,"open_issues_count":13,"forks_count":18,"subscribers_count":3,"default_branch":"main","last_synced_at":"2026-04-05T23:21:25.779Z","etag":null,"topics":["machine-learning","neutrino-physics","particle-imaging","reconstruction"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/DeepLearnPhysics.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2024-03-27T23:23:14.000Z","updated_at":"2026-04-05T21:14:54.000Z","dependencies_parsed_at":"2024-06-04T19:50:30.132Z","dependency_job_id":"1d4be8e3-b266-427e-b8d3-0904798f50af","html_url":"https://github.com/DeepLearnPhysics/spine","commit_stats":null,"previous_names":["deeplearnphysics/spine"],"tags_count":49,"template":false,"template_full_name":null,"purl":"pkg:github/DeepLearnPhysics/spine","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DeepLearnPhysics%2Fspine","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DeepLearnPhysics%2Fspine/tags","releases_url":"https://repos.ecosyste.ms/ap
i/v1/hosts/GitHub/repositories/DeepLearnPhysics%2Fspine/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DeepLearnPhysics%2Fspine/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/DeepLearnPhysics","download_url":"https://codeload.github.com/DeepLearnPhysics/spine/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DeepLearnPhysics%2Fspine/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31504892,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-07T03:10:19.677Z","status":"ssl_error","status_checked_at":"2026-04-07T03:10:13.982Z","response_time":105,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.5:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["machine-learning","neutrino-physics","particle-imaging","reconstruction"],"created_at":"2025-03-26T18:29:41.422Z","updated_at":"2026-04-07T07:29:20.084Z","avatar_url":"https://github.com/DeepLearnPhysics.png","language":"Python","readme":"\u003ch1 align=\"center\"\u003e\n\u003cimg src=\"https://raw.githubusercontent.com/DeepLearnPhysics/spine/main/docs/source/_static/img/spine-logo-dark.png\" alt='SPINE', 
width=\"400\"\u003e\n\u003c/h1\u003e\u003cbr\u003e\n\n[![CI](https://github.com/DeepLearnPhysics/spine/actions/workflows/ci.yml/badge.svg)](https://github.com/DeepLearnPhysics/spine/actions/workflows/ci.yml)\n[![codecov](https://codecov.io/gh/DeepLearnPhysics/spine/branch/main/graph/badge.svg)](https://codecov.io/gh/DeepLearnPhysics/spine)\n[![Documentation Status](https://readthedocs.org/projects/spine/badge/?version=latest)](https://spine.readthedocs.io/latest/)\n[![PyPI version](https://badge.fury.io/py/spine.svg)](https://badge.fury.io/py/spine)\n[![Python version](https://img.shields.io/pypi/pyversions/spine.svg)](https://pypi.org/project/spine/)\n\nThe Scalable Particle Imaging with Neural Embeddings (SPINE) package leverages state-of-the-art Machine Learning (ML) algorithms -- in particular Deep Neural Networks (DNNs) -- to reconstruct particle imaging detector data. This package was primarily developed for Liquid Argon Time-Projection Chamber (LArTPC) data and relies on Convolutional Neural Networks (CNNs) for pixel-level feature extraction and Graph Neural Networks (GNNs) for superstructure formation. The schematic below breaks down the full end-to-end reconstruction flow.\n\n![Full chain](https://raw.githubusercontent.com/DeepLearnPhysics/spine/main/docs/source/_static/img/spine-chain-alpha.png)\n\n## Installation\n\nSPINE is now available on PyPI with flexible installation options to suit different needs:\n\n### Quick Start (Recommended)\n\nFor data analysis and visualization without machine learning:\n\n```bash\npip install spine[all]\n```\n\n### Installation Options\n\n**1. Core Package (minimal dependencies)**\n```bash\n# Essential dependencies: numpy, scipy, pandas, PyYAML, h5py, numba\npip install spine\n```\n\n**2. With Visualization Tools**\n```bash\n# Adds plotly, matplotlib, seaborn for data visualization\npip install spine[viz]\n```\n\n**3. 
Development Environment**\n```bash\n# Adds testing, formatting, and documentation tools\npip install spine[dev]\n```\n\n**4. Everything (except PyTorch)**\n```bash\n# All optional dependencies (visualization + development tools)\npip install spine[all]\n```\n\n### PyTorch ecosystem\n\n#### Option 1: Container Approach (Recommended)\n\nThe easiest way to get a working PyTorch environment with LArCV support:\n\n```bash\n# Pull the SPINE-compatible container with complete PyTorch ecosystem + LArCV\nsingularity pull spine.sif docker://deeplearnphysics/larcv2:ub2204-cu121-torch251-larndsim\n\n# Install SPINE in the container\nsingularity exec spine.sif pip install spine[all]\n\n# Run your analysis\nsingularity exec spine.sif spine --config your_config.cfg --source data.h5\n```\n\n\u003e This container includes: PyTorch 2.5.1, CUDA 12.1, torch-geometric, torch-scatter, torch-cluster, MinkowskiEngine, and **LArCV2**.\n\n#### Option 2: Manual Installation (advanced users)\n```bash\n# Step 1: Install PyTorch with CUDA\npip install torch --index-url https://download.pytorch.org/whl/cu118\n\n# Step 2: Install ecosystem packages (critical order)\npip install --no-build-isolation torch-scatter torch-cluster torch-geometric MinkowskiEngine\n\n# Step 3: Install SPINE\npip install spine[all]\n```\n\n\u003e **Why separate?** The PyTorch ecosystem (torch, torch-geometric, torch-scatter, torch-cluster, MinkowskiEngine) forms an interdependent group requiring exact version compatibility and complex compilation. 
Installing them together, in this order, ensures compatible builds.\n\n### LArCV2\n\n#### Option 1: Use the container (recommended)\n```bash\n# LArCV2 is pre-installed in the DeepLearnPhysics container\nsingularity pull spine.sif docker://deeplearnphysics/larcv2:ub2204-cu121-torch251-larndsim\n```\n\n#### Option 2: Build from source\n```bash\n# Clone and build the latest LArCV2\ngit clone https://github.com/DeepLearnPhysics/larcv2.git\ncd larcv2\n# Follow build instructions in the repository\n```\n\n\u003e **Note**: Avoid conda-forge larcv packages as they may be outdated. Use the container or build from the official source.\n\n### Development Installation\n\nFor developers who want to work with the source code:\n```bash\ngit clone https://github.com/DeepLearnPhysics/spine.git\ncd spine\npip install -e .[dev]\n```\n\n#### Quick Development Testing (No Installation)\n\nFor rapid development and testing without reinstalling the package:\n\n```bash\n# Clone the repository\ngit clone https://github.com/DeepLearnPhysics/spine.git\ncd spine\n\n# Install only the dependencies (not the package itself),\n# or, alternatively, simply run these commands inside the container above\npip install numpy scipy pandas pyyaml h5py numba psutil\n\n# Run directly from source\npython src/spine/bin/run.py --config config/train_uresnet.cfg --source /path/to/data.h5\n\n# Or make it executable and run directly\nchmod +x src/spine/bin/run.py\n./src/spine/bin/run.py --config your_config.cfg --source data.h5\n```\n\n\u003e **💡 Development Tip**: This approach lets you test code changes immediately without reinstalling. 
Perfect for rapid iteration during development.\n\nTo build and test packages locally:\n```bash\n# Build the package\n./build_packages.sh\n\n# Install the locally built wheel with the [all] extras\n# (expand the glob first so pip sees a concrete file name)\npip install \"$(ls dist/spine-*.whl)[all]\"\n```\n\n## Usage\n\n### Command Line Interface\n\n**Option 1: After installation, use the `spine` command:**\n\n```bash\n# Run training/inference/analysis\nspine --config config/train_uresnet.cfg --source /path/to/data.h5\n```\n\n**Option 2: Run directly from source (development):**\n\n```bash\n# From the spine repository directory\npython src/spine/bin/run.py --config config/train_uresnet.cfg --source /path/to/data.h5\n```\n\n### Python API\n\nBasic example:\n```python\n# Necessary imports\nimport yaml\nfrom spine.driver import Driver\n\n# Load configuration file\ncfg_path = 'config/train_uresnet.cfg'  # or your config file\nwith open(cfg_path, 'r') as f:\n    cfg = yaml.safe_load(f)\n\n# Initialize driver class\ndriver = Driver(cfg)\n\n# Execute model following the configuration regimen\ndriver.run()\n```\n\n* Documentation is available at https://spine.readthedocs.io/latest/.\n* Tutorials and examples can be found in the documentation.\n\n### Example Configuration Files\n\nExample configurations are available in the `config` folder:\n\n| Configuration name            | Model          |\n| ------------------------------|----------------|\n| `train_uresnet.cfg`           | UResNet alone  |\n| `train_uresnet_ppn.cfg`       | UResNet + PPN  |\n| `train_graph_spice.cfg`       | GraphSpice     |\n| `train_grappa_shower.cfg`     | GrapPA for shower fragment clustering |\n| `train_grappa_track.cfg`      | GrapPA for track fragment clustering |\n| `train_grappa_inter.cfg`      | GrapPA for interaction clustering |\n\nTo switch from training to inference mode, set `trainval.train: False` in your configuration file.\n\nKey configuration parameters you may want to modify:\n* `batch_size` - batch size for training/inference\n* `weight_prefix` - directory to save 
model checkpoints\n* `log_dir` - directory to save training logs\n* `iterations` - number of training iterations\n* `model_path` - path to checkpoint to load (optional)\n* `train` - boolean flag for training vs inference mode\n* `gpus` - GPU IDs to use (leave empty '' for CPU)\n\nFor more information on storing analysis outputs and running custom analysis scripts, see the documentation on `outputs` (formatters) and `analysis` (scripts) configurations.\n\n### Running A Configuration File\n\nBasic usage with the `spine` command:\n```bash\n# Run training/inference directly\nspine --config config/train_uresnet.cfg --source /path/to/data.h5\n\n# Or run in background with logging\nnohup spine --config config/train_uresnet.cfg --source /path/to/data.h5 \u003e log_uresnet.txt 2\u003e\u00261 \u0026\n```\n\nYou can load a configuration file into a Python dictionary using:\n```python\nimport yaml\n# Load configuration file\nwith open('config/train_uresnet.cfg', 'r') as f:\n    cfg = yaml.safe_load(f)\n```\n\n### Reading a Log\n\nA quick example of how to read a training log and plot a metric:\n```python\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfname = 'path/to/log.csv'\ndf = pd.read_csv(fname)\n\n# plot moving average of accuracy over 10 iterations\ndf.accuracy.rolling(10, min_periods=1).mean().plot()\nplt.ylabel(\"accuracy\")\nplt.xlabel(\"iteration\")\nplt.title(\"moving average of accuracy\")\nplt.show()\n\n# list all column names\nprint(df.columns.values)\n```\n\n### Recording network output or running analysis\n\nDocumentation for analysis tools and output formatting is available in the main documentation at https://spine.readthedocs.io/latest/.\n\n## Repository Structure\n\n* `bin` contains utility scripts for data processing\n* `config` has example configuration files\n* `docs` contains documentation source files\n* `src/spine` contains the main package code\n* `test` contains unit tests using pytest\n\nPlease consult the documentation for detailed 
information about each component.\n\n## Testing and Coverage\n\n### Running Tests\n\nThe SPINE package includes comprehensive unit tests using pytest:\n\n```bash\n# Run all tests\npytest\n\n# Run tests for a specific module\npytest test/test_data/\n\n# Run with verbose output\npytest -v\n```\n\n### Checking Test Coverage\n\nTest coverage tracking helps ensure code quality and identify untested areas. Coverage reports are automatically generated in our CI pipeline and uploaded to [Codecov](https://codecov.io/gh/DeepLearnPhysics/spine).\n\nTo check coverage locally:\n\n```bash\n# Run the coverage script (generates terminal, HTML, and XML reports)\n./bin/coverage.sh\n\n# Or run pytest with coverage flags directly\npytest --cov=spine --cov-report=term --cov-report=html\n\n# View the HTML report\nopen htmlcov/index.html\n```\n\nThe coverage configuration is defined in `pyproject.toml` under `[tool.coverage.run]` and `[tool.coverage.report]`.\n\n## Contributing\n\nBefore you start contributing to the code, please see the [contribution guidelines](CONTRIBUTING.md).\n\n### Adding a new model\n\nThe SPINE framework is designed to be extensible. To add a new model:\n\n1. **Data Loading**: Parsers exist for various sparse tensor and particle outputs in `spine.io.core.parse`. If you need fundamentally different data formats, you may need to add new parsers or collation functions.\n\n2. **Model Implementation**: Add your model to the `spine.model` package. Include your model in the factory dictionary in `spine.model.factories` so it can be found by the configuration system.\n\n3. 
**Configuration**: Create a configuration file in the `config/` folder that specifies your model architecture and training parameters.\n\nOnce these steps are complete, you should be able to train your model using the standard SPINE workflow.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdeeplearnphysics%2Fspine","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdeeplearnphysics%2Fspine","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdeeplearnphysics%2Fspine/lists"}