Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
pynets: A Reproducible Workflow for Structural and Functional Connectome Ensemble Learning
https://github.com/dpys/pynets
- Host: GitHub
- URL: https://github.com/dpys/pynets
- Owner: dPys
- Created: 2017-03-29T14:23:40.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2024-02-01T21:18:32.000Z (10 months ago)
- Last Synced: 2024-11-03T08:50:07.790Z (17 days ago)
- Topics: brain-connectivity, decision-trees, dipy, dmri, ensemble-learning, ensemble-sampling, fmri, fuzzy-logic, graph-neural-networks, gridsearch, multiverse, networks, networkx, nilearn, nipype, optimization, tractography, workflow
- Language: Python
- Homepage: https://pynets.readthedocs.io/en/latest/
- Size: 1010 MB
- Stars: 121
- Watchers: 9
- Forks: 41
- Open Issues: 2
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.rst
- License: licenses/AGPL-3.0.txt
README
PyNets®
=======
[![CircleCI](https://circleci.com/gh/dPys/PyNets.svg?style=svg)](https://circleci.com/gh/dPys/PyNets)
[![codecov](https://codecov.io/gh/dPys/PyNets/branch/master/graph/badge.svg)](https://codecov.io/gh/dPys/PyNets?branch=master)
[![PyPI - Version](https://img.shields.io/pypi/v/pynets.svg)](https://pypi.org/project/pynets/)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/pynets.svg)
[![License: AGPL v3](https://img.shields.io/badge/License-AGPL_v3-blue.svg)](https://www.gnu.org/licenses/agpl-3.0)
[![https://www.singularity-hub.org/static/img/hosted-singularity--hub-%23e32929.svg](https://www.singularity-hub.org/static/img/hosted-singularity--hub-%23e32929.svg)](https://singularity-hub.org/collections/5228)
[![Docker](https://badgen.net/badge/icon/docker?icon=docker&label)](https://www.docker.com/)
[![brainlife.io/app](https://img.shields.io/badge/brainlife.io-app-green.svg)](https://brainlife.io/app/6041f75166d5ce1daf6efb55)

About
-----
PyNets is a tool for sampling and analyzing varieties of individual structural and functional connectomes. Using decision-tree learning, along with extensive bagging and boosting, PyNets is the first application of its kind to facilitate fully reproducible, parametric sampling of connectome ensembles from neuroimaging data. As a post-processing workflow, PyNets is intended for any preprocessed fMRI or dMRI data in native anatomical space, such that it supports normative-referenced connectotyping at the individual level. Toward these ends, it comprehensively integrates best-practice tractography and functional connectivity analysis methods based on open-source libraries such as Dipy and Nilearn, though it is powered primarily by NetworkX and the Nipype workflow engine. PyNets can now also be deployed as a BIDS application, where it takes BIDS derivatives and makes BIDS derivatives.

Install
-------
## Dockerhub (preferred):
```
docker pull dpys/pynets
```
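Once the image is pulled, a containerized run might look something like the sketch below. This is illustrative, not an official invocation: the host paths are placeholders, and it assumes the image exposes the `pynets_bids` entrypoint described in the BIDS usage section further down.

```
# Illustrative only: bind-mount a derivatives directory, an output directory, and a BIDS
# configuration spec into the container, then invoke the pynets_bids CLI
# (assumes the image places pynets on the PATH; all host paths are placeholders).
docker run -ti --rm \
    -v /path/to/derivatives/fmriprep:/inputs \
    -v /path/to/outputs:/outputs \
    -v /path/to/bids_config.json:/config/bids_config.json \
    dpys/pynets \
    pynets_bids /inputs /outputs participant func \
    --participant_label 0025427 --session_label 1 \
    -config /config/bids_config.json
```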
## Manual

Third-Party Dependencies:
* [FSL](https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FslInstallation) version >=5.0.9
* [Python 3.8+](https://www.python.org/download/releases/3.0/) with GUI programming enabled (see [tkinter](https://docs.python.org/3/library/tkinter.html#module-tkinter))
```
[sudo] pip install pynets [--user]
```
or
```
# 1. Install git-lfs
brew install git-lfs                # macOS
[sudo] apt-get install git-lfs      # Linux
# (on Windows, skip to step 2)

# 2. Initialize git-lfs
git lfs install --skip-repo

# Clone the repository and install
git clone https://github.com/dpys/pynets
cd pynets
[sudo] python setup.py install [--user]
```
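Whichever route you take, you can verify that the command-line interface is available; its usage text also documents the flags referenced throughout the examples below:

```
# Print the CLI usage text to confirm the installation
pynets -h
```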
Hardware Requirements
---------------------
4+ vCPUs, 8+ GB of RSS memory, and at least 10 GB of free disk space, though
storage needs can vary considerably depending on the size of the input data
and the type of analysis that you wish to run.

Operating Systems
-----------------
UNIX/macOS 64-bit platforms

Windows 10 or later with [WSL2](https://docs.microsoft.com/en-us/windows/wsl/compare-versions#whats-new-in-wsl-2)
Documentation
-------------
Explore official installation instructions, the user guide, the API, and examples at https://pynets.readthedocs.io/en/latest/.

Citing
------
A manuscript is in preparation, but for now, please cite all uses with the following entry:
```
@misc{PyNets,
title = {PyNets: A Reproducible Workflow for Structural and Functional Connectome Ensemble Learning},
author = {Pisner, Derek A and Hammonds, Ryan B.},
howpublished = {Poster session presented at: Annual Meeting of the Organization for Human Brain Mapping},
url = {https://github.com/dPys/PyNets},
year = {2020},
month = {June}
}
```

Data already preprocessed with BIDS apps like fmriprep, CPAC, or dmriprep? If your BIDS derivatives can be queried with pybids, then you should be able to run them with the user-friendly `pynets_bids` CLI!
```
pynets_bids '/hnu/fMRIprep/fmriprep' '/Users/dPys/outputs/pynets' participant func \
    --participant_label 0025427 0025428 --session_label 1 2 \
    -config pynets/config/bids_config.json
```
*Note: If you preprocessed your BOLD data using fMRIprep, then you will need to have specified either `T1w` or `anat` in the list of fmriprep `--output-spaces`. Similarly, if you preprocessed your data using CPAC, then you will want to be sure that an ALFF image exists. PyNets does NOT currently accept template-normalized BOLD or DWI data. See the usage docs for more information on compatible file types.*

Here, the `-config` flag specifies the path to a .json configuration spec that includes at least one of many possible connectome recipes to apply to your data. Pre-built configuration files are available (see: ), and an example is shown here (with commented descriptions):
```
{
    "func": {  # fMRI options. If you only have functional (i.e. BOLD) data, set each of the `dwi` options to "None"
        "ct": "None",  # Indicates the type(s) of clustering that will be used to generate a clustering-based parcellation. This should be left as "None" if no clustering will be performed, but can be included simultaneously with `-a`.
        "k": "None",  # Indicates the number of clusters to generate in a clustering-based parcellation. This should be left as "None" if no clustering will be performed.
        "hp": "['0', '0.028', '0.080']",  # Indicates the high-pass frequenc(ies) to apply to signal extraction from nodes.
        "mod": "['partcorr', 'cov']",  # Indicates the functional connectivity estimator(s) to use. At least 1 is required for functional connectometry.
        "sm": "['0', '4']",  # Indicates the smoothing FWHM value(s) to apply during the nodal time-series signal extraction.
        "es": "['mean', 'median']"  # Indicates the method(s) of nodal time-series signal extraction.
    },
    "dwi": {  # dMRI options. If you only have structural (i.e. DWI) data, set each of the `func` options to "None"
        "dg": "det",  # The traversal method of tractography (e.g. deterministic, probabilistic)
        "ml": "40",  # The minimum length criterion for streamlines in tractography
        "mod": "csd",  # The diffusion model type
        "em": "8"  # The tolerance distance (in the units of the streamlines, usually mm). If any node in the streamline is within this distance from the center of any voxel in the ROI, then the connection is counted as an edge.
    },
    "gen": {  # These are general options that apply to all modalities
        "a": "['BrainnetomeAtlasFan2016', 'atlas_harvard_oxford', 'destrieux2009_rois']",  # Anatomical atlases to define nodes.
        "bin": "False",  # Binarize the resulting connectome graph before analyzing it. Note that undirected weighted graphs are analyzed by default.
        "embed": "False",  # Activate any of several available graph embedding methods.
        "mplx": 0,  # If both functional and structural data are provided, this parameter [0-3] indicates the type of multiplex connectome modeling to perform. See `pynets -h` for more details on multiplex modes.
        "n": "['Cont', 'Default']",  # Which, if any, Yeo-7/17 resting-state sub-networks to select from the given parcellation. If multiple are specified, all other options will iterate across each.
        "norm": "['6']",  # Level of normalization to apply to the graph (e.g. standardize between 0-1, Pass-to-Ranks (PTR), log10).
        "spheres": "False",  # Use spheres as nodes (vs. parcel labels, the default).
        "ns": "None",  # If `spheres` is True, this indicates integer radius size(s) of spherical centroid nodes.
        "p": "['1']",  # Apply anti-fragmentation, largest connected-component subgraph selection, or any of a variety of hub-detection methods to the graph(s).
        "plt": "False",  # Activate plotting (adjacency matrix and glass-brain included by default).
        "thr": 1.0,  # A threshold (0.0-1.0). This can be left as "None" if multi-thresholding is used.
        "max_thr": 0.80,  # If performing multi-thresholding, a maximum threshold.
        "min_thr": 0.20,  # If performing multi-thresholding, a minimum threshold.
        "step_thr": 0.10,  # If performing multi-thresholding, a threshold interval size.
        "dt": "False",  # Global thresholding to achieve a target density. (Only one of `mst`, `dt`, and `df` can be used).
        "mst": "True",  # Local thresholding using the Minimum-Spanning Tree approach. (Only one of `mst`, `dt`, and `df` can be used).
        "df": "False",  # Local thresholding using a disparity filter. (Only one of `mst`, `dt`, and `df` can be used).
        "vox": "'2mm'"  # Voxel size (1mm or 2mm). 2mm is the default.
    }
}
```
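For instance, a functional-only analysis might use a pared-down spec along the lines of the sketch below. The values are illustrative; the `dwi` options are set to "None" as described above, and a `gen` block like the one in the full example would normally be included as well.

```
{
    "func": {
        "ct": "None",
        "k": "None",
        "hp": "['0']",
        "mod": "['partcorr']",
        "sm": "['0']",
        "es": "['mean']"
    },
    "dwi": {
        "dg": "None",
        "ml": "None",
        "mod": "None",
        "em": "None"
    }
}
```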
Data not in BIDS format and/or preprocessed using in-house tools? No problem -- you can still run pynets manually:
```
# -id   : an arbitrary subject identifier; the first positional path is an arbitrary output directory in which to store derivatives of the workflow.
# -func : the preprocessed fMRI BOLD image data.
# -anat : the T1w anatomical image (mandatory -- PyNets requires a T1/T2-weighted anatomical image unless you are analyzing raw graphs that have already been produced).
# -a    : an anatomical atlas name; alternatively, pass a custom parcellation file to `-a`, or a valid clustering mask (`-cm`) to generate an individual parcellation. For a complete catalogue of anatomical atlases available in PyNets, see the `Usage` section of the documentation.
# -mod  : the connectivity model (in the case of structural connectometry, this becomes the diffusion model type).
# -thr  : optionally apply a single proportional threshold to the generated graph.
pynets -id '002_1' '/Users/dPys/outputs/pynets' \
    -func '/Users/dPys/PyNets/tests/examples/sub-002/ses-1/func/BOLD_PREPROCESSED_IN_ANAT_NATIVE.nii.gz' \
    -anat '/Users/dPys/PyNets/tests/examples/sub-002/ses-1/anat/ANAT_PREPROCESSED_NATIVE.nii.gz' \
    -a 'BrainnetomeAtlasFan2016' \
    -mod 'partcorr' \
    -thr 0.20
```

A structural (dMRI) connectometry run follows the same pattern:

```
# -id   : an arbitrary subject identifier; the first positional path is an arbitrary output directory in which to store derivatives of the workflow.
# -dwi / -bval / -bvec : the dMRI diffusion-weighted image data, b-values, and b-vectors.
# -anat : the T1w anatomical image.
# -a    : the parcellation(s).
# -mod  : the (diffusion) connectivity model(s).
# -dg   : the tractography traversal method(s).
# -mst -min_thr 0.20 -max_thr 0.80 -step_thr 0.10 : multi-thresholding from the Minimum-Spanning Tree (thresholds from 0.20 to 0.80 in steps of 0.10), with AUC graph analysis.
# -n    : the resting-state network definition used to restrict node-making.
pynets -id '002_1' '/Users/dPys/outputs/pynets' \
    -dwi '/Users/dPys/PyNets/tests/examples/sub-002/ses-1/dwi/DWI_PREPROCESSED_NATIVE.nii.gz' \
    -bval '/Users/dPys/PyNets/tests/examples/sub-002/ses-1/dwi/BVAL.bval' \
    -bvec '/Users/dPys/PyNets/tests/examples/sub-002/ses-1/dwi/BVEC.bvec' \
    -anat '/Users/dPys/PyNets/tests/examples/sub-002/ses-1/anat/ANAT_PREPROCESSED_NATIVE.nii.gz' \
    -a '/Users/dPys/.atlases/MyCustomParcellation-scale1.nii.gz' '/Users/dPys/.atlases/MyCustomParcellation-scale2.nii.gz' \
    -mod 'csd' 'csa' 'sfm' \
    -dg 'prob' 'det' \
    -mst -min_thr 0.20 -max_thr 0.80 -step_thr 0.10 \
    -n 'Default'
```

![Multiplex Layers](docs/_static/multiplex.png)
![Multiplex Glass](docs/_static/glassbrain.png)
![Ensemble Connectome](docs/_static/Omnetome_mat.png)
![Yeo7](docs/_static/yeo7_mosaic.png)
![ICC](docs/_static/NodeWiseICC.png)
![Workflow DAG](docs/_static/Workflow.png)