# DPM360 [![Downloads](https://pepy.tech/badge/dpm360-lightsaber)](https://pepy.tech/project/dpm360-lightsaber)
Repository for Disease Progression Modeling workbench 360 - An end-to-end deep learning model training framework in Python on OHDSI-OMOP data

An overview and a YouTube demonstration are available [here](https://biomedsciai.github.io/DPM360/). License, contribution, and publication information is also available there.

*DPM360 Component View*

# Installation Guides

DPM360 components are interoperable but can also be used as independent tools. DPM360 is typically installed on a cluster as a set of interconnected micro-services. The components fall broadly into two groups: (i) components concerned with setting up the micro-services and (ii) standalone Python packages providing the core functional capabilities. The two groups have separate installation procedures; please see the guides below to install each component.

## DPM360 micro-service utilities

One of the key micro-service utilities is the installer, which sets up an OHDSI stack (Atlas, WebAPI, a Postgres database, and Achilles) in a cloud cluster such as Kubernetes or OpenShift. See the [installation guide](installer/docs/installer.md) for details; its [Express Installation Script section](installer/docs/installer.md#express-installation-script) describes the minimal setup steps. You can also follow the [**non-cloud-cluster setup**](installer/docs/non_cluster_install.md) if you want to try the OHDSI stack without a cluster. Using this component you can:
- run an OMOP CDM database using Postgres on your cloud cluster
- run Atlas, WebAPI, and other OHDSI services against that database
- run a Model Registry using MLflow where your trained models are registered (see the registration sketch below)
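
As a minimal sketch of how a trained model could be registered with the MLflow-backed Model Registry that the installer stands up, the snippet below logs and registers a toy scikit-learn model. The tracking URI, experiment name, and model name are hypothetical placeholders, not values provided by DPM360.

```python
# Sketch: log and register a scikit-learn model with an MLflow Model Registry.
# The tracking URI, experiment name, and model name are hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

mlflow.set_tracking_uri("http://mlflow.example-cluster.local:5000")  # placeholder URI
mlflow.set_experiment("dpm360-demo")                                  # placeholder experiment

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

with mlflow.start_run():
    model = LogisticRegression(max_iter=500).fit(X, y)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # registered_model_name adds the logged model to the Model Registry
    mlflow.sklearn.log_model(model, "model", registered_model_name="dpm360-demo-model")
```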

The service builder component packages the trained models and deploys them to the target cloud cluster. See the [installation guide](service_builder/docs/README.md) for details. Using this component you can:
- turn a model registered in the Model Registry into a micro-service deployed with KFServing
- test and interact with the deployed model micro-service via a Swagger-based interface (see the request sketch below)
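
For illustration, a deployed model that follows the KFServing V1 prediction protocol can also be queried directly over HTTP. The host, model name, and feature payload below are hypothetical; consult the service builder guide for the actual endpoint exposed by your deployment.

```python
# Sketch: query a model micro-service using the KFServing V1 prediction protocol.
# Host, model name, and payload are hypothetical placeholders.
import requests

host = "http://dpm360-models.example-cluster.local"  # placeholder service host
model_name = "dpm360-demo-model"                     # placeholder model name

payload = {"instances": [[0.1, 0.2, 0.3, 0.4]]}      # one feature vector per instance

resp = requests.post(f"{host}/v1/models/{model_name}:predict", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # e.g. {"predictions": [...]}
```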

## DPM360 standalone Python packages - enabling deep learning model training in Python on OHDSI-OMOP data

The [lightsaber](lightsaber/docs/index.md) component is an extensible Python training framework that provides blueprints for developing disease progression models. See the [installation guide](lightsaber/docs/install.md), and the [user guide](lightsaber/docs/user_guide.md) for data loading and training details. Using this component you can:

- develop machine learning models using extensible data loaders and training pipelines
- use extensible data loaders designed for time-series datasets extracted from OHDSI and other EMRs
- use scikit-learn and PyTorch Lightning based training pipelines with pre-defined networks and loss functions for processing time-series datasets (see the training sketch below)
- save and register your trained models, along with experiment artifacts, in the Model Registry
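
Lightsaber's own trainers, data loaders, and network blueprints are documented in its user guide. As a rough illustration of the PyTorch Lightning style of pipeline it builds on, a minimal time-series module and training loop might look like the following; the network, data, and hyperparameters are toy placeholders, not lightsaber's actual blueprints.

```python
# Minimal PyTorch Lightning sketch of the kind of time-series training pipeline
# lightsaber builds on. Data, network, and hyperparameters are toy placeholders.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class ToyProgressionModel(pl.LightningModule):
    def __init__(self, n_features=8, hidden=16):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        _, h = self.rnn(x)       # h: (num_layers, batch, hidden)
        return self.head(h[-1])  # one logit per patient

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.binary_cross_entropy_with_logits(self(x).squeeze(-1), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Toy time-series data: 64 patients, 10 time steps, 8 features each.
x = torch.randn(64, 10, 8)
y = torch.randint(0, 2, (64,)).float()
loader = DataLoader(TensorDataset(x, y), batch_size=16)

trainer = pl.Trainer(max_epochs=2, logger=False, enable_checkpointing=False)
trainer.fit(ToyProgressionModel(), loader)
```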

The [cohort tools](cohort_tools/docs/index.md) component provides Python scripts to extract features from cohorts defined via ATLAS or custom queries. It enables [integration with lightsaber](cohort_tools/docs/user_guide.md) so that features extracted from OHDSI databases can be used for model training. Using this component you can:
- define and extract features from the OMOP CDM database for cohorts defined with Atlas or other tools
- export the extracted features as CSV files and use them as inputs for lightsaber (see the hand-off sketch below)
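
As a hedged illustration of that hand-off, features exported as CSV can be read back with pandas and split into the feature matrix and label vector a training pipeline expects; the file name and label column below are hypothetical placeholders.

```python
# Sketch: read cohort features exported to CSV and split them for training.
# The file name and label column are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split

features = pd.read_csv("cohort_features.csv")  # placeholder export from cohort tools
y = features.pop("outcome")                    # placeholder label column
X_train, X_test, y_train, y_test = train_test_split(
    features, y, test_size=0.2, random_state=0
)
print(X_train.shape, X_test.shape)
```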