https://github.com/bentoml/pneumonia-detection-demo

Pneumonia Detection - Healthcare Imaging Application built with BentoML and fine-tuned Vision Transformer (ViT) model
# Pneumonia Detection with BentoML

Healthcare AI 🫁🔍 - Made Easy with BentoML

Powered by BentoML 🍱 + HuggingFace 🤗

## 📖 Introduction 📖
In this project, we showcase how to integrate an image classification model into a service using BentoML. Leveraging the pretrained `nickmuchi/vit-finetuned-chest-xray-pneumonia` model from HuggingFace, users can submit lung X-ray images for analysis, and the model predicts whether the image shows signs of pneumonia.

📝 **Disclaimer: Please note that this project is not intended to replace professional medical advice. It is designed purely for demonstration and testing purposes. Always consult with a qualified healthcare professional for a proper diagnosis.**

| Normal | Pneumonia |
|------- |----------------------------------------- |
| ![Normal](samples/NORMAL2-IM-1427-0001.jpeg)| ![Pneumonia](samples/person1950_bacteria_4881.jpeg) |

## 🏃‍♂️ Running the Service 🏃‍♂️
### BentoML CLI
Clone the repository and install the dependencies:
```bash
git clone https://github.com/bentoml/Pneumonia-Detection-demo.git && cd Pneumonia-Detection-demo

pip install -r requirements/pypi.txt
```

To serve the model with BentoML:
```bash
bentoml serve
```

You can then open your browser at http://127.0.0.1:3000 and interact with the service through Swagger UI.

### Containers
We provide two pre-built containers optimized for CPU and GPU usage, respectively.

To run the service, you'll need a container engine such as Docker, Podman, etc. Quickly test the service by running the appropriate container:

```bash
# cpu
docker run -p 3000:3000 ghcr.io/bentoml/pneumonia-detection-demo:cpu

# gpu
docker run --gpus all -p 3000:3000 ghcr.io/bentoml/pneumonia-detection-demo:gpu
```

## 🌐 Interacting with the Service 🌐
BentoML's default model serving method is through an HTTP server. In this section, we demonstrate various ways to interact with the service:
### cURL
```bash
curl -X 'POST' \
'http://localhost:3000/v1/classify' \
-H 'accept: application/json' \
-H 'Content-Type: image/jpeg' \
--data-binary '@path-to-image'
```
> Replace `path-to-image` with the file path of the image you want to send to the service.

The response looks like:
```json
{"class_name":"NORMAL"}
```
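The same request can be made from Python with only the standard library. The sketch below mirrors the cURL call above; the endpoint URL and the sample image path are taken from this README, so adjust them for your setup.

```python
# Minimal Python equivalent of the cURL example, using only the standard library.
import json
import urllib.request


def build_classify_request(image_bytes: bytes,
                           url: str = "http://localhost:3000/v1/classify") -> urllib.request.Request:
    """Build a POST request carrying the raw image bytes to the classify endpoint."""
    return urllib.request.Request(
        url,
        data=image_bytes,
        headers={"accept": "application/json", "Content-Type": "image/jpeg"},
        method="POST",
    )


if __name__ == "__main__":
    # Requires the service from `bentoml serve` to be running locally.
    with open("samples/NORMAL2-IM-1427-0001.jpeg", "rb") as f:
        req = build_classify_request(f.read())
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))  # e.g. {"class_name": "NORMAL"}
```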
### Via BentoClient 🐍
To send requests in Python, use ``bentoml.client.Client``. Check out `client.py` in the repository for the example code.

### Swagger UI
You can use Swagger UI to quickly explore the available endpoints of any BentoML service.

## 🚀 Deploying to Production 🚀
Effortlessly transition your project into a production-ready application using [BentoCloud](https://www.bentoml.com/bento-cloud/), a platform for managing and deploying machine learning models.

Start by creating a BentoCloud account. Once you've signed up, log in to your BentoCloud account using the command:

```bash
bentoml cloud login --api-token <your-api-token> --endpoint <your-endpoint>
```
> Note: Replace `<your-api-token>` and `<your-endpoint>` with your specific API token and the BentoCloud endpoint respectively.

Next, build your BentoML service using the `build` command:

```bash
bentoml build
```

Then, push your freshly-built Bento service to BentoCloud using the `push` command:

```bash
bentoml push
```

Lastly, deploy this application to BentoCloud with a single `bentoml deployment create` command following the [deployment instructions](https://docs.bentoml.org/en/latest/reference/cli.html#bentoml-deployment-create).

BentoML offers a number of options for deploying and hosting online ML services in production. Learn more at [Deploying a Bento](https://docs.bentoml.org/en/latest/concepts/deploy.html).

## 👥 Community 👥
BentoML has a thriving open source community where thousands of ML/AI practitioners are
contributing to the project, helping other users and discussing the future of AI. 👉 [Pop into our Slack community!](https://l.bentoml.com/join-slack)