Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/kunalshelke90/xray_image_classification
This project classifies chest X-ray images into Pneumonia and Normal using a CNN model. It includes deployment via Streamlit, enabling interactive web-based predictions and real-time analysis of X-ray images.
- Host: GitHub
- URL: https://github.com/kunalshelke90/xray_image_classification
- Owner: kunalshelke90
- License: mit
- Created: 2024-09-08T17:13:31.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2024-09-14T17:46:02.000Z (2 months ago)
- Last Synced: 2024-10-12T23:42:15.410Z (about 1 month ago)
- Topics: classification, computer-vision, data-augmentation, deep-learning, docker, python, streamlit
- Language: Jupyter Notebook
- Homepage:
- Size: 1.59 MB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## Lung X-ray Classification with Deep Learning
In this project, I developed an X-ray lung classification model using deep learning techniques to detect lung abnormalities. Leveraging the PyTorch framework, I trained a convolutional neural network (CNN) to classify chest X-rays as either healthy or showing signs of lung disease. The model was then integrated into a web API for real-time predictions and deployed using Docker for scalability and accessibility.
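The paragraph above names PyTorch and a CNN but does not show the architecture. A minimal sketch of what such a binary chest X-ray classifier could look like is below; the layer sizes, names, and input resolution are assumptions for illustration, not the repository's actual model:

```python
import torch
import torch.nn as nn

class XrayCNN(nn.Module):
    """Small CNN for two-class chest X-ray classification (e.g. Normal vs Pneumonia)."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # single-channel (grayscale) input
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 224x224 -> 112x112
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 112x112 -> 56x56
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dims to 1x1
            nn.Flatten(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = XrayCNN()
logits = model(torch.randn(4, 1, 224, 224))  # batch of 4 grayscale 224x224 images
```

The logits would typically be passed through a softmax (or trained with `nn.CrossEntropyLoss`, which applies it internally) to obtain class probabilities.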
## Workflow
- `constants`: Defines global constants used across the project, such as paths, configurations, and hyperparameters.
- `config_entity`: Contains configuration classes that manage the settings required for each stage of the workflow.
- `artifact_entity`: Defines classes that represent the artifacts generated at each stage of the machine learning pipeline.
- `components`: Houses the core modules for data ingestion, model training, evaluation, and prediction.
- `pipeline`: Orchestrates the workflow by connecting the components and executing the machine learning pipeline.
- `main`: Entry point of the project; triggers execution of the pipeline and handles end-to-end processing.

## How to setup
```bash
conda create -p env python=3.8 -y
```
```bash
conda activate env
```
```bash
git clone https://github.com/kunalshelke90/Xray_Image_Classification.git
```
```bash
cd Xray_Image_Classification
```
```bash
pip install -r requirements.txt
```
- Set up the AWS CLI by following the official installation guide: https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html
```bash
aws configure
```
- When prompted, provide the following information:
```bash
AWS Access Key ID=
AWS Secret Access Key=
Default region name=
```
- To run the app with Streamlit:
```bash
python app.py
```
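The modules listed in the Workflow section follow a config-in, artifact-out pattern that is common in ML pipelines: each component takes a config object, does its work, and returns an artifact that the next component consumes. The sketch below illustrates that structure with stdlib Python only; every class, field, and path name is invented for illustration and does not reflect the repository's actual API:

```python
from dataclasses import dataclass
from pathlib import Path

# config_entity: settings consumed by each pipeline stage (illustrative names)
@dataclass
class DataIngestionConfig:
    raw_data_dir: Path = Path("data/raw")

@dataclass
class ModelTrainerConfig:
    epochs: int = 10
    learning_rate: float = 1e-3

# artifact_entity: outputs produced by each stage
@dataclass
class DataIngestionArtifact:
    train_dir: Path
    test_dir: Path

@dataclass
class ModelTrainerArtifact:
    model_path: Path

# components: each component turns a config (plus upstream artifacts) into an artifact
class DataIngestion:
    def __init__(self, config: DataIngestionConfig):
        self.config = config

    def run(self) -> DataIngestionArtifact:
        # a real component would download and split the X-ray dataset here
        return DataIngestionArtifact(
            train_dir=self.config.raw_data_dir / "train",
            test_dir=self.config.raw_data_dir / "test",
        )

class ModelTrainer:
    def __init__(self, config: ModelTrainerConfig):
        self.config = config

    def run(self, data: DataIngestionArtifact) -> ModelTrainerArtifact:
        # a real component would train the CNN on data.train_dir
        return ModelTrainerArtifact(model_path=Path("artifacts/model.pt"))

# pipeline: wires the components together, as main would trigger it
def run_pipeline() -> ModelTrainerArtifact:
    ingestion = DataIngestion(DataIngestionConfig())
    trainer = ModelTrainer(ModelTrainerConfig())
    return trainer.run(ingestion.run())

artifact = run_pipeline()
```

Keeping configs and artifacts as plain dataclasses makes each stage independently testable and makes the data flow between stages explicit.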