https://github.com/aryansingla45/ml-pipelines-studentmodel
This project implements separate pipelines for the main machine learning steps: data ingestion, data transformation, model training, and prediction with the trained model. It also uses a Flask app, hosted on the local machine, to serve predictions.
- Host: GitHub
- URL: https://github.com/aryansingla45/ml-pipelines-studentmodel
- Owner: aryansingla45
- License: cc0-1.0
- Created: 2024-05-22T18:39:55.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2024-07-12T06:18:05.000Z (10 months ago)
- Last Synced: 2025-02-02T02:34:28.651Z (3 months ago)
- Topics: aws, continous-integration, continuous-deployment, flask, machine-learning, pipeline, python
- Language: Python
- Homepage:
- Size: 1.05 MB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
[CI workflow (GitHub Actions)](https://github.com/aryansingla45/ML-Pipelines-StudentModel/actions/workflows/main.yml)
# ML-Pipelines-StudentModel
This project hosts a web app, built with Flask, that predicts a student's math score from various details about the student.
The project also has separate pipelines for data ingestion, data transformation, model training, and prediction, which make it seamless to generate predictions on new data.
### Running the project

1. Setting up the Virtual Environment
Before running any of the commands below, set up a virtual environment so that there are no environment-related errors.
```
conda create -p venv
conda activate venv/
```

2. Install all the dependencies and requirements:
Run this command to install all the requirements used in this project.
```
pip install -r requirements.txt
```

3. Data Ingestion:
```
python src/components/data_ingestion.py
```
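Conceptually, a data ingestion component for this kind of project reads the raw CSV, splits it into train and test sets, and saves them as artifacts for the later steps. The sketch below is illustrative only; the file paths and function names are assumptions, not the repository's actual code.

```python
# Illustrative data ingestion step (paths and names are assumptions).
import os
import pandas as pd
from sklearn.model_selection import train_test_split

RAW_DATA_PATH = "notebook/data/students.csv"  # hypothetical location of the raw CSV
ARTIFACT_DIR = "artifacts"

def ingest(raw_path: str = RAW_DATA_PATH, artifact_dir: str = ARTIFACT_DIR) -> tuple[str, str]:
    """Read the raw dataset, split it 80/20, and save the splits as CSV artifacts."""
    df = pd.read_csv(raw_path)
    os.makedirs(artifact_dir, exist_ok=True)

    df.to_csv(os.path.join(artifact_dir, "raw.csv"), index=False)
    train_df, test_df = train_test_split(df, test_size=0.2, random_state=42)

    train_path = os.path.join(artifact_dir, "train.csv")
    test_path = os.path.join(artifact_dir, "test.csv")
    train_df.to_csv(train_path, index=False)
    test_df.to_csv(test_path, index=False)
    return train_path, test_path

if __name__ == "__main__":
    print(ingest())
```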
4. Data Transformation:
```
python src/components/data_transformation.py
```
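The transformation step typically builds a preprocessing object that imputes, encodes, and scales the features before they reach the model. Here is a minimal sketch using scikit-learn; the column names are assumptions based on a typical student-performance dataset, not necessarily the ones used in this repository.

```python
# Illustrative preprocessing pipeline (column names are assumptions).
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

NUMERIC_FEATURES = ["reading_score", "writing_score"]
CATEGORICAL_FEATURES = ["gender", "lunch", "test_preparation_course"]

def build_preprocessor() -> ColumnTransformer:
    """Impute and scale numeric columns; impute and one-hot encode categorical columns."""
    numeric_pipeline = Pipeline([
        ("impute", SimpleImputer(strategy="median")),
        ("scale", StandardScaler()),
    ])
    categorical_pipeline = Pipeline([
        ("impute", SimpleImputer(strategy="most_frequent")),
        ("encode", OneHotEncoder(handle_unknown="ignore")),
    ])
    return ColumnTransformer([
        ("numeric", numeric_pipeline, NUMERIC_FEATURES),
        ("categorical", categorical_pipeline, CATEGORICAL_FEATURES),
    ])
```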
5. Model Training:
```
python src/components/model_training.py
```
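The training step fits a regressor on the transformed arrays, evaluates it (for example with R²), and saves the fitted model so the Flask app can load it later. A hedged sketch follows; the model choice, the "target is the last column" convention, and the artifact path are assumptions.

```python
# Illustrative model training step (model choice and paths are assumptions).
import pickle
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

def train_model(train_arr: np.ndarray, test_arr: np.ndarray,
                model_path: str = "artifacts/model.pkl") -> float:
    """Fit a regressor, report the test R^2 score, and persist the model."""
    # Assumes the target (math score) is stored in the last column of each array.
    X_train, y_train = train_arr[:, :-1], train_arr[:, -1]
    X_test, y_test = test_arr[:, :-1], test_arr[:, -1]

    model = RandomForestRegressor(random_state=42)
    model.fit(X_train, y_train)

    score = r2_score(y_test, model.predict(X_test))
    with open(model_path, "wb") as f:
        pickle.dump(model, f)
    return score
```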
If you want to run all the pipelines together, just run the data ingestion pipeline; if not, you can edit the code and comment out the steps you do not need.

6. Running the Flask App
```
python app.py
```
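The Flask app serves a form and returns a predicted math score for the submitted details. The sketch below shows what such an `app.py` could look like; the artifact paths, template name, and form fields are assumptions rather than the repository's actual code.

```python
# Illustrative Flask app (artifact paths, template and field names are assumptions).
import pickle
import pandas as pd
from flask import Flask, render_template, request

app = Flask(__name__)

with open("artifacts/preprocessor.pkl", "rb") as f:  # hypothetical artifact path
    preprocessor = pickle.load(f)
with open("artifacts/model.pkl", "rb") as f:         # hypothetical artifact path
    model = pickle.load(f)

@app.route("/predict", methods=["GET", "POST"])
def predict():
    if request.method == "GET":
        return render_template("home.html")

    # Collect the submitted form fields into a single-row DataFrame.
    features = pd.DataFrame([request.form.to_dict()])
    # Form values arrive as strings; cast the numeric ones where possible.
    for col in features.columns:
        try:
            features[col] = pd.to_numeric(features[col])
        except (ValueError, TypeError):
            pass

    prediction = model.predict(preprocessor.transform(features))[0]
    return render_template("home.html", results=round(float(prediction), 2))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```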
7. Running the Streamlit App

```
streamlit run application.py
```
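The Streamlit app offers an alternative front end for the same model. Here is a minimal sketch of what `application.py` might contain; the input fields and artifact paths are assumptions.

```python
# Illustrative Streamlit front end (input fields and paths are assumptions).
import pickle
import pandas as pd
import streamlit as st

st.title("Student Math Score Prediction")

reading_score = st.number_input("Reading score", 0, 100, 70)
writing_score = st.number_input("Writing score", 0, 100, 70)
gender = st.selectbox("Gender", ["female", "male"])

if st.button("Predict"):
    with open("artifacts/preprocessor.pkl", "rb") as f:  # hypothetical artifact path
        preprocessor = pickle.load(f)
    with open("artifacts/model.pkl", "rb") as f:         # hypothetical artifact path
        model = pickle.load(f)

    row = pd.DataFrame([{
        "reading_score": reading_score,
        "writing_score": writing_score,
        "gender": gender,
    }])
    prediction = model.predict(preprocessor.transform(row))[0]
    st.success(f"Predicted math score: {prediction:.1f}")
```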