Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/sujith-kamme/cross-sell-insight
Creating a predictive machine learning model to determine customer receptiveness to cross-sell pitches
- Host: GitHub
- URL: https://github.com/sujith-kamme/cross-sell-insight
- Owner: sujith-kamme
- License: mit
- Created: 2024-01-13T23:38:46.000Z (12 months ago)
- Default Branch: main
- Last Pushed: 2024-01-24T23:22:19.000Z (12 months ago)
- Last Synced: 2024-11-10T07:16:49.194Z (about 2 months ago)
- Topics: aws-ec2, aws-s3, docker, github-actions, machine-learning
- Language: HTML
- Homepage:
- Size: 1.92 MB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Cross-Sell-Insight
Creating a predictive machine learning model to determine customer receptiveness to cross-sell pitches.
![UI](https://github.com/sujith-kamme/Cross-Sell-Insight/assets/142932988/e062a15e-8bcc-48d4-bb18-b7563a3a781f)

## **Coding Workflow:**
### Project Setup
1. Virtual environment (venv) created.
2. Automated project file structure creation - `template.py`.
3. Updated `requirements.txt` and `setup.py`.
4. Installed packages in the venv:
```bash
pip install -r requirements.txt
```

### **Pipelines Implementation**
1. Implemented custom logging functionality for the project under `src/code/logging/__init__.py`.
2. Implemented common utility functions under `utils/common.py`.

**Data Ingestion and Validation**
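Concretely, the ingestion and validation steps enumerated below might boil down to something like this minimal sketch. The column names and dtype keys here are hypothetical illustrations, not taken from the repo's actual `schema.yaml`:

```python
import pandas as pd

# Hypothetical schema: column name -> expected dtype (the real keys live in schema.yaml)
SCHEMA = {"Age": "int64", "Annual_Premium": "float64", "Response": "int64"}

def ingest(csv_path: str) -> pd.DataFrame:
    """Load the raw cross-sell dataset from a CSV file."""
    return pd.read_csv(csv_path)

def validate(df: pd.DataFrame, schema: dict) -> bool:
    """Check that every expected column is present with the expected dtype."""
    for col, dtype in schema.items():
        if col not in df.columns or str(df[col].dtype) != dtype:
            return False
    return True
```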
1. Updated `config.yaml`.
2. Updated `schema.yaml`.
3. Updated `constants/__init__.py`.
4. Updated `entity/entityconfig.py`.
5. Updated `src/code/config/config.py`.
6. Updated `components/data_ingestion.py`.
7. Updated `components/data_validation.py`.
8. Updated `pipeline/stage1_data_ingestion_and_validation.py`.
9. Updated `main.py` file.

**Data Transformation**
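A rough sketch of what the transformation stage listed below could do, assuming a standard split-and-scale approach (the target column name `Response` is an assumption):

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def transform(df: pd.DataFrame, target: str = "Response"):
    """Split into train/test sets and standardise the feature columns."""
    X = df.drop(columns=[target])
    y = df[target]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )
    scaler = StandardScaler()
    X_train = scaler.fit_transform(X_train)  # fit statistics on training data only
    X_test = scaler.transform(X_test)        # reuse them on the test split
    return X_train, X_test, y_train, y_test
```

Fitting the scaler on the training split only avoids leaking test-set statistics into training.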
1. Updated `config.yaml`.
2. Updated `schema.yaml`.
3. Updated `entity/entityconfig.py`.
4. Updated `src/code/config/config.py`.
5. Updated `components/data_transformation.py`.
6. Updated `pipeline/stage2_data_transformation.py`.
7. Updated `main.py` file.

**Model Training**
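The training stage below might reduce to a component like this sketch. The model choice and hyperparameters here are illustrative placeholders; the real values live in `param.yaml`:

```python
import joblib
from sklearn.ensemble import RandomForestClassifier

def train(X_train, y_train, model_path="model.joblib"):
    """Fit a classifier and persist it; hyperparameters would come from param.yaml."""
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    joblib.dump(model, model_path)  # persisted artifact, later registered/served
    return model
```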
1. Updated `config.yaml`.
2. Updated `entity/entityconfig.py`.
3. Updated `src/code/config/config.py`.
4. Updated `components/model_training.py`.
5. Updated `pipeline/stage3_model_training.py`.
6. Updated `main.py` file.
7. Updated `param.yaml`.

**Setup remote access to enable collaboration** - with DagsHub (MLflow)
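The tracking-URI exports shown later in this README can equivalently be set from Python before running the pipeline. The URI and username are taken from this repository's setup; the access token should come from a secret, which is why none appears here:

```python
import os

# Point MLflow at the DagsHub remote for this repository.
os.environ["MLFLOW_TRACKING_URI"] = (
    "https://dagshub.com/sujith-kamme/Cross-Sell-Insight.mlflow"
)
os.environ["MLFLOW_TRACKING_USERNAME"] = "sujith-kamme"

# With these set, mlflow.start_run() / mlflow.log_metric(...) calls in the
# pipeline record experiments to the DagsHub remote instead of local files.
```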
1. Experiment Tracking
2. Upload model to model registry
3. Model serving

**Model Evaluation**
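A hypothetical shape for the evaluation component described below, producing a metrics dict that could then be logged to MLflow (the metric selection is an assumption, not taken from the repo):

```python
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def evaluate(model, X_test, y_test) -> dict:
    """Score the fitted model on held-out data; the dict could be logged to MLflow."""
    preds = model.predict(X_test)
    proba = model.predict_proba(X_test)[:, 1]  # probability of the positive class
    return {
        "accuracy": accuracy_score(y_test, preds),
        "f1": f1_score(y_test, preds),
        "roc_auc": roc_auc_score(y_test, proba),
    }
```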
1. Updated `config.yaml`.
2. Updated `entity/entityconfig.py`.
3. Updated `src/code/config/config.py`.
4. Updated `components/model_evaluation.py`.
5. Updated `pipeline/stage4_model_evaluation.py`.
6. Setting up MLFlow Tracking URI to enable collaboration
Run this to export them as environment variables:
```bash
export MLFLOW_TRACKING_URI="https://dagshub.com/sujith-kamme/Cross-Sell-Insight.mlflow"
export MLFLOW_TRACKING_USERNAME="sujith-kamme"
```
7. Updated `main.py` file.

## **AWS-CICD-Deployment-with-Github-Actions**
### 1. Log in to the AWS console.
### 2. Create IAM user for deployment
With specific access:

1. EC2 access: EC2 is a virtual machine
2. ECR access: Elastic Container Registry, to store your Docker image in AWS
Description of the deployment:

1. Build the Docker image of the source code
2. Push the Docker image to ECR
3. Launch your EC2 instance
4. Pull the image from ECR onto EC2
5. Launch the Docker image on EC2

Policies:
1. AmazonEC2ContainerRegistryFullAccess
2. AmazonEC2FullAccess
### 3. Create an ECR repo to store the Docker image
- Save the URI
### 4. Create EC2 machine (Ubuntu)

### 5. Open EC2 and install Docker on the EC2 machine:
```bash
# optional
sudo apt-get update -y
sudo apt-get upgrade

# required
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker ubuntu
newgrp docker
```
### 6. Configure EC2 as a self-hosted runner:

Go to Settings > Actions > Runners > New self-hosted runner > choose OS > then run the commands one after the other.
### 7. Set up GitHub secrets:

Go to Settings > Secrets and Variables > New repository secret, and add:

AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_REGION = us-east-2
AWS_ECR_LOGIN_URI = (sample)566373416292.dkr.ecr.ap-south-1.amazonaws.com
ECR_REPOSITORY_NAME = simple-app